# Providers
## observe(client, provider, options?)

Wraps an LLM client and returns a traced version. The returned client keeps the same API as the original.

```ts
const traced = observe(client, Provider.OpenAI);
const traced = observe(client, Provider.Anthropic);
const traced = observe(client, Provider.OpenRouter);
```

```python
traced = observe(client, Provider.OPENAI)
traced = observe(client, Provider.ANTHROPIC)
traced = observe(client, Provider.OPENROUTER)
```

What gets patched:

- OpenAI and OpenRouter patch `client.chat.completions.create`.
- Anthropic patches `client.messages.create`.

All other client methods remain untouched.
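Conceptually, this kind of patching is a thin wrapper around one nested method. The sketch below shows the general monkey-patching technique, not the SDK's real internals; the `FakeClient` classes and the trace dictionary shape are illustrative assumptions:

```python
import functools
import time

def _patch_method(obj, attr_path, recorder):
    """Replace a nested method (e.g. 'chat.completions.create') with a
    version that records a trace, keeping the original signature."""
    *parents, name = attr_path.split(".")
    for part in parents:
        obj = getattr(obj, part)
    original = getattr(obj, name)

    @functools.wraps(original)
    def traced(*args, **kwargs):
        start = time.time()
        result = original(*args, **kwargs)  # call through unchanged
        recorder({"method": attr_path, "latency_s": time.time() - start})
        return result

    setattr(obj, name, traced)

# Stand-in client object mimicking the OpenAI client shape:
class _Completions:
    def create(self, **kwargs):
        return {"ok": True}

class _Chat:
    completions = _Completions()

class FakeClient:
    chat = _Chat()

traces = []
client = FakeClient()
_patch_method(client, "chat.completions.create", traces.append)
result = client.chat.completions.create(model="gpt-4o")
```

Because only the named method is replaced, every other attribute of the client is the original object, which is why the wrapped client keeps the same API.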
## OpenAI

```ts
import OpenAI from "openai";
import { observe, Provider } from "@pulse/sdk";

const client = observe(
  new OpenAI({ apiKey: "sk-..." }),
  Provider.OpenAI
);

const res = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
});

const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
  stream: true,
});
```

```python
from openai import OpenAI
from pulse_sdk import observe, Provider

client = observe(
    OpenAI(api_key="sk-..."),
    Provider.OPENAI,
)

res = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
```

Both streaming and non-streaming calls are traced. For streams, the trace is recorded once the stream completes.
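Deferring the trace until a stream finishes can be done by wrapping the chunk iterator. A hypothetical sketch of that technique (the recorder callback and trace fields are assumptions, not the SDK's actual code):

```python
def traced_stream(stream, recorder):
    """Yield chunks through unchanged, then record a single trace
    once the underlying stream is exhausted."""
    chunks = []
    for chunk in stream:
        chunks.append(chunk)
        yield chunk
    # Only runs after the last chunk: the full output is now known.
    recorder({"stream": True, "chunks": len(chunks), "output": "".join(chunks)})

traces = []
pieces = list(traced_stream(iter(["Hel", "lo"]), traces.append))
```

The caller still sees an ordinary iterator; the trace simply does not exist until iteration completes, which matches the "recorded once the stream completes" behavior described above.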
## Anthropic

```ts
import Anthropic from "@anthropic-ai/sdk";
import { observe, Provider } from "@pulse/sdk";

const client = observe(
  new Anthropic({ apiKey: "sk-ant-..." }),
  Provider.Anthropic
);

const res = await client.messages.create({
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});
```

```python
from anthropic import Anthropic
from pulse_sdk import observe, Provider

client = observe(
    Anthropic(api_key="sk-ant-..."),
    Provider.ANTHROPIC,
)

res = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)
```

Anthropic stop reasons are normalized:

- `end_turn` → `stop`
- `max_tokens` → `length`
- `stop_sequence` → `stop`
- `tool_use` → `tool_calls`
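The normalization above amounts to a small lookup table; a minimal sketch (the pass-through behavior for unrecognized reasons is an assumption, not documented above):

```python
# Map Anthropic stop reasons onto OpenAI-style finish reasons.
STOP_REASON_MAP = {
    "end_turn": "stop",
    "max_tokens": "length",
    "stop_sequence": "stop",
    "tool_use": "tool_calls",
}

def normalize_stop_reason(reason: str) -> str:
    # Assumption: unknown reasons pass through unchanged.
    return STOP_REASON_MAP.get(reason, reason)
```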
## OpenRouter

OpenRouter uses the OpenAI client library. Pass `Provider.OpenRouter` so Pulse stores the correct provider and OpenRouter cost fields.

```ts
import OpenAI from "openai";
import { observe, Provider } from "@pulse/sdk";

const client = observe(
  new OpenAI({
    apiKey: "sk-or-...",
    baseURL: "https://openrouter.ai/api/v1",
  }),
  Provider.OpenRouter
);
```

```python
from openai import OpenAI
from pulse_sdk import observe, Provider

client = observe(
    OpenAI(
        api_key="sk-or-...",
        base_url="https://openrouter.ai/api/v1",
    ),
    Provider.OPENROUTER,
)
```

When OpenRouter includes a cost field in the response, Pulse uses that value directly.
## Pricing

The SDK calculates cost automatically for known models.

| Model | Input (per 1M tokens) | Output (per 1M tokens) |
|---|---|---|
| gpt-4o | $2.50 | $10.00 |
| gpt-4o-mini | $0.15 | $0.60 |
| gpt-4-turbo | $10.00 | $30.00 |
| gpt-3.5-turbo | $0.50 | $1.50 |
| claude-3-5-sonnet-20241022 | $3.00 | $15.00 |
| claude-3-5-haiku-20241022 | $0.80 | $4.00 |
| claude-3-opus-20240229 | $15.00 | $75.00 |
Model aliases such as `gpt-4o-2024-11-20` resolve to base model pricing. Unknown models report a `null` cost.
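The cost calculation can be sketched from the table above. The longest-prefix rule used here for alias resolution is an assumption about how the SDK matches aliases, not a documented behavior:

```python
PRICING = {  # (input, output) in USD per 1M tokens, from the table above
    "gpt-4o": (2.50, 10.00),
    "gpt-4o-mini": (0.15, 0.60),
    "gpt-4-turbo": (10.00, 30.00),
    "gpt-3.5-turbo": (0.50, 1.50),
    "claude-3-5-sonnet-20241022": (3.00, 15.00),
    "claude-3-5-haiku-20241022": (0.80, 4.00),
    "claude-3-opus-20240229": (15.00, 75.00),
}

def cost_usd(model, input_tokens, output_tokens):
    """Return the call cost in USD, or None for unknown models."""
    # Exact match first, then the longest known prefix, so that
    # "gpt-4o-2024-11-20" resolves to "gpt-4o" rather than a shorter base.
    base = model if model in PRICING else max(
        (m for m in PRICING if model.startswith(m)), key=len, default=None
    )
    if base is None:
        return None  # unknown model: null cost
    inp, out = PRICING[base]
    return input_tokens * inp / 1e6 + output_tokens * out / 1e6
```

For example, a `gpt-4o` call with 1,000 input and 1,000 output tokens costs $0.0025 + $0.01 = $0.0125.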
## Error handling

If an LLM call throws, the SDK captures an error trace and re-throws the original error.

If trace sending fails (network error or service unavailable), the SDK logs a warning and continues. Tracing never breaks application behavior.
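The fail-safe behavior comes down to catching everything around the trace export. A minimal sketch, assuming a hypothetical `_send` transport function (not the SDK's real export path):

```python
import logging

logger = logging.getLogger("pulse")

def send_trace_safely(trace, _send):
    """Attempt to export a trace; on any failure, log a warning and
    swallow the error so the application keeps running."""
    try:
        _send(trace)
    except Exception as exc:
        logger.warning("pulse: failed to send trace: %s", exc)

def failing_send(trace):
    # Simulates a network error / unavailable trace service.
    raise ConnectionError("service unavailable")

send_trace_safely({"model": "gpt-4o"}, failing_send)  # does not raise
```

Note the asymmetry: errors from the LLM call itself are re-raised to the application, while errors from the tracing pipeline are only logged.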