Documentation Index

Fetch the complete documentation index at: https://docs.mutagent.io/llms.txt

Use this file to discover all available pages before exploring further.

Anthropic Integration (Python)

The mutagent-anthropic package provides zero-change tracing for the official Anthropic Python SDK. Wrap your client once; every messages.create call is automatically traced.

Installation

This package is coming soon to PyPI. The install command below will work once published.
pip install mutagent-anthropic
This installs mutagent-anthropic along with its dependencies. Tracing transport is provided by mutagent-sdk via the mutagent.tracing module. The anthropic SDK (>= 0.40.0) is also required and installed automatically.

Quick Start

1. Initialize tracing

from mutagent.tracing import init_tracing

init_tracing(api_key="mt_xxxxxxxxxxxx")

2. Wrap your Anthropic client

from anthropic import Anthropic
from mutagent_anthropic import wrap_anthropic

client = wrap_anthropic(Anthropic())

3. Use the client normally

msg = client.messages.create(
    model="claude-opus-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(msg.content[0].text)

Every messages.create call is automatically traced and sent to MutagenT. No additional code changes are required.

Full Example

import asyncio
import os

from anthropic import Anthropic

from mutagent.tracing import init_tracing, shutdown_tracing
from mutagent_anthropic import wrap_anthropic

# Initialize MutagenT tracing
init_tracing(api_key=os.environ["MUTAGENT_API_KEY"])

# Wrap once — all subsequent calls are traced
client = wrap_anthropic(Anthropic())

msg = client.messages.create(
    model="claude-opus-4-5",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Explain observability for AI applications in 3 sentences."},
    ],
)
print(msg.content[0].text)

# Flush remaining spans on exit (shutdown_tracing is async)
asyncio.run(shutdown_tracing())

Async Client

import asyncio

from anthropic import AsyncAnthropic
from mutagent.tracing import init_tracing
from mutagent_anthropic import wrap_anthropic

init_tracing(api_key="mt_xxxxxxxxxxxx")

client = wrap_anthropic(AsyncAnthropic())

async def main():
    # await is only valid inside an async function, so the call is
    # wrapped in a coroutine and driven with asyncio.run.
    msg = await client.messages.create(
        model="claude-opus-4-5",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(msg.content[0].text)

asyncio.run(main())

Optional: Session and Tags

Pass metadata to every span emitted by a client instance:
client = wrap_anthropic(
    Anthropic(),
    generation_name="my-pipeline-call",
    session_id="user-session-abc",
    tags=["prod", "v2"],
)

What Gets Traced

Each non-streaming messages.create call emits one llm.chat span with:

Field                   Source
input.messages          messages param + system param (as role=system)
output.text             First text block in response
metrics.model           model from response
metrics.provider        "anthropic"
metrics.input_tokens    usage.input_tokens
metrics.output_tokens   usage.output_tokens
metrics.total_tokens    input_tokens + output_tokens
attributes.session_id   session_id kwarg (if set)
attributes.tags         tags kwarg (if set)
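The mapping above can be sketched as a small function. The response shape here is a stand-in dict, not the real anthropic SDK object, and `build_span_fields` is a hypothetical name for illustration only.

```python
def build_span_fields(request, response, session_id=None, tags=None):
    """Assemble the llm.chat span fields from a request/response pair (sketch)."""
    messages = list(request["messages"])
    # The system param is folded into input.messages as a role=system entry.
    if request.get("system"):
        messages = [{"role": "system", "content": request["system"]}] + messages

    usage = response["usage"]
    fields = {
        "input.messages": messages,
        # output.text: first text block in the response content.
        "output.text": next(
            b["text"] for b in response["content"] if b["type"] == "text"
        ),
        "metrics.model": response["model"],
        "metrics.provider": "anthropic",
        "metrics.input_tokens": usage["input_tokens"],
        "metrics.output_tokens": usage["output_tokens"],
        # total_tokens is derived by summing, not reported by the API.
        "metrics.total_tokens": usage["input_tokens"] + usage["output_tokens"],
    }
    if session_id is not None:
        fields["attributes.session_id"] = session_id
    if tags is not None:
        fields["attributes.tags"] = tags
    return fields

request = {
    "system": "Be brief.",
    "messages": [{"role": "user", "content": "Hello!"}],
}
response = {
    "model": "claude-opus-4-5",
    "content": [{"type": "text", "text": "Hi there!"}],
    "usage": {"input_tokens": 12, "output_tokens": 5},
}
print(build_span_fields(request, response, session_id="user-session-abc"))
```

Note that session and tag attributes appear only when the corresponding kwargs were passed to wrap_anthropic.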

wrap_anthropic API

wrap_anthropic(client, *, generation_name=None, session_id=None, tags=None)
Patches client.messages.create in-place and returns the same client instance.

Arg              Type                        Default                      Description
client           Anthropic | AsyncAnthropic  required                     Client to patch
generation_name  str | None                  "anthropic.messages.create"  Span name override
session_id       str | None                  None                         Session ID stored in span attributes
tags             list[str] | None            None                         Tags stored in span attributes
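The "patches in place, returns the same instance" behavior can be illustrated with stand-in classes. FakeMessages, FakeClient, and wrap_client below are hypothetical and show only the patching pattern, not the real implementation.

```python
import functools

class FakeMessages:
    def create(self, **kwargs):
        return {"echo": kwargs}

class FakeClient:
    def __init__(self):
        self.messages = FakeMessages()

def wrap_client(client, *, generation_name=None, session_id=None, tags=None):
    original = client.messages.create

    @functools.wraps(original)
    def traced_create(**kwargs):
        # A real wrapper would open a span here, call the original,
        # record output and token usage, then end the span.
        return original(**kwargs)

    client.messages.create = traced_create  # patch in place
    return client  # same instance, not a copy

client = FakeClient()
assert wrap_client(client) is client
print(client.messages.create(model="claude-opus-4-5")["echo"]["model"])
```

Because patching mutates the client, the return value exists only for call-site convenience; assigning it is optional.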

Limitations

  • Streaming (stream=True) is not traced in v0.1.0; streaming requests pass through to the SDK untouched. Streaming support is tracked as a follow-up.
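The pass-through behavior can be sketched in the same stand-in wrapper style: when stream=True, the wrapper hands the call straight to the original method and records nothing. `original_create` and `traced_create` are illustrative names, not the package's internals.

```python
calls = []  # stand-in record of spans emitted

def original_create(**kwargs):
    # Stand-in for the SDK's messages.create.
    return "stream-object" if kwargs.get("stream") else "message"

def traced_create(**kwargs):
    if kwargs.get("stream"):
        # Not traced in v0.1.0: pass the call through untouched.
        return original_create(**kwargs)
    calls.append("traced")  # stand-in for emitting an llm.chat span
    return original_create(**kwargs)

print(traced_create(stream=True))  # passes through, no span recorded
print(traced_create())             # non-streaming call is traced
print(calls)
```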

TypeScript Equivalent

MutagenT does not currently ship an Anthropic integration for TypeScript. For TypeScript LLM tracing, see OpenAI (TypeScript) or Vercel AI SDK.

Python Integrations

Overview of all Python integration packages

Python Tracing

Low-level tracing API and @trace decorator

OpenAI (Python)

Trace OpenAI calls in Python

API Reference

REST API documentation