
LangChain

The @mutagent/langchain package provides a MutagentCallbackHandler that plugs into LangChain’s callback system. Every LLM call, chain execution, tool invocation, and retriever query is automatically captured as a trace span with zero code changes to your existing LangChain logic.

Installation

bun add @mutagent/langchain @mutagent/sdk
Peer dependencies: @mutagent/sdk >=0.1.0, @langchain/core >=1.1.8

Quick Start

1. Initialize tracing

Call initTracing() once at application startup. This configures the SDK’s span batching and transport layer.
import { initTracing } from '@mutagent/sdk/tracing';

initTracing({ apiKey: process.env.MUTAGENT_API_KEY! });

2. Create the callback handler

Instantiate MutagentCallbackHandler. Optionally pass session and user context.
import { MutagentCallbackHandler } from '@mutagent/langchain';

const handler = new MutagentCallbackHandler({
  sessionId: 'my-session',  // optional
  userId: 'user-123',       // optional
});

3. Attach to any LangChain component

Pass the handler via the callbacks option on any LangChain invocation.
import { ChatOpenAI } from '@langchain/openai';

const llm = new ChatOpenAI({ model: 'gpt-4o' });

const result = await llm.invoke('What is observability?', {
  callbacks: [handler],
});
// Trace automatically sent to MutagenT

Full Example

import { initTracing } from '@mutagent/sdk/tracing';
import { MutagentCallbackHandler } from '@mutagent/langchain';
import { ChatOpenAI } from '@langchain/openai';
import { PromptTemplate } from '@langchain/core/prompts';
import { RunnableSequence } from '@langchain/core/runnables';

// 1. Initialize SDK tracing (once at app startup)
initTracing({ apiKey: process.env.MUTAGENT_API_KEY! });

// 2. Create the callback handler
const handler = new MutagentCallbackHandler();

// 3. Build your chain as usual
const prompt = PromptTemplate.fromTemplate(
  'Explain {topic} in simple terms.'
);
const llm = new ChatOpenAI({ model: 'gpt-4o' });
const chain = RunnableSequence.from([prompt, llm]);

// 4. Invoke with the callback
const result = await chain.invoke(
  { topic: 'vector databases' },
  { callbacks: [handler] },
);

What Gets Traced

The callback handler captures the following LangChain event types:
| Event | Span Kind | Data Captured |
|---|---|---|
| handleLLMStart / handleChatModelStart | llm.chat | Input prompts or messages, model name |
| handleLLMEnd | llm.chat | Output text, token usage (prompt, completion, total) |
| handleChainStart | chain | Chain name, input values |
| handleChainEnd | chain | Output values |
| handleToolStart | tool | Tool name, input string |
| handleToolEnd | tool | Output string |
| handleRetrieverStart | retrieval | Retriever name, query string |
| handleRetrieverEnd | retrieval | Retrieved documents (content + metadata) |
| Error handlers | any | Error message and status |
Parent-child relationships are preserved: when a chain invokes an LLM which calls a tool, the resulting spans form a nested tree in MutagenT.
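As a rough sketch of how that nesting comes about: LangChain passes a runId and an optional parentRunId to every callback, and linking spans on those IDs yields the trace tree. The Span shape and tree-building below are illustrative only, not MutagenT internals.

```typescript
// Illustrative sketch: only runId / parentRunId come from LangChain's
// callback API; everything else here is a hypothetical model.
interface Span {
  runId: string;
  parentRunId?: string;
  kind: string;
  children: Span[];
}

function buildTree(spans: Span[]): Span[] {
  const byId = new Map(spans.map((s) => [s.runId, s]));
  const roots: Span[] = [];
  for (const span of spans) {
    const parent = span.parentRunId ? byId.get(span.parentRunId) : undefined;
    if (parent) {
      parent.children.push(span);
    } else {
      roots.push(span);
    }
  }
  return roots;
}

// A chain that invokes an LLM which calls a tool:
const tree = buildTree([
  { runId: 'chain-1', kind: 'chain', children: [] },
  { runId: 'llm-1', parentRunId: 'chain-1', kind: 'llm.chat', children: [] },
  { runId: 'tool-1', parentRunId: 'llm-1', kind: 'tool', children: [] },
]);
// tree has a single root: chain → llm.chat → tool
```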

Token Usage Tracking

Token metrics are automatically extracted from LLMResult.llmOutput.tokenUsage when available:
  • inputTokens — prompt tokens
  • outputTokens — completion tokens
  • totalTokens — combined total
Token usage availability depends on your LLM provider. OpenAI and Anthropic models report tokens; some open-source models may not.
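The extraction amounts to something like the following sketch. The field names match the LLMResult.llmOutput.tokenUsage shape LangChain reports for OpenAI-style providers; the helper itself is hypothetical, not the handler's actual code.

```typescript
interface TokenMetrics {
  inputTokens?: number;
  outputTokens?: number;
  totalTokens?: number;
}

// Hypothetical helper: maps LangChain's tokenUsage fields onto the span
// metrics listed above. Returns {} when the provider reports nothing.
function extractTokenUsage(
  llmOutput?: {
    tokenUsage?: {
      promptTokens?: number;
      completionTokens?: number;
      totalTokens?: number;
    };
  },
): TokenMetrics {
  const usage = llmOutput?.tokenUsage;
  if (!usage) return {};
  return {
    inputTokens: usage.promptTokens,
    outputTokens: usage.completionTokens,
    totalTokens: usage.totalTokens,
  };
}

const metrics = extractTokenUsage({
  tokenUsage: { promptTokens: 12, completionTokens: 34, totalTokens: 46 },
});
// metrics → { inputTokens: 12, outputTokens: 34, totalTokens: 46 }
```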

Handler Options

MutagentCallbackHandler accepts optional configuration for session tracking, user attribution, and metadata:
const handler = new MutagentCallbackHandler({
  sessionId: 'chat-session-123',  // Group traces by session
  userId: 'user-456',             // Attribute traces to a user
  tags: ['production', 'v2'],     // Filter traces by tag
  metadata: { version: '2.0' },   // Custom key-value metadata
});
| Option | Type | Description |
|---|---|---|
| sessionId | string | Group related traces into a session |
| userId | string | Attribute traces to a specific user |
| tags | string[] | Tags for filtering in the dashboard |
| metadata | Record<string, unknown> | Custom key-value metadata |
All options are optional: with none set, the handler works with zero configuration; just pass it to any LangChain component. Tracing behavior itself (batch size, flush interval, etc.) is configured through initTracing():
initTracing({
  apiKey: process.env.MUTAGENT_API_KEY!,
  batchSize: 20,
  flushIntervalMs: 3000,
});

CLI Shortcut

Generate a complete integration scaffold with the CLI:
mutagent integrate langchain
This auto-detects LangChain in your package.json (looks for langchain or @langchain/core) and generates ready-to-use configuration code. To validate an existing integration setup:
mutagent integrate langchain --verify

Python

Looking for the Python LangChain integration? See the Python LangChain guide.