Langfuse Exporter
Langfuse is an open-source observability platform designed specifically for LLM applications. The Langfuse exporter sends your traces to Langfuse, providing detailed insight into model performance, token usage, and conversation flows.
Installation
npm install @mastra/langfuse@latest
pnpm add @mastra/langfuse@latest
yarn add @mastra/langfuse@latest
bun add @mastra/langfuse@latest
Configuration
Prerequisites
- Langfuse account: sign up at cloud.langfuse.com or deploy a self-hosted instance
- API keys: create a public/secret key pair in Langfuse Settings → API Keys
- Environment variables: set your credentials
LANGFUSE_PUBLIC_KEY=pk-lf-xxxxxxxxxxxx
LANGFUSE_SECRET_KEY=sk-lf-xxxxxxxxxxxx
LANGFUSE_BASE_URL=https://cloud.langfuse.com # Or your self-hosted URL
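Because the exporter reads these variables at startup, it can help to fail fast when they are missing. A minimal sketch, assuming nothing beyond `process.env` (the `assertLangfuseEnv` helper is illustrative, not part of `@mastra/langfuse`):

```typescript
// Illustrative helper: verify required Langfuse credentials before boot.
// Not part of @mastra/langfuse.
export function assertLangfuseEnv(
  env: Record<string, string | undefined> = process.env,
): void {
  const required = ["LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY"];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(
      `Missing Langfuse environment variables: ${missing.join(", ")}`,
    );
  }
  // LANGFUSE_BASE_URL is optional; the exporter defaults to https://cloud.langfuse.com
}
```

Calling this once at application startup surfaces a clear error instead of silently dropping traces.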
Zero-Config Setup
With environment variables set, use the exporter with no configuration:
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { LangfuseExporter } from "@mastra/langfuse";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      langfuse: {
        serviceName: "my-service",
        exporters: [new LangfuseExporter()],
      },
    },
  }),
});
Explicit Configuration
You can also pass credentials directly (they take precedence over environment variables):
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { LangfuseExporter } from "@mastra/langfuse";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      langfuse: {
        serviceName: "my-service",
        exporters: [
          new LangfuseExporter({
            publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
            secretKey: process.env.LANGFUSE_SECRET_KEY!,
            baseUrl: process.env.LANGFUSE_BASE_URL,
            options: {
              environment: process.env.NODE_ENV,
            },
          }),
        ],
      },
    },
  }),
});
Configuration Options
Realtime vs Batch Mode
The Langfuse exporter supports two modes for sending traces:
Realtime Mode (Development)
Traces appear immediately in the Langfuse dashboard, which is ideal for debugging:
new LangfuseExporter({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  realtime: true, // Flush after each event
});
Batch Mode (Production)
Better performance through automatic batching:
new LangfuseExporter({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  realtime: false, // Default - batch traces
});
Complete Configuration
new LangfuseExporter({
  // Required credentials
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,

  // Optional settings
  baseUrl: process.env.LANGFUSE_BASE_URL, // Default: https://cloud.langfuse.com
  realtime: process.env.NODE_ENV === "development", // Dynamic mode selection
  logLevel: "info", // Diagnostic logging: debug | info | warn | error

  // Langfuse-specific options
  options: {
    environment: process.env.NODE_ENV, // Shows in UI for filtering
    version: process.env.APP_VERSION, // Track different versions
    release: process.env.GIT_COMMIT, // Git commit hash
  },
});
Prompt Linking
You can link LLM generations to prompts stored in Langfuse Prompt Management. This enables version tracking and metrics for your prompts.
Using the Helper (Recommended)
Use withLangfusePrompt with buildTracingOptions for the cleanest API:
import { Agent } from "@mastra/core/agent";
import { buildTracingOptions } from "@mastra/observability";
import { withLangfusePrompt } from "@mastra/langfuse";
import { Langfuse } from "langfuse";

// Reads credentials from LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, LANGFUSE_BASE_URL env vars
const langfuse = new Langfuse();

// Fetch the prompt from Langfuse Prompt Management
const prompt = await langfuse.getPrompt("customer-support");

export const supportAgent = new Agent({
  name: "support-agent",
  instructions: prompt.prompt, // Use the prompt text from Langfuse
  model: "openai/gpt-4o",
  defaultGenerateOptions: {
    tracingOptions: buildTracingOptions(withLangfusePrompt(prompt)),
  },
});
The withLangfusePrompt helper automatically extracts name, version, and id from the Langfuse prompt object.
Manual Fields
If you're not using the Langfuse SDK, you can also pass the fields manually:
const tracingOptions = buildTracingOptions(
  withLangfusePrompt({ name: "my-prompt", version: 1 }),
);

// Or with just an ID
const tracingOptionsById = buildTracingOptions(
  withLangfusePrompt({ id: "prompt-uuid-12345" }),
);
Prompt Object Fields
The prompt object supports these fields:
| Field | Type | Description |
|---|---|---|
| name | string | Prompt name in Langfuse |
| version | number | Version number of the prompt |
| id | string | Prompt UUID for direct linking |
You can link prompts using any of the following:
- id alone (the UUID uniquely identifies a prompt version)
- name and version together
- All three fields
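These rules can be sketched as a plain TypeScript check. The `LangfusePromptRef` type and `isLinkable` helper below are illustrative names, not part of `@mastra/langfuse`:

```typescript
// Illustrative shape of a prompt reference (not an official type).
type LangfusePromptRef = {
  name?: string;
  version?: number;
  id?: string;
};

// A reference can resolve to a prompt version if it carries an id,
// or both a name and a version (all three together also works).
function isLinkable(ref: LangfusePromptRef): boolean {
  return (
    ref.id !== undefined ||
    (ref.name !== undefined && ref.version !== undefined)
  );
}
```

A name without a version is not enough to pin a specific prompt version, which is why the linking rules require either the id or the name/version pair.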
When set on a MODEL_GENERATION span, the Langfuse exporter automatically links the generation to the corresponding prompt.
Related