
Langfuse Exporter

Langfuse is an open-source observability platform built for LLM applications. The Langfuse exporter sends your traces to Langfuse, providing detailed insights into model performance, token usage, and conversation flows.

Installation

npm install @mastra/langfuse@latest

Configuration

Prerequisites

  1. Langfuse account: Sign up at cloud.langfuse.com or deploy a self-hosted instance
  2. API keys: Create a public/secret key pair under Langfuse Settings → API Keys
  3. Environment variables: Set your credentials
.env
LANGFUSE_PUBLIC_KEY=pk-lf-xxxxxxxxxxxx
LANGFUSE_SECRET_KEY=sk-lf-xxxxxxxxxxxx
LANGFUSE_BASE_URL=https://cloud.langfuse.com # Or your self-hosted URL

Zero-Config Setup

With environment variables set, use the exporter with no configuration:

src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { LangfuseExporter } from "@mastra/langfuse";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      langfuse: {
        serviceName: "my-service",
        exporters: [new LangfuseExporter()],
      },
    },
  }),
});

Explicit Configuration

You can also pass credentials directly (takes precedence over environment variables):

src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { Observability } from "@mastra/observability";
import { LangfuseExporter } from "@mastra/langfuse";

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      langfuse: {
        serviceName: "my-service",
        exporters: [
          new LangfuseExporter({
            publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
            secretKey: process.env.LANGFUSE_SECRET_KEY!,
            baseUrl: process.env.LANGFUSE_BASE_URL,
            options: {
              environment: process.env.NODE_ENV,
            },
          }),
        ],
      },
    },
  }),
});

Configuration Options

Realtime vs Batch Mode

The Langfuse exporter supports two modes for sending traces:

Realtime Mode (Development)

Traces appear immediately in the Langfuse dashboard, ideal for debugging:

new LangfuseExporter({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  realtime: true, // Flush after each event
});

Batch Mode (Production)

Better performance with automatic batching:

new LangfuseExporter({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  realtime: false, // Default - batch traces
});

Complete Configuration

new LangfuseExporter({
  // Required credentials
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,

  // Optional settings
  baseUrl: process.env.LANGFUSE_BASE_URL, // Default: https://cloud.langfuse.com
  realtime: process.env.NODE_ENV === "development", // Dynamic mode selection
  logLevel: "info", // Diagnostic logging: debug | info | warn | error

  // Langfuse-specific options
  options: {
    environment: process.env.NODE_ENV, // Shows in UI for filtering
    version: process.env.APP_VERSION, // Track different versions
    release: process.env.GIT_COMMIT, // Git commit hash
  },
});

Prompt Linking

You can link LLM generations to prompts stored in Langfuse Prompt Management. This enables version tracking and metrics for your prompts.

Using the Helper (Recommended)

Use withLangfusePrompt with buildTracingOptions for the cleanest API:

src/agents/support-agent.ts
import { Agent } from "@mastra/core/agent";
import { buildTracingOptions } from "@mastra/observability";
import { withLangfusePrompt } from "@mastra/langfuse";
import { Langfuse } from "langfuse";

// Reads credentials from LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, LANGFUSE_BASE_URL env vars
const langfuse = new Langfuse();

// Fetch the prompt from Langfuse Prompt Management
const prompt = await langfuse.getPrompt("customer-support");

export const supportAgent = new Agent({
  name: "support-agent",
  instructions: prompt.prompt, // Use the prompt text from Langfuse
  model: "openai/gpt-4o",
  defaultGenerateOptions: {
    tracingOptions: buildTracingOptions(withLangfusePrompt(prompt)),
  },
});

The withLangfusePrompt helper automatically extracts name, version, and id from the Langfuse prompt object.
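Conceptually, the extraction amounts to picking those three fields off the prompt object. The following is a simplified sketch of that behavior for illustration only, not the actual @mastra/langfuse implementation:

```typescript
// Simplified sketch — not the actual @mastra/langfuse implementation.
// A Langfuse prompt object carries (at least) these identifying fields:
interface LangfusePromptLike {
  name?: string;
  version?: number;
  id?: string;
}

// Pull out only the fields the exporter needs for prompt linking.
function extractPromptFields(prompt: LangfusePromptLike) {
  const { name, version, id } = prompt;
  return { name, version, id };
}

const fields = extractPromptFields({
  name: "customer-support",
  version: 3,
  id: "prompt-uuid",
});
console.log(fields); // { name: 'customer-support', version: 3, id: 'prompt-uuid' }
```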

Manual Fields

You can also pass manual fields if you're not using the Langfuse SDK:

const tracingOptions = buildTracingOptions(
  withLangfusePrompt({ name: "my-prompt", version: 1 }),
);

// Or with just an ID
const tracingOptionsById = buildTracingOptions(
  withLangfusePrompt({ id: "prompt-uuid-12345" }),
);

Prompt Object Fields

The prompt object supports these fields:

  • name (string): Prompt name in Langfuse
  • version (number): Version number of the prompt
  • id (string): Prompt UUID for direct linking

You can link prompts using either:

  • id alone (a UUID uniquely identifies a prompt version)
  • name together with version
  • All three fields
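As a quick sanity check, those rules can be expressed in a few lines. This is a hypothetical helper for illustration, not part of @mastra/langfuse:

```typescript
// Hypothetical helper (illustration only): does a prompt reference carry
// enough fields to be linked in Langfuse?
interface PromptRef {
  name?: string;
  version?: number;
  id?: string;
}

function isLinkable(ref: PromptRef): boolean {
  // An id alone uniquely identifies a prompt version;
  // otherwise name and version must appear together.
  return ref.id !== undefined || (ref.name !== undefined && ref.version !== undefined);
}

console.log(isLinkable({ id: "prompt-uuid-12345" })); // true
console.log(isLinkable({ name: "my-prompt", version: 1 })); // true
console.log(isLinkable({ name: "my-prompt" })); // false — version missing
```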

When set on a MODEL_GENERATION span, the Langfuse exporter automatically links the generation to the corresponding prompt.
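The tracing options do not have to live on the agent's defaults. Assuming your Mastra version accepts tracingOptions on individual calls (a hedged sketch; verify the option shape against your version), you could link a prompt for a single generation:

```typescript
import { buildTracingOptions } from "@mastra/observability";
import { withLangfusePrompt } from "@mastra/langfuse";
import { supportAgent } from "./agents/support-agent"; // agent defined as above

// Hypothetical per-call usage — confirm the option name in your Mastra version.
const result = await supportAgent.generate("How do I reset my password?", {
  tracingOptions: buildTracingOptions(
    withLangfusePrompt({ name: "customer-support", version: 3 }),
  ),
});
```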
