LangfuseExporter
Sends tracing data to Langfuse for observability.
Constructor
new LangfuseExporter(config: LangfuseExporterConfig)
LangfuseExporterConfig
interface LangfuseExporterConfig extends BaseExporterConfig {
  publicKey?: string;
  secretKey?: string;
  baseUrl?: string;
  realtime?: boolean;
  options?: any;
}
Extends BaseExporterConfig, which includes:
- logger?: IMastraLogger - Logger instance
- logLevel?: LogLevel | 'debug' | 'info' | 'warn' | 'error' - Log level (default: INFO)
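For example, the inherited fields can be used to turn up exporter logging while debugging (a minimal sketch using only the documented config fields):
import { LangfuseExporter } from "@mastra/langfuse";
// Verbose exporter logging; logLevel defaults to INFO.
const debugExporter = new LangfuseExporter({
  logLevel: "debug",
});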
Methods
exportTracingEvent
async exportTracingEvent(event: TracingEvent): Promise<void>
Exports a tracing event to Langfuse.
export
async export(spans: ReadOnlySpan[]): Promise<void>
Batch exports spans to Langfuse.
flush
async flush(): Promise<void>
Force flushes any buffered spans to Langfuse without shutting down the exporter. Useful in serverless environments where you need to ensure spans are exported before the runtime terminates.
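For example, in a serverless function you might flush at the end of each invocation (illustrative sketch; assumes the exporter instance from the Usage section below, and the handler shape is hypothetical):
// Hypothetical serverless entry point.
export async function handler(event: unknown) {
  // ... run agents / workflows that emit spans ...
  await exporter.flush(); // make sure buffered spans reach Langfuse before returning
  return { ok: true };
}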
shutdown
async shutdown(): Promise<void>
Flushes pending data and shuts down the client.
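In a long-running service this is typically called once at teardown, for example (illustrative):
// Flush remaining spans and close the Langfuse client on graceful shutdown.
process.on("SIGTERM", async () => {
  await exporter.shutdown();
  process.exit(0);
});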
Usage
Zero-Config (using environment variables)
import { LangfuseExporter } from "@mastra/langfuse";
// Reads from LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_BASE_URL
const exporter = new LangfuseExporter();
Explicit Configuration
import { LangfuseExporter } from "@mastra/langfuse";
const exporter = new LangfuseExporter({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: "https://cloud.langfuse.com",
  realtime: true,
});
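To have Mastra send tracing data through this exporter, register it in your Mastra instance's observability configuration. The sketch below is illustrative: the config name "langfuse" and the serviceName are placeholders, and the exact observability config shape may vary by Mastra version.
import { Mastra } from "@mastra/core";
// Illustrative wiring only; check your Mastra version's observability docs.
export const mastra = new Mastra({
  observability: {
    configs: {
      langfuse: {
        serviceName: "my-service",
        exporters: [exporter],
      },
    },
  },
});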
Span Mapping
- Root spans → Langfuse traces
- MODEL_GENERATION spans → Langfuse generations
- All other spans → Langfuse spans
- Event spans → Langfuse events
Prompt Linking
Link LLM generations to Langfuse Prompt Management using the withLangfusePrompt helper:
import { Agent } from "@mastra/core/agent";
import { buildTracingOptions } from "@mastra/observability";
import { withLangfusePrompt } from "@mastra/langfuse";
import { Langfuse } from "langfuse";
const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
});
const prompt = await langfuse.getPrompt("customer-support");
const agent = new Agent({
  name: "support-agent",
  instructions: prompt.prompt,
  model: "openai/gpt-4o",
  defaultGenerateOptions: {
    tracingOptions: buildTracingOptions(withLangfusePrompt(prompt)),
  },
});
Helper Functions
withLangfusePrompt(prompt)
Adds Langfuse prompt metadata to tracing options.
// With Langfuse SDK prompt object
withLangfusePrompt(prompt)
// With manual fields
withLangfusePrompt({ name: "my-prompt", version: 1 })
withLangfusePrompt({ id: "prompt-uuid" })
When metadata.langfuse.prompt is set on a MODEL_GENERATION span (with either id alone, or name + version), the exporter automatically links the generation to the prompt in Langfuse.
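For reference, the metadata the exporter inspects has roughly this shape (illustrative; in practice withLangfusePrompt and buildTracingOptions produce it for you):
// Assumed layout of the prompt-link metadata on a MODEL_GENERATION span.
const promptLinkMetadata = {
  langfuse: {
    prompt: { name: "customer-support", version: 3 }, // or: { id: "prompt-uuid" }
  },
};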