handleChatStream()
Framework-agnostic handler for streaming agent chat in AI SDK-compatible format. Use this function directly when you need to handle chat streaming outside of Hono or Mastra's own apiRoutes feature.

handleChatStream() returns a ReadableStream that you can wrap with createUIMessageStreamResponse().
Use chatRoute() if you want to create a chat route inside a Mastra server.
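For comparison, here is a minimal sketch of that alternative: registering chatRoute() under a Mastra server's apiRoutes. The path and agent options, and the agent import path, are assumptions based on typical usage; see the chatRoute() reference for the exact signature.

import { Mastra } from '@mastra/core/mastra';
import { chatRoute } from '@mastra/ai-sdk';
import { weatherAgent } from './agents/weather-agent';

export const mastra = new Mastra({
  agents: { weatherAgent },
  server: {
    apiRoutes: [
      // Assumed options: expose weatherAgent as a chat route at POST /chat.
      chatRoute({
        path: '/chat',
        agent: 'weatherAgent',
      }),
    ],
  },
});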
Usage example
Next.js App Router example:
app/api/chat/route.ts
import { handleChatStream } from '@mastra/ai-sdk';
import { createUIMessageStreamResponse } from 'ai';
import { mastra } from '@/src/mastra';

export async function POST(req: Request) {
  const params = await req.json();

  const stream = await handleChatStream({
    mastra,
    agentId: 'weatherAgent',
    params,
  });

  return createUIMessageStreamResponse({ stream });
}
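On the client side, the JSON body posted to this route should match ChatStreamHandlerParams. A minimal illustrative sketch, assuming the message shape follows the AI SDK UIMessage format (the id and text are placeholder values):

const response = await fetch('/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    messages: [
      {
        id: 'msg-1',
        role: 'user',
        parts: [{ type: 'text', text: 'What is the weather in Berlin?' }],
      },
    ],
  }),
});

// The response body is a UI message stream; read it chunk by chunk.
const reader = response.body!.getReader();
const decoder = new TextDecoder();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  console.log(decoder.decode(value, { stream: true }));
}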
Parameters
mastra: Mastra
The Mastra instance containing registered agents.

agentId: string
The ID of the agent to use for chat.

params: ChatStreamHandlerParams
Parameters for the chat stream, including messages and optional resume data.

params.messages: UIMessage[]
Array of messages in the conversation.

params.resumeData?: Record<string, any>
Data for resuming a suspended agent execution. Requires `runId` to be set.

params.runId?: string
The run ID. Required when `resumeData` is provided.

params.requestContext?: RequestContext
Request context to pass to the agent execution.

defaultOptions?: AgentExecutionOptions
Default options passed to agent execution. These are merged with params, with params taking precedence.

sendStart?: boolean = true
Whether to send start events in the stream.

sendFinish?: boolean = true
Whether to send finish events in the stream.

sendReasoning?: boolean = false
Whether to include reasoning steps in the stream.

sendSources?: boolean = false
Whether to include source citations in the stream.
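To show how the optional parameters fit together, here is a hedged variation of the route handler above. The resume fields only apply when resuming a suspended agent execution, and the defaultOptions value (maxSteps) is a hypothetical example of an AgentExecutionOptions field, not a required setting:

import { handleChatStream } from '@mastra/ai-sdk';
import { createUIMessageStreamResponse } from 'ai';
import { mastra } from '@/src/mastra';

export async function POST(req: Request) {
  // In this sketch the resume fields arrive in the request body;
  // in practice they may come from session state or elsewhere.
  const { messages, runId, resumeData, requestContext } = await req.json();

  const stream = await handleChatStream({
    mastra,
    agentId: 'weatherAgent',
    params: { messages, runId, resumeData, requestContext },
    defaultOptions: { maxSteps: 5 }, // hypothetical default; merged with params, params take precedence
    sendReasoning: true, // also emit reasoning steps
    sendSources: true,   // also emit source citations
  });

  return createUIMessageStreamResponse({ stream });
}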