Using Assistant UI

Assistant UI is a TypeScript/React library for AI chat. Built on shadcn/ui and Tailwind CSS, it lets developers create beautiful, enterprise-grade chat experiences in minutes.

info

For a full-stack integration approach where Mastra runs directly in your Next.js API routes, see the Full-Stack Integration Guide on Assistant UI's documentation site.

tip

Visit Mastra's "UI Dojo" to see real-world examples of Assistant UI integrated with Mastra.

Integration Guide

Run Mastra as a standalone server and connect your Next.js frontend (with Assistant UI) to its API endpoints.

  1. Set up your directory structure. A possible layout could look like this:

    project-root
    ├── mastra-server
    │   ├── src
    │   │   └── mastra
    │   └── package.json
    └── my-app
        └── package.json

    Bootstrap your Mastra server:

    npx create-mastra@latest

    This command launches an interactive wizard that scaffolds a new Mastra project, prompting you for a project name and basic configuration. Follow the prompts to create your server project.

    Navigate to your newly created Mastra server directory:

    cd mastra-server # Replace with the actual directory name you provided

    You now have a basic Mastra server project. It should contain the following files and folders:

    src
    └── mastra
        ├── agents
        │   └── weather-agent.ts
        ├── scorers
        │   └── weather-scorer.ts
        ├── tools
        │   └── weather-tool.ts
        ├── workflows
        │   └── weather-workflow.ts
        └── index.ts
    note

    Ensure that you have set the appropriate environment variables for your LLM provider in the .env file.
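
    For example, if you selected OpenAI as your model provider during the wizard, the .env file at the root of the Mastra server project would hold your OpenAI key; other providers expect their own variable names, so adapt this sketch accordingly:

    .env
    # Example assuming OpenAI as the LLM provider; replace with your provider's variable and your real key
    OPENAI_API_KEY=your-api-key-here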

  2. Create a chat route for the Assistant UI frontend using the chatRoute() helper from @mastra/ai-sdk. Add the package to your Mastra project:

    npm install @mastra/ai-sdk@latest

    In your src/mastra/index.ts file, register the chat route:

    src/mastra/index.ts
    import { Mastra } from '@mastra/core/mastra';
    import { chatRoute } from '@mastra/ai-sdk';
    // Rest of the imports...

    export const mastra = new Mastra({
      // Rest of the configuration...
      server: {
        apiRoutes: [
          chatRoute({
            path: '/chat/:agentId'
          })
        ]
      }
    });

    This makes every agent available in an AI SDK-compatible format, including the weatherAgent at the endpoint /chat/weatherAgent.
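
    Once the server is running (next step), you can smoke-test the endpoint from a terminal. Treat the request body below as a sketch: it assumes the UI-message shape (role plus text parts) used by recent AI SDK versions, so adjust it to whatever chat protocol your AI SDK version actually sends:

    # Hypothetical smoke test; the payload shape may differ depending on your AI SDK version
    curl -X POST http://localhost:4111/chat/weatherAgent \
      -H "Content-Type: application/json" \
      -d '{"messages":[{"id":"1","role":"user","parts":[{"type":"text","text":"What is the weather in London?"}]}]}'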

  3. Run the Mastra server with the following command:

    npm run dev

    By default, the Mastra server runs on http://localhost:4111. Keep it running for the next steps, where we'll set up the Assistant UI frontend to connect to it.

  4. Go up one directory to your project root:

    cd ..

    Create a new assistant-ui project with the following command:

    npx assistant-ui@latest create
    note

    For detailed setup instructions, including adding API keys, basic configuration, and manual setup steps, refer to assistant-ui's official documentation.

  5. The default Assistant UI setup configures the chat runtime to use a local API route (/api/chat) within the Next.js project. Since our Mastra agent runs on a separate server, update the frontend to point to that server's endpoint.

    In your assistant-ui frontend project, open the file that contains the useChatRuntime hook (usually app/assistant.tsx or src/app/assistant.tsx). Find the useChatRuntime call and change the api property to the full URL of your Mastra agent's stream endpoint:

    app/assistant.tsx
    "use client";

    // Rest of the imports...

    export const Assistant = () => {
    const runtime = useChatRuntime({
    transport: new AssistantChatTransport({
    api: "http://localhost:4111/chat/weatherAgent",
    }),
    });

    // Rest of the component...
    };

    Now the Assistant UI frontend will send chat requests directly to your running Mastra server.
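
    Rather than hard-coding the server URL, you may prefer to read it from an environment variable so that local development and production can point at different servers. NEXT_PUBLIC_MASTRA_URL below is a hypothetical variable name you would define yourself (for example in the frontend's .env.local):

    app/assistant.tsx
    // NEXT_PUBLIC_MASTRA_URL is a hypothetical env var, e.g. "http://localhost:4111" in .env.local
    const MASTRA_URL = process.env.NEXT_PUBLIC_MASTRA_URL ?? "http://localhost:4111";

    const runtime = useChatRuntime({
      transport: new AssistantChatTransport({
        api: `${MASTRA_URL}/chat/weatherAgent`,
      }),
    });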

  6. You're ready to connect the pieces! Make sure both the Mastra server and the Assistant UI frontend are running, then start the Next.js development server:

    npm run dev

    You should now be able to chat with your agent in the browser.

Congratulations! You have successfully integrated Mastra with Assistant UI using a separate-server approach. Your Assistant UI frontend now communicates with a standalone Mastra agent server.