Integrate Mastra in your Nuxt project
In this guide, you'll build a tool-calling AI agent using Mastra, then connect it to Nuxt by importing and calling the agent directly from your server routes.
You'll use AI SDK UI to create a beautiful, interactive chat experience with Vue.
Before you begin
Create a new Nuxt app (optional)
If you already have a Nuxt app, skip to the next step.
Run the following command to create a new Nuxt app:
- npm
- pnpm
- Yarn
- Bun
npm create nuxt@latest mastra-nuxt -- --template minimal --packageManager npm --gitInit --modules
pnpm create nuxt mastra-nuxt --template minimal --packageManager pnpm --gitInit --modules
yarn create nuxt mastra-nuxt --template minimal --packageManager yarn --gitInit --modules
bunx create-nuxt mastra-nuxt --template minimal --packageManager bun --gitInit --modules
This creates a project called mastra-nuxt, but you can replace it with any name you want.
Initialize Mastra
Navigate to your Nuxt project:
cd mastra-nuxt
Run mastra init. When prompted, choose a provider (e.g. OpenAI) and enter your key:
- npm
- pnpm
- Yarn
- Bun
npx mastra@latest init
pnpm dlx mastra@latest init
yarn dlx mastra@latest init
bun x mastra@latest init
This creates a mastra folder with an example weather agent and the following files:
- index.ts - the Mastra configuration, including memory
- tools/weather-tool.ts - a tool that fetches the weather for a given location
- agents/weather-agent.ts - a weather agent and prompt that use the tool
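The init step also typically writes the API key you entered to a .env file at the project root; assuming you picked OpenAI, it looks roughly like this (the variable name depends on the provider you selected):
OPENAI_API_KEY=<your-api-key>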
You'll call weather-agent.ts from your Nuxt server routes in the next steps.
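As a rough orientation, the scaffolded agent looks something like the sketch below; the exact model, instructions, and imports in your generated agents/weather-agent.ts may differ depending on your Mastra version and chosen provider.
// Illustrative sketch only - your generated file may differ.
import { openai } from '@ai-sdk/openai';
import { Agent } from '@mastra/core/agent';
import { weatherTool } from '../tools/weather-tool';

export const weatherAgent = new Agent({
  name: 'Weather Agent',
  instructions:
    'You are a helpful weather assistant. Use the weather tool to fetch current conditions for the location the user asks about.',
  model: openai('gpt-4o-mini'),
  tools: { weatherTool },
});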
Install AI SDK UI
Install AI SDK UI along with the Mastra adapter:
- npm
- pnpm
- Yarn
- Bun
npm install @mastra/ai-sdk@latest @ai-sdk/vue ai
pnpm add @mastra/ai-sdk@latest @ai-sdk/vue ai
yarn add @mastra/ai-sdk@latest @ai-sdk/vue ai
bun add @mastra/ai-sdk@latest @ai-sdk/vue ai
Create a chat route
Create server/api/chat.ts:
import { handleChatStream } from '@mastra/ai-sdk';
import { toAISdkV5Messages } from '@mastra/ai-sdk/ui';
import { createUIMessageStreamResponse } from 'ai';
import { mastra } from '../../src/mastra';

// Fixed identifiers so the example always reads and writes the same conversation.
const THREAD_ID = 'weather-chat';
const RESOURCE_ID = 'example-user-id';

export default defineEventHandler(async (event) => {
  const method = event.method;

  if (method === 'POST') {
    const params = await readBody(event);

    // Stream the agent's response back in AI SDK format.
    const stream = await handleChatStream({
      mastra,
      agentId: 'weather-agent',
      params: {
        ...params,
        memory: {
          ...params.memory,
          thread: THREAD_ID,
          resource: RESOURCE_ID,
        },
      },
    });

    return createUIMessageStreamResponse({ stream });
  }

  if (method === 'GET') {
    // Load previous messages from memory so the client can hydrate the chat.
    const memory = await mastra.getAgentById('weather-agent').getMemory();

    let response = null;
    try {
      response = await memory?.recall({
        threadId: THREAD_ID,
        resourceId: RESOURCE_ID,
      });
    } catch {
      console.log('No previous messages found.');
    }

    const uiMessages = toAISdkV5Messages(response?.messages || []);
    return uiMessages;
  }
});
The POST handler accepts a prompt and streams the agent's response back in AI SDK format, while the GET handler fetches message history from memory so the UI can be hydrated when the client reloads.
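If you want to sanity-check the route before wiring up the UI, you can call it directly while the dev server is running. The payload shape below is an assumption based on the AI SDK v5 UI message format that the chat transport normally builds for you, not an exact wire contract:
// Hypothetical smoke test - run in a browser console or a small script.
const res = await fetch('http://localhost:3000/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    messages: [
      { id: 'msg-1', role: 'user', parts: [{ type: 'text', text: 'What is the weather in Paris?' }] },
    ],
  }),
});

// The handler returns a UI message stream; dump the raw chunks to inspect them.
console.log(await res.text());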
Add the chat UI
Replace the contents of app/app.vue with the following:
<script setup lang="ts">
import { ref, onMounted } from 'vue';
import { Chat } from '@ai-sdk/vue';
import { DefaultChatTransport, type ToolUIPart } from 'ai';

// Chat instance wired to the Nuxt server route created above.
const chat = new Chat({
  transport: new DefaultChatTransport({
    api: '/api/chat',
  }),
});

// Maps AI SDK tool-part states to human-readable labels.
const STATE_TO_LABEL_MAP: Record<string, string> = {
  'input-streaming': 'Pending',
  'input-available': 'Running',
  'output-available': 'Completed',
  'output-error': 'Error',
  'output-denied': 'Denied',
};

const input = ref('');

// Hydrate the chat with any message history stored in memory.
onMounted(async () => {
  const res = await fetch('/api/chat');
  const data = await res.json();
  chat.messages = [...data];
});

function handleSubmit() {
  if (!input.value.trim()) return;
  chat.sendMessage({ text: input.value });
  input.value = '';
}
</script>
<template>
  <div class="chat-container">
    <div class="messages">
      <div v-for="message in chat.messages" :key="message.id" class="message-wrapper">
        <div
          v-for="(part, i) in message.parts"
          :key="`${message.id}-${i}`"
        >
          <div
            v-if="part.type === 'text'"
            :class="['message', message.role]"
          >
            <div class="message-content">
              {{ part.text }}
            </div>
          </div>
          <details
            v-else-if="part.type?.startsWith('tool-')"
            class="tool"
          >
            <summary class="tool-header">
              {{ (part as ToolUIPart).type?.split('-').slice(1).join('-') }} -
              {{ STATE_TO_LABEL_MAP[(part as ToolUIPart).state ?? 'output-available'] }}
            </summary>
            <div class="tool-content">
              <div class="tool-section">
                <div class="tool-label">Parameters</div>
                <pre><code>{{ JSON.stringify((part as ToolUIPart).input, null, 2) }}</code></pre>
              </div>
              <div class="tool-section">
                <div class="tool-label">
                  {{ (part as ToolUIPart).errorText ? 'Error' : 'Result' }}
                </div>
                <pre><code>{{ JSON.stringify((part as ToolUIPart).output, null, 2) }}</code></pre>
                <div v-if="(part as ToolUIPart).errorText" class="tool-error">
                  {{ (part as ToolUIPart).errorText }}
                </div>
              </div>
            </div>
          </details>
        </div>
      </div>
    </div>
    <form class="input-form" @submit.prevent="handleSubmit">
      <input
        v-model="input"
        type="text"
        placeholder="Ask about the weather..."
        :disabled="chat.status !== 'ready'"
        class="chat-input"
      />
      <button type="submit" class="submit-button" :disabled="chat.status !== 'ready'">
        Send
      </button>
    </form>
  </div>
</template>
<style>
*, *::before, *::after {
  box-sizing: border-box;
}

*:not(dialog) {
  margin: 0;
}

@media (prefers-reduced-motion: no-preference) {
  html {
    interpolate-size: allow-keywords;
  }
}

html {
  font-family: -apple-system, BlinkMacSystemFont, avenir next, avenir, segoe ui, helvetica neue, Adwaita Sans, Cantarell, Ubuntu, roboto, noto, helvetica, arial, sans-serif;
}

body {
  line-height: 1.5;
  -webkit-font-smoothing: antialiased;
}

img, picture, video, canvas, svg {
  display: block;
  max-width: 100%;
}

input, button, textarea, select {
  font: inherit;
}

p, h1, h2, h3, h4, h5, h6 {
  overflow-wrap: break-word;
}

p {
  text-wrap: pretty;
}

h1, h2, h3, h4, h5, h6 {
  text-wrap: balance;
}

.chat-container {
  max-width: 48rem;
  margin: 0 auto;
  padding: 1.5rem;
  height: 100vh;
  display: flex;
  flex-direction: column;
}

.messages {
  flex: 1;
  overflow-y: auto;
  display: flex;
  flex-direction: column;
  gap: 1rem;
}

.message-wrapper {
  display: flex;
  flex-direction: column;
  gap: 0.5rem;
}

.message {
  padding: 0.75rem 1rem;
  border-radius: 0.5rem;
}

.message.user {
  background-color: #3b82f6;
  color: white;
  margin-left: auto;
  max-width: 60%;
}

.message.assistant {
  background-color: #f3f4f6;
  color: #1f2937;
  max-width: 80%;
}

.tool {
  border: 1px solid #d1d5db;
  border-radius: 0.5rem;
  margin: 0.5rem 0;
  overflow: hidden;
}

.tool-header {
  padding: 0.75rem 1rem;
  background-color: #f9fafb;
  cursor: pointer;
  font-weight: 500;
  font-size: 0.875rem;
}

.tool-content {
  padding: 1rem;
  border-top: 1px solid #d1d5db;
}

.tool-section {
  margin-bottom: 1rem;
}

.tool-section:last-child {
  margin-bottom: 0;
}

.tool-label {
  font-size: 0.75rem;
  font-weight: 500;
  text-transform: uppercase;
  color: #6b7280;
  margin-bottom: 0.5rem;
}

.tool pre {
  background-color: #f3f4f6;
  padding: 0.75rem;
  border-radius: 0.375rem;
  overflow-x: auto;
  font-size: 0.875rem;
}

.tool-error {
  color: #dc2626;
  margin-top: 0.5rem;
}

.input-form {
  display: grid;
  grid-template-columns: 1fr auto;
  gap: 0.75rem;
  padding-top: 1rem;
  border-top: 1px solid #e5e7eb;
  margin-top: 1rem;
}

.chat-input {
  padding: 0.75rem 1rem;
  border: 1px solid #d1d5db;
  border-radius: 0.5rem;
  font-size: 1rem;
}

.chat-input:focus {
  outline: none;
  border-color: #3b82f6;
  box-shadow: 0 0 0 3px rgba(59, 130, 246, 0.1);
}

.chat-input:disabled {
  background-color: #f3f4f6;
  cursor: not-allowed;
}

.submit-button {
  padding: 0.75rem 1.5rem;
  background-color: #3b82f6;
  color: white;
  border: none;
  border-radius: 0.5rem;
  font-weight: 500;
  cursor: pointer;
  transition: background-color 0.2s;
}

.submit-button:hover:not(:disabled) {
  background-color: #2563eb;
}

.submit-button:disabled {
  background-color: #9ca3af;
  cursor: not-allowed;
}
</style>
This component connects Chat() to the /api/chat endpoint, sending prompts there and streaming the response back in chunks.
It renders the response text using custom message styling and shows any tool invocations in a collapsible details element.
Test your agent
- Run your Nuxt app with npm run dev
- Open the chat at http://localhost:3000
- Try asking about the weather. If your API key is set up correctly, you'll get a response
Next steps
Congratulations on building your Mastra agent with Nuxt! 🎉
From here, you can extend the project with your own tools and logic:
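For example, here is a minimal sketch of a custom tool you could drop into mastra/tools/; the file, id, and schema are illustrative, not part of the starter.
// Hypothetical custom tool - adjust the id, schema, and logic to your use case.
import { createTool } from '@mastra/core/tools';
import { z } from 'zod';

export const timeTool = createTool({
  id: 'get-current-time',
  description: 'Returns the current time for a given IANA timezone',
  inputSchema: z.object({
    timezone: z.string().describe('An IANA timezone, e.g. Europe/Paris'),
  }),
  execute: async ({ context }) => ({
    time: new Date().toLocaleString('en-US', { timeZone: context.timezone }),
  }),
});
Register it on an agent through its tools map (e.g. tools: { weatherTool, timeTool }) and the model can call it just like the weather tool.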
When you're ready, read more about how Mastra integrates with AI SDK UI and Nuxt, and how to deploy your agent anywhere.