# Chat
Nuxt UI provides a set of components designed to build AI-powered chat interfaces. They integrate seamlessly with the Vercel AI SDK for streaming responses, reasoning, tool calling, and more.
## Components
| Component | Description |
|---|---|
| ChatMessages | Scrollable message list with auto-scroll and loading indicator. |
| ChatMessage | Individual message bubble with avatar, actions, and slots. |
| ChatPrompt | Enhanced textarea for submitting prompts. |
| ChatPromptSubmit | Submit button with automatic status handling. |
| ChatReasoning | Collapsible block for AI reasoning / thinking process. |
| ChatTool | Collapsible block for AI tool invocation status. |
| ChatShimmer | Text shimmer animation for streaming states. |
| ChatPalette | Layout wrapper for embedding chat in modals or drawers. |
## Installation
The Chat components are designed to be used with the Vercel AI SDK, specifically the `Chat` class for managing chat state and streaming responses.
Install the required dependencies:
```bash [pnpm]
pnpm add ai @ai-sdk/gateway @ai-sdk/vue
```

```bash [yarn]
yarn add ai @ai-sdk/gateway @ai-sdk/vue
```

```bash [npm]
npm install ai @ai-sdk/gateway @ai-sdk/vue
```

```bash [bun]
bun add ai @ai-sdk/gateway @ai-sdk/vue
```
### Server Setup

Create a server API endpoint to handle chat requests using `streamText`. You can use the Vercel AI Gateway to access AI models through a centralized endpoint:
```ts
import { streamText, convertToModelMessages } from 'ai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages)
  }).toUIMessageStreamResponse()
})
```
#### Reasoning

To enable reasoning, configure `providerOptions` for your provider (Anthropic, Google, or OpenAI):
```ts
import { streamText, convertToModelMessages } from 'ai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    providerOptions: {
      anthropic: {
        thinking: {
          type: 'adaptive'
        },
        effort: 'low'
      },
      google: {
        thinkingConfig: {
          includeThoughts: true,
          thinkingLevel: 'low'
        }
      },
      openai: {
        reasoningEffort: 'low',
        reasoningSummary: 'detailed'
      }
    }
  }).toUIMessageStreamResponse()
})
```
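With reasoning enabled, assistant messages on the client carry parts of type `reasoning` alongside the usual `text` parts; this is what `UChatReasoning` renders in the client setup below. A simplified sketch of that shape (illustrative types only, not the full `UIMessage` part union from `ai`):

```typescript
// Illustrative shape only: the real part union from `ai` has more kinds.
type SimplePart =
  | { type: 'reasoning'; text: string }
  | { type: 'text'; text: string }

const parts: SimplePart[] = [
  { type: 'reasoning', text: 'The user wants a summary, so gather the key points first.' },
  { type: 'text', text: 'Here is the summary.' }
]

// `UChatReasoning` renders the reasoning parts; the text parts are rendered separately.
const reasoningText = parts
  .filter(part => part.type === 'reasoning')
  .map(part => part.text)
  .join('\n')

console.log(reasoningText) // 'The user wants a summary, so gather the key points first.'
```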
#### Web Search

Some providers offer built-in web search tools, including Anthropic, Google, and OpenAI:
```ts [Anthropic]
import { streamText, convertToModelMessages } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    tools: {
      web_search: anthropic.tools.webSearch_20250305({})
    }
  }).toUIMessageStreamResponse()
})
```
```ts [Google]
import { streamText, convertToModelMessages } from 'ai'
import { google } from '@ai-sdk/google'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('google/gemini-3-flash'),
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    tools: {
      google_search: google.tools.googleSearch({})
    }
  }).toUIMessageStreamResponse()
})
```
```ts [OpenAI]
import { streamText, convertToModelMessages } from 'ai'
import { openai } from '@ai-sdk/openai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('openai/gpt-5-nano'),
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    tools: {
      web_search: openai.tools.webSearch({})
    }
  }).toUIMessageStreamResponse()
})
```
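However the search is invoked, it surfaces on the client as a tool part in `message.parts`, whose `type` is the tool name prefixed with `tool-`. A simplified sketch of that convention (illustrative stand-ins for the SDK's `isToolUIPart` and `getToolName` helpers, not their actual implementations):

```typescript
// Illustrative shape only: a tool invocation appears as a part whose
// `type` is the tool name prefixed with `tool-`.
type ToolPart = { type: `tool-${string}`; state: string }
type TextPart = { type: 'text'; text: string }
type Part = ToolPart | TextPart

const parts: Part[] = [
  { type: 'tool-web_search', state: 'output-available' },
  { type: 'text', text: 'Here is what I found.' }
]

// Simplified stand-ins for `isToolUIPart` / `getToolName` from `ai`.
const isToolPart = (part: Part): part is ToolPart => part.type.startsWith('tool-')
const toolName = (part: ToolPart) => part.type.slice('tool-'.length)

const names = parts.filter(isToolPart).map(toolName)
console.log(names) // ['web_search']
```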
#### Tool Calling (MCP)

You can enhance your chatbot with tool calling capabilities using the Model Context Protocol (`@ai-sdk/mcp`). This allows the AI to search your documentation or perform other actions:
```ts
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js'
import { streamText, convertToModelMessages, stepCountIs } from 'ai'
import { experimental_createMCPClient } from '@ai-sdk/mcp'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  const httpTransport = new StreamableHTTPClientTransport(
    new URL('https://your-app.com/mcp')
  )
  const httpClient = await experimental_createMCPClient({
    transport: httpTransport
  })

  const tools = await httpClient.tools()

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant. Use your tools to search for relevant information before answering questions.',
    messages: await convertToModelMessages(messages),
    stopWhen: stepCountIs(6),
    tools,
    onFinish: async () => {
      await httpClient.close()
    },
    onError: async (error) => {
      console.error(error)
      await httpClient.close()
    }
  }).toUIMessageStreamResponse()
})
```
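The `stopWhen: stepCountIs(6)` option caps the generate/tool-call loop so the model cannot keep invoking tools indefinitely. An illustrative sketch of how such a stop condition behaves (not the AI SDK's actual implementation):

```typescript
// Illustrative sketch: the loop ends once the given number of steps has run.
type Step = { toolCalls: unknown[] }

const stepCountIs = (max: number) =>
  ({ steps }: { steps: Step[] }) => steps.length >= max

const shouldStop = stepCountIs(6)
const stepsSoFar = (n: number): Step[] =>
  Array.from({ length: n }, () => ({ toolCalls: [] }))

console.log(shouldStop({ steps: stepsSoFar(5) })) // false — keep looping
console.log(shouldStop({ steps: stepsSoFar(6) })) // true — stop
```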
### Client Setup

Use the `Chat` class from `@ai-sdk/vue` to manage chat state and connect to your server endpoint:
```vue
<script setup lang="ts">
import type { UIMessage } from 'ai'
import { isReasoningUIPart, isTextUIPart, isToolUIPart, getToolName } from 'ai'
import { Chat } from '@ai-sdk/vue'
import { isReasoningStreaming, isToolStreaming } from '@nuxt/ui/utils/ai'

const input = ref('')

const chat = new Chat({
  onError(error) {
    console.error(error)
  }
})

function onSubmit() {
  chat.sendMessage({ text: input.value })
  input.value = ''
}
</script>

<template>
  <UChatMessages
    :messages="chat.messages"
    :status="chat.status"
  >
    <template #content="{ message }">
      <template
        v-for="(part, index) in message.parts"
        :key="`${message.id}-${part.type}-${index}`"
      >
        <UChatReasoning
          v-if="isReasoningUIPart(part)"
          :text="part.text"
          :streaming="isReasoningStreaming(message, index, chat)"
        >
          <MDC
            :value="part.text"
            :cache-key="`reasoning-${message.id}-${index}`"
            class="*:first:mt-0 *:last:mb-0"
          />
        </UChatReasoning>

        <UChatTool
          v-else-if="isToolUIPart(part)"
          :text="getToolName(part)"
          :streaming="isToolStreaming(part)"
        />

        <MDC
          v-else-if="isTextUIPart(part)"
          :value="part.text"
          :cache-key="`${message.id}-${index}`"
          class="*:first:mt-0 *:last:mb-0"
        />
      </template>
    </template>
  </UChatMessages>

  <UChatPrompt
    v-model="input"
    :error="chat.error"
    @submit="onSubmit"
  >
    <UChatPromptSubmit
      :status="chat.status"
      @stop="chat.stop()"
      @reload="chat.regenerate()"
    />
  </UChatPrompt>
</template>
```
This example uses the `MDC` component from `@nuxtjs/mdc` to render messages as Markdown. Since Nuxt UI provides pre-styled prose components, your content is automatically styled.