Chat

AI SDK
Build AI chat interfaces with streaming, reasoning, and tool calling.

Nuxt UI provides a set of components designed to build AI-powered chat interfaces. They integrate seamlessly with the Vercel AI SDK for streaming responses, reasoning, tool calling, and more.

Check out the Nuxt and Vue AI Chat templates on GitHub for production-ready implementations.

Components

ChatMessages: Scrollable message list with auto-scroll and loading indicator.
ChatMessage: Individual message bubble with avatar, actions, and slots.
ChatPrompt: Enhanced textarea for submitting prompts.
ChatPromptSubmit: Submit button with automatic status handling.
ChatReasoning: Collapsible block for the AI reasoning / thinking process.
ChatTool: Collapsible block for AI tool invocation status.
ChatShimmer: Text shimmer animation for streaming states.
ChatPalette: Layout wrapper for embedding chat in modals or drawers.
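
As a quick orientation, a minimal chat screen typically composes a few of these components. The sketch below is illustrative only: `messages`, `status`, `input`, and `onSubmit` are placeholder state and handlers, wired up fully in the Client Setup section below.

```vue
<!-- Minimal composition sketch: real component names, placeholder state. -->
<template>
  <UChatMessages :messages="messages" :status="status" />

  <UChatPrompt v-model="input" @submit="onSubmit">
    <UChatPromptSubmit :status="status" />
  </UChatPrompt>
</template>
```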

Installation

The Chat components are designed to be used with the Vercel AI SDK, specifically the Chat class for managing chat state and streaming responses.

Install the required dependencies:

pnpm add ai @ai-sdk/gateway @ai-sdk/vue

Server Setup

Create a server API endpoint to handle chat requests using streamText. You can use the Vercel AI Gateway to access AI models through a centralized endpoint:

server/api/chat.post.ts
import { streamText, convertToModelMessages } from 'ai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages)
  }).toUIMessageStreamResponse()
})
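
On the wire, the client posts a JSON body containing a `messages` array of UI messages, which `readBody(event)` destructures above. The shape below is a sketch of what the endpoint receives; the exact `UIMessage` fields (`id`, `role`, `parts`) follow the AI SDK's UI message format.

```typescript
// Sketch of the JSON body the Chat client POSTs to /api/chat.
// Each UI message carries its content as an array of typed parts.
const body = {
  messages: [
    {
      id: 'msg-1',
      role: 'user',
      parts: [{ type: 'text', text: 'Hello!' }]
    }
  ]
}

// The server handler converts these UI messages to model messages
// with convertToModelMessages() before calling streamText().
console.log(JSON.stringify(body))
```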

Reasoning

To enable reasoning, configure providerOptions for your provider (Anthropic, Google, OpenAI):

server/api/chat.post.ts
import { streamText, convertToModelMessages } from 'ai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    providerOptions: {
      anthropic: {
        thinking: {
          type: 'adaptive'
        },
        effort: 'low'
      },
      google: {
        thinkingConfig: {
          includeThoughts: true,
          thinkingLevel: 'low'
        }
      },
      openai: {
        reasoningEffort: 'low',
        reasoningSummary: 'detailed'
      }
    }
  }).toUIMessageStreamResponse()
})

Web Search

Some providers offer built-in web search tools: Anthropic, Google, OpenAI.

server/api/chat.post.ts

import { streamText, convertToModelMessages } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    tools: {
      web_search: anthropic.tools.webSearch_20250305({})
    }
  }).toUIMessageStreamResponse()
})

Tool Calling (MCP)

You can enhance your chatbot with tool calling capabilities using the Model Context Protocol (@ai-sdk/mcp). This allows the AI to search your documentation or perform other actions. The `stopWhen: stepCountIs(6)` option caps the agent loop, so the model stops after at most six rounds of tool calls and follow-up generations:

server/api/chat.post.ts
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js'
import { streamText, convertToModelMessages, stepCountIs } from 'ai'
import { experimental_createMCPClient } from '@ai-sdk/mcp'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  const httpTransport = new StreamableHTTPClientTransport(
    new URL('https://your-app.com/mcp')
  )
  const httpClient = await experimental_createMCPClient({
    transport: httpTransport
  })
  const tools = await httpClient.tools()

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant. Use your tools to search for relevant information before answering questions.',
    messages: await convertToModelMessages(messages),
    stopWhen: stepCountIs(6),
    tools,
    onFinish: async () => {
      await httpClient.close()
    },
    onError: async ({ error }) => {
      console.error(error)
      await httpClient.close()
    }
  }).toUIMessageStreamResponse()
})

Client Setup

Use the Chat class from @ai-sdk/vue to manage chat state and connect to your server endpoint:

<script setup lang="ts">
import type { UIMessage } from 'ai'
import { isReasoningUIPart, isTextUIPart, isToolUIPart, getToolName } from 'ai'
import { Chat } from '@ai-sdk/vue'
import { isReasoningStreaming, isToolStreaming } from '@nuxt/ui/utils/ai'

const input = ref('')

const chat = new Chat({
  onError(error) {
    console.error(error)
  }
})

function onSubmit() {
  chat.sendMessage({ text: input.value })

  input.value = ''
}
</script>

<template>
  <UChatMessages
    :messages="chat.messages"
    :status="chat.status"
  >
    <template #content="{ message }">
      <template
        v-for="(part, index) in message.parts"
        :key="`${message.id}-${part.type}-${index}`"
      >
        <UChatReasoning
          v-if="isReasoningUIPart(part)"
          :text="part.text"
          :streaming="isReasoningStreaming(message, index, chat)"
        >
          <MDC
            :value="part.text"
            :cache-key="`reasoning-${message.id}-${index}`"
            class="*:first:mt-0 *:last:mb-0"
          />
        </UChatReasoning>

        <UChatTool
          v-else-if="isToolUIPart(part)"
          :text="getToolName(part)"
          :streaming="isToolStreaming(part)"
        />

        <MDC
          v-else-if="isTextUIPart(part)"
          :value="part.text"
          :cache-key="`${message.id}-${index}`"
          class="*:first:mt-0 *:last:mb-0"
        />
      </template>
    </template>
  </UChatMessages>

  <UChatPrompt
    v-model="input"
    :error="chat.error"
    @submit="onSubmit"
  >
    <UChatPromptSubmit
      :status="chat.status"
      @stop="chat.stop()"
      @reload="chat.regenerate()"
    />
  </UChatPrompt>
</template>

In this example, we use the MDC component from @nuxtjs/mdc to render message text as Markdown. Since Nuxt UI provides pre-styled prose components, the rendered content is styled automatically.

Read the full Build an AI Chatbot tutorial for a step-by-step guide.