Guidance for building UIs with Nuxt UI, the official Tailwind-based component library for Nuxt.
`references/layouts/chat.md`
# Chat Layout

Build AI chat interfaces with message streams, reasoning, tool calling, and Vercel AI SDK integration.

## When to use

- AI chatbot interfaces
- Customer support chat
- Any conversational UI with streaming responses

## Setup

### Install dependencies

**Nuxt:**

```bash
pnpm add ai @ai-sdk/gateway @ai-sdk/vue @comark/nuxt
```

**Vue (Vite):**

```bash
pnpm add ai @ai-sdk/gateway @ai-sdk/vue @comark/vue
```

### Register Comark module

**Nuxt:**

```ts [nuxt.config.ts]
export default defineNuxtConfig({
  modules: [
    '@nuxt/ui',
    '@comark/nuxt'
  ]
})
```

**Vue (Vite):** No module registration is needed; import directly from `@comark/vue`.

> `@comark/nuxt` (or `@comark/vue` for Vue projects) provides the `Comark` component used to render AI responses as streaming Markdown; it incrementally renders tokens as they arrive and automatically enables Nuxt UI's prose components.

### Dark mode for syntax highlighting

When using the `highlight` plugin, add the following CSS to your stylesheet:

```css [main.css]
html.dark .shiki span {
  color: var(--shiki-dark) !important;
  background-color: var(--shiki-dark-bg) !important;
  font-style: var(--shiki-dark-font-style) !important;
  font-weight: var(--shiki-dark-font-weight) !important;
  text-decoration: var(--shiki-dark-text-decoration) !important;
}
```

### Server endpoint

Using [Vercel AI Gateway](https://vercel.com/ai-gateway) (recommended):

```ts [server/api/chat.post.ts]
import { streamText, convertToModelMessages } from 'ai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages)
  }).toUIMessageStreamResponse()
})
```

Or with a direct provider (e.g., `pnpm add @ai-sdk/openai`):

```ts [server/api/chat.post.ts]
import { streamText, convertToModelMessages } from 'ai'
import { openai } from '@ai-sdk/openai'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: openai('gpt-5-nano'),
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages)
  }).toUIMessageStreamResponse()
})
```

## Component tree

```
UDashboardPanel
├── #header → UDashboardNavbar
├── #body → UContainer → UChatMessages
│   ├── #content → UChatReasoning, UChatTool, Comark
│   └── #indicator (loading)
└── #footer → UContainer → UChatPrompt
    └── UChatPromptSubmit
```

## Full page chat

```vue [pages/chat/[id].vue]
<script setup lang="ts">
import { isReasoningUIPart, isTextUIPart, isToolUIPart, getToolName } from 'ai'
import { Chat } from '@ai-sdk/vue'
import { isPartStreaming, isToolStreaming } from '@nuxt/ui/utils/ai'
import highlight from '@comark/nuxt/plugins/highlight'

definePageMeta({ layout: 'dashboard' })

const input = ref('')

const chat = new Chat({
  onError(error) {
    console.error(error)
  }
})

function onSubmit() {
  if (!input.value.trim()) return
  chat.sendMessage({ text: input.value })
  input.value = ''
}
</script>

<template>
  <UDashboardPanel>
    <template #header>
      <UDashboardNavbar title="Chat" />
    </template>

    <template #body>
      <UContainer>
        <UChatMessages :messages="chat.messages" :status="chat.status">
          <template #content="{ message }">
            <template v-for="(part, index) in message.parts" :key="`${message.id}-${part.type}-${index}`">
              <UChatReasoning
                v-if="isReasoningUIPart(part)"
                :text="part.text"
                :streaming="isPartStreaming(part)"
              >
                <Comark
                  :markdown="part.text"
                  :streaming="isPartStreaming(part)"
                  :plugins="[highlight()]"
                  class="*:first:mt-0 *:last:mb-0"
                />
              </UChatReasoning>

              <UChatTool
                v-else-if="isToolUIPart(part)"
                :text="getToolName(part)"
                :streaming="isToolStreaming(part)"
              />

              <template v-else-if="isTextUIPart(part)">
                <Comark
                  v-if="message.role === 'assistant'"
                  :markdown="part.text"
                  :streaming="isPartStreaming(part)"
                  :plugins="[highlight()]"
                  class="*:first:mt-0 *:last:mb-0"
                />
                <p v-else-if="message.role === 'user'" class="whitespace-pre-wrap">
                  {{ part.text }}
                </p>
              </template>
            </template>
          </template>
        </UChatMessages>
      </UContainer>
    </template>

    <template #footer>
      <UContainer class="pb-4 sm:pb-6">
        <UChatPrompt v-model="input" :error="chat.error" @submit="onSubmit">
          <UChatPromptSubmit :status="chat.status" @stop="chat.stop()" @reload="chat.regenerate()" />
        </UChatPrompt>
      </UContainer>
    </template>
  </UDashboardPanel>
</template>
```

## Key components

- `UChatMessages` — scrollable message list with auto-scroll. Props: `messages`, `status`. Slots: `#content` (per message), `#actions`, `#indicator`.
- `UChatMessage` — individual bubble. Props: `message`, `side` (`'left'`/`'right'`).
- `UChatReasoning` — collapsible reasoning block. Auto-opens during streaming, auto-closes when done. Use `isPartStreaming(part)` from `@nuxt/ui/utils/ai`.
- `UChatTool` — tool invocation status. Use `isToolStreaming(part)`. Variants: `'inline'` (default), `'card'`.
- `UChatPrompt` — enhanced textarea. Accepts all Textarea props plus an `error` prop.
- `UChatPromptSubmit` — submit button with automatic status handling (send/stop/reload).
- `UChatPalette` — layout wrapper for chat inside overlays.

## Chat in a modal

```vue
<UModal v-model:open="isOpen">
  <template #content>
    <UChatPalette>
      <UChatMessages :messages="chat.messages" :status="chat.status" />

      <template #prompt>
        <UChatPrompt v-model="input" @submit="onSubmit">
          <UChatPromptSubmit :status="chat.status" />
        </UChatPrompt>
      </template>
    </UChatPalette>
  </template>
</UModal>
```

## With model selector

```vue
<UChatPrompt v-model="input" @submit="onSubmit">
  <UChatPromptSubmit :status="chat.status" />

  <template #footer>
    <USelect
      v-model="model"
      :icon="models.find(m => m.value === model)?.icon"
      placeholder="Select a model"
      variant="ghost"
      :items="models"
    />
  </template>
</UChatPrompt>
```

## Conversation sidebar

Combine with the dashboard layout for a ChatGPT-like interface:

```vue [layouts/dashboard.vue]
<template>
  <UDashboardGroup>
    <UDashboardSidebar collapsible resizable>
      <template #header>
        <UButton icon="i-lucide-plus" label="New chat" block />
      </template>

      <template #default>
        <UNavigationMenu :items="conversations" orientation="vertical" />
      </template>
    </UDashboardSidebar>

    <slot />
  </UDashboardGroup>
</template>
```
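The model selector above binds `model` and `models`, which the snippet leaves undefined. A minimal sketch of that state and of the lookup the `:icon` expression performs, assuming the labels, icons, and model list are illustrative (values follow the AI Gateway `provider/model` id format; wrap `model` in a Vue `ref` in your `<script setup>`):

```typescript
// Hypothetical model list for the USelect `:items` prop.
interface ModelItem {
  label: string
  value: string
  icon: string
}

const models: ModelItem[] = [
  { label: 'Claude Sonnet', value: 'anthropic/claude-sonnet-4.6', icon: 'i-lucide-sparkles' },
  { label: 'GPT-5 Nano', value: 'openai/gpt-5-nano', icon: 'i-lucide-bot' }
]

// Mirrors the template's `:icon` binding: find the selected model's icon,
// or undefined when nothing matches (USelect then shows no icon).
function iconFor(selected: string): string | undefined {
  return models.find(m => m.value === selected)?.icon
}
```

To use the selection server-side, send it along with the message (e.g. in `sendMessage`'s body) and read it in the endpoint instead of the hardcoded model id.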
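The sidebar layout references a `conversations` array that the snippet does not define. One way to build those `UNavigationMenu` items from stored chats, as a sketch: the `Conversation` shape, the icon, and the `/chat/:id` route are assumptions to adapt to your persistence layer.

```typescript
// Hypothetical stored-chat shape; replace with your own data source.
interface Conversation {
  id: string
  title: string
}

// Map stored chats to navigation items linking to the full-page chat route.
function toNavigationItems(chats: Conversation[]) {
  return chats.map(chat => ({
    label: chat.title,
    to: `/chat/${chat.id}`,
    icon: 'i-lucide-message-circle'
  }))
}

const conversations = toNavigationItems([
  { id: 'abc', title: 'Trip planning' },
  { id: 'def', title: 'Code review help' }
])
```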