awecode/nuxt-tanstack-ai

TanStack AI + Nuxt UI

This layer ships a small chat stack built on @tanstack/ai and @nuxt/ui. The root component is Chat, which wires together useChat, fetchServerSentEvents, and the transcript UI.

Usage

Use Chat anywhere you want a full AI chat interface.

In your nuxt.config.ts, extend the layer:

export default defineNuxtConfig({
  extends: [
    ['github:awecode/nuxt-tanstack-ai', { install: true }],
  ],
})

Then drop Chat into any page or component:

<template>
  <Chat />
</template>

Default SSE endpoint is POST /api/chat (override with the endpoint prop).
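As a minimal sketch, overriding the endpoint looks like this (the /api/support-chat path is hypothetical; it would map to a server/api/support-chat.post.ts route in your project):

```vue
<template>
  <!-- endpoint must match a Nitro route you have actually created -->
  <Chat endpoint="/api/support-chat" />
</template>
```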

Full props example (all props shown; omit or simplify what you do not need):

<script setup lang="ts">
import { getGender, getGenderDef } from '~~/app/utils/tools/gender'

// `user` (used for the avatar props below) should come from your own auth/user state
const getGenderClientTool = getGenderDef.client(getGender)
const chatRef = useTemplateRef('chatRef')
</script>

<template>
  <div class="flex min-h-0 flex-1 flex-col">
<Chat
      ref="chatRef"
      :tools="[getGenderClientTool]"
      endpoint="/api/chat"
      assistant-name="Assistant"
      assistant-image="https://example.com/assistant.png"
      :user-name="user.name"
      :user-image="user.profile_picture"
      :show-tool-usage="true"
      :sticky-prompt="true"
    >
      <div class="flex flex-col gap-3 text-center text-muted text-sm">
        Pick a starter or type and send a message.
        <div class="flex flex-wrap gap-2 justify-center">
          <UButton
            size="sm"
            variant="soft"
            color="neutral"
            @click="chatRef!.sendMessage('Tell me a joke')"
          >
            Tell me a joke
          </UButton>
        </div>
      </div>
    </Chat>
  </div>
</template>

Chat props

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| tools | readonly AnyClientTool[] | [] | Client tool instances from toolDefinition(...).client(...). |
| endpoint | string | '/api/chat' | URL passed to fetchServerSentEvents (your Nitro route). |
| assistantName | string | 'Assistant' | Shown on assistant bubbles. |
| assistantImage | string | (none) | Optional avatar URL for the assistant. |
| userName | string | 'You' | Shown on user bubbles. |
| userImage | string | (none) | Optional avatar URL for the user. |
| showToolUsage | boolean | true | When false, tool-call / tool-result parts are hidden. |
| stickyPrompt | boolean | true | Sticky prompt textarea/composer; set false for a simpler stacked layout. |

Default slot

Chat accepts an optional default slot whose content is rendered while there are no chat messages. Use it for onboarding text, disclaimers, empty-state hints, or custom controls such as suggested prompts. If you pass no default slot, that region is omitted.

Exposed methods

Chat exposes these methods on a template ref (via defineExpose):

| Method | Purpose |
| --- | --- |
| sendMessage(message) | Same as sending from the composer; appends a user message and runs the stream. |
| stop() | Stops the in-flight response (same as the composer's stop action). |
| reload() | Retries the last request when the client is in an error state (same as the composer's retry). |
| clear() | TanStack useChat().clear(): wipes messages and related client state so you get an empty thread again (the default slot reappears when messages is empty). |
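As an illustrative sketch (the button labels and layout are made up), these methods can be driven from a parent component through a template ref:

```vue
<script setup lang="ts">
// Grab the Chat instance; the name matches ref="chatRef" in the template below
const chatRef = useTemplateRef('chatRef')
</script>

<template>
  <Chat ref="chatRef" />
  <div class="flex justify-center gap-2">
    <!-- Optional chaining guards against the ref being null before mount -->
    <UButton @click="chatRef?.sendMessage('Hello!')">Greet</UButton>
    <UButton @click="chatRef?.stop()">Stop</UButton>
    <UButton @click="chatRef?.clear()">New thread</UButton>
  </div>
</template>
```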

Nitro API

Create a route file at:

server/api/chat.post.ts

This exposes POST /api/chat to the client, matching Chat's default endpoint. To use a custom path instead, create the route there and pass it to the Chat component via the endpoint prop.

Adapt the adapter, model name, env vars, and imports to your project. The tool imports below are placeholders; replace them with your own toolDefinition exports. Refer to the TanStack AI documentation to learn more about adapters, tools, and advanced usage.

import { chat, toServerSentEventsResponse } from '@tanstack/ai'
import { createGeminiChat } from '@tanstack/ai-gemini'
import { getGenderDef } from '~~/app/utils/tools/gender'
// import { getGenderServer } from '~~/server/tools/gender'

export default defineEventHandler(async (event) => {
  const config = {
    vertexai: true,
  }
  const apiKey = process.env.NUXT_AI_VERTEX_API_KEY
  if (!apiKey) {
    throw createError({
      statusCode: 500,
      message: 'NUXT_AI_VERTEX_API_KEY not configured',
    })
  }
  const adapter = createGeminiChat('gemini-3.1-flash-lite-preview', apiKey, config)

  try {
    const { messages, conversationId } = await readBody(event)
    const stream = chat({
      adapter,
      messages,
      conversationId,
systemPrompts: ['You are a helpful assistant'],
      tools: [getGenderDef], // for client tools
      // tools: [getGenderServer] // for server tools
    })

    setResponseHeaders(event, {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      'Connection': 'keep-alive',
    })

    return toServerSentEventsResponse(stream)
  } catch (error) {
    throw createError({
      statusCode: 500,
      message: error instanceof Error ? error.message : 'An error occurred',
    })
  }
})

Client vs server tools

  • Client tools: pass the client tool instances to the Chat component via the tools prop, and register the same tool definitions in the server API endpoint so the model can call them.
  • Server tools: pass server-side tools only in the chat API endpoint; no client wiring is needed.

See https://tanstack.com/ai/latest/docs/tools/tools to learn more about TanStack AI tools.
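Tying the two sides together with the gender tool from the snippets above (a sketch only; consult the TanStack AI tools docs for the exact toolDefinition API):

```typescript
// Client side (Vue component): wrap the shared definition with its browser
// implementation and pass it to <Chat :tools="[getGenderClientTool]">
import { getGender, getGenderDef } from '~~/app/utils/tools/gender'

const getGenderClientTool = getGenderDef.client(getGender)

// Server side (server/api/chat.post.ts): register the bare definition so the
// model can request the call; execution happens back on the client.
// chat({ adapter, messages, tools: [getGenderDef], ... })
```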
