Server & Framework Adapters
Vue 3

@react-ai-stream/vue provides a useAIChat composable with the same API surface as the React hook. It uses Vue's shallowRef for reactive state and onUnmounted for cleanup — no Vuex, no Pinia, no global store.

Install

```sh
npm install @react-ai-stream/vue
```

Usage

```vue
<script setup lang="ts">
import { ref } from 'vue'
import { useAIChat } from '@react-ai-stream/vue'

const { messages, sendMessage, loading, stop, error } = useAIChat({
  endpoint: '/api/chat',
})

const input = ref('')

function handleSubmit() {
  if (!input.value.trim() || loading.value) return
  sendMessage(input.value)
  input.value = ''
}
</script>
```
 
<template>
  <div class="chat">
    <ul>
      <li v-for="m in messages.filter(m => m.role !== 'system')" :key="m.id">
        <strong>{{ m.role }}:</strong> {{ m.content }}
      </li>
      <li v-if="loading">Thinking…</li>
    </ul>
 
    <p v-if="error" class="error">{{ error }}</p>
 
    <form @submit.prevent="handleSubmit">
      <input v-model="input" :disabled="loading" placeholder="Ask anything…" />
      <button v-if="loading" type="button" @click="stop">Stop</button>
      <button v-else type="submit">Send</button>
    </form>
  </div>
</template>

Vue auto-unwraps shallowRef values inside <template>, so you use messages directly in the template but messages.value in <script setup>.

API

useAIChat(options)

Options — same as the React hook:

| Option | Type | Description |
| --- | --- | --- |
| `endpoint` | `string` | Your RAIS-compliant API route |
| `client` | `AIClient` | Pre-built client from `createAIClient` |
| `provider` | `'openai' \| 'anthropic'` | Direct provider (dev only — keeps key in browser) |
| `apiKey` | `string` | API key for direct provider mode |
| `onToken` | `(token: string) => void` | Fires per token |
| `onComplete` | `(message: Message) => void` | Fires when stream ends |
| `onError` | `(error: Error) => void` | Fires on error |
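
The callbacks fire in a predictable order: `onToken` for each chunk while the response streams, then `onComplete` once the stream ends (or `onError` if it fails). As a standalone sketch of that dispatch loop in plain TypeScript — `consumeStream` and the shapes below are illustrative, not the library's internals:

```typescript
type Message = { id: string; role: 'user' | 'assistant' | 'system'; content: string }

interface StreamCallbacks {
  onToken?: (token: string) => void
  onComplete?: (message: Message) => void
  onError?: (error: Error) => void
}

// Accumulate tokens from an async iterable, firing callbacks along the way:
// onToken per chunk, then exactly one of onComplete or onError.
async function consumeStream(
  tokens: AsyncIterable<string>,
  cb: StreamCallbacks,
): Promise<Message> {
  let content = ''
  try {
    for await (const token of tokens) {
      content += token
      cb.onToken?.(token)
    }
  } catch (err) {
    cb.onError?.(err as Error)
    throw err
  }
  const message: Message = {
    id: Math.random().toString(36).slice(2), // placeholder id
    role: 'assistant',
    content,
  }
  cb.onComplete?.(message)
  return message
}
```
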

Returns:

| Value | Type | Description |
| --- | --- | --- |
| `messages` | `ShallowRef<Message[]>` | Full conversation history |
| `loading` | `ShallowRef<boolean>` | `true` while streaming |
| `error` | `ShallowRef<string \| null>` | Last error message |
| `sendMessage` | `(content: string) => Promise<void>` | Send a user message |
| `stop` | `() => void` | Abort the current stream |
| `clearMessages` | `() => void` | Reset the conversation |
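
`stop()` amounts to aborting the in-flight request. Assuming the composable is fetch-based and owns one `AbortController` per instance (the internals aren't shown here), the underlying pattern looks roughly like this; the names are illustrative:

```typescript
// Hypothetical sketch of a per-instance stop() built on AbortController.
function createStoppable() {
  let controller: AbortController | null = null
  return {
    // Begin a new stream; any previous one is cancelled first.
    start(): AbortSignal {
      controller?.abort()
      controller = new AbortController()
      return controller.signal
    },
    // Abort whatever is in flight; a no-op if nothing is streaming.
    stop(): void {
      controller?.abort()
      controller = null
    },
  }
}

const stream = createStoppable()
const signal = stream.start() // pass this as fetch(url, { signal })
stream.stop()                 // signal.aborted is now true; the fetch rejects with AbortError
```
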

Multiple instances

Like the React hook, each useAIChat call is completely isolated — its own state, its own abort controller. You can run multiple models in parallel:

```vue
<script setup>
import { useAIChat } from '@react-ai-stream/vue'

const gpt = useAIChat({ endpoint: '/api/chat?model=gpt-4o-mini' })
const claude = useAIChat({ endpoint: '/api/chat?model=claude-sonnet-4-6' })
</script>
```
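
That isolation falls out of the composable being a plain factory: each call closes over its own refs and abort controller. A stripped-down illustration of the principle (not the real implementation):

```typescript
// Minimal illustration of per-call isolation: every invocation creates
// fresh state in its own closure, so instances never share anything.
function createChat() {
  const messages: string[] = []
  return {
    send(content: string) {
      messages.push(content)
    },
    get messages() {
      return [...messages] // defensive copy
    },
  }
}

const gpt = createChat()
const claude = createChat()
gpt.send('hello')
// claude.messages is still empty; there is no shared global store
```
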

Server

The Vue composable talks to the same RAIS-compliant endpoints as the React hook. Use any backend:
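
For example, a bare Node route that streams a response chunk by chunk could look like the following. This is a sketch only: the actual RAIS wire format and the provider call are omitted, and the handler just writes plain-text tokens to show the streaming shape.

```typescript
import { createServer } from 'node:http'

// Hypothetical minimal streaming endpoint. A real handler would forward
// the request body to a provider and relay its stream in the RAIS format;
// here we flush hard-coded chunks to illustrate incremental writes.
const server = createServer(async (req, res) => {
  if (req.method !== 'POST' || req.url !== '/api/chat') {
    res.writeHead(404)
    res.end()
    return
  }
  res.writeHead(200, {
    'Content-Type': 'text/plain; charset=utf-8',
    Connection: 'close',
  })
  for (const token of ['Hello', ', ', 'world']) {
    res.write(token) // each write reaches the client as soon as it flushes
    await new Promise((resolve) => setTimeout(resolve, 10))
  }
  res.end()
})

server.listen(0) // random free port; use a fixed port in a real app
```
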