# useAIChat

The core hook. Manages messages, streaming state, and aborting in-flight streams.

```ts
import { useAIChat } from '@react-ai-stream/react'

const { messages, sendMessage, loading, stop, error, clearMessages } = useAIChat(options)
```

## Options

### Endpoint mode (recommended for production)

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `endpoint` | `string` | yes | URL of your streaming API route |
| `headers` | `Record<string, string>` | no | Extra headers sent with every request |
| `body` | `Record<string, unknown>` | no | Extra fields merged into every request body |

```ts
const chat = useAIChat({
  endpoint: '/api/chat',
  headers: { 'X-Session-Id': sessionId },
  body: { persona: 'support-agent', temperature: 0.7 },
})
```

### Direct provider mode

> ⚠️ Direct provider mode exposes your API key in the browser. Use it only for local development or trusted environments.

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `provider` | `'openai' \| 'anthropic'` | yes | Provider name |
| `apiKey` | `string` | yes | API key |
| `model` | `string` | no | Model name (provider default if omitted) |
| `baseURL` | `string` | no | Override base URL (for OpenAI-compatible APIs) |
| `maxTokens` | `number` | no | Max tokens (Anthropic only) |
| `system` | `string` | no | System prompt |

```ts
// Anthropic direct
const chat = useAIChat({
  provider: 'anthropic',
  apiKey: process.env.NEXT_PUBLIC_ANTHROPIC_API_KEY!,
  model: 'claude-sonnet-4-6',
  maxTokens: 2048,
  system: 'You are a helpful assistant.',
})
```

```ts
// OpenAI-compatible (Groq, Together, etc.)
const chat = useAIChat({
  provider: 'openai',
  apiKey: process.env.NEXT_PUBLIC_GROQ_API_KEY!,
  baseURL: 'https://api.groq.com/openai/v1',
  model: 'llama-3.3-70b-versatile',
})
```

### Client mode

Bring a pre-built client, for example one from `AIChatProvider` context:

| Option | Type | Description |
| --- | --- | --- |
| `client` | `AIClient` | Pre-built client from `createAIClient()` |

```ts
import { createAIClient } from '@react-ai-stream/core'

const client = createAIClient({ endpoint: '/api/chat' })

function MyComponent() {
  const chat = useAIChat({ client })
}
```

### Callbacks

| Option | Type | Description |
| --- | --- | --- |
| `onToken` | `(token: string) => void` | Called for each streamed text delta |
| `onComplete` | `(message: Message) => void` | Called once when streaming ends |
| `onError` | `(error: Error) => void` | Called on stream or provider errors |

```ts
const chat = useAIChat({
  endpoint: '/api/chat',
  onToken: (token) => setWordCount((n) => n + token.split(/\s+/).length),
  onComplete: (message) => db.save(message),
  onError: (err) => Sentry.captureException(err),
})
```

## Return values

| Value | Type | Description |
| --- | --- | --- |
| `messages` | `Message[]` | Full conversation history |
| `sendMessage` | `(content: string) => Promise<void>` | Send a user message and start streaming |
| `loading` | `boolean` | `true` while a stream is in progress |
| `stop` | `() => void` | Abort the in-flight stream |
| `error` | `string \| null` | Last error message, `null` if no error |
| `clearMessages` | `() => void` | Reset the conversation (does not abort) |

## Message shape

```ts
interface Message {
  id: string       // crypto.randomUUID()
  role: 'user' | 'assistant' | 'system' | 'tool'
  content: string  // full text (grows during streaming)
  createdAt: Date
}
```

## Behavior notes

- `sendMessage` is a no-op if `loading` is `true`
- `stop()` preserves the partial response in `messages`
- `clearMessages()` does not abort an in-flight stream; call `stop()` first if needed
- The AI client is created lazily on the first `sendMessage` call and cached until `endpoint`, `provider`, `apiKey`, or the context client changes
- On unmount, any in-flight stream is automatically aborted

## TypeScript

All types are exported from `@react-ai-stream/core`:

```ts
import type {
  Message,
  UseAIChatOptions,
  UseAIChatCallbacks,
  UseAIChatReturn,
  AIClient,
  CreateClientOptions,
} from '@react-ai-stream/core'
```