AI SDK
Multi-provider AI with streaming and tool use — build chat interfaces, completions, and AI-powered features.
Read this when adding AI features to your app: chatbots, assistants, or AI-powered workflows.
Why Vercel AI SDK
The AI SDK provides a unified interface across AI providers. Write your code once, then switch between OpenAI, Anthropic, Google, or Mistral by changing a single configuration value; the rest of your code stays the same.
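That configuration change can be captured in a lookup table. The sketch below is hypothetical (the helper and default model ids are illustrative, not part of the SDK or Catalyst); the package names are the SDK's real per-provider packages:

```typescript
// Hypothetical config map: which @ai-sdk package and which default model id
// to use per provider. Package names are real; the defaults are illustrative.
type ProviderName = "openai" | "anthropic" | "google" | "mistral"

interface ProviderConfig {
  pkg: string          // SDK package exporting the model factory
  defaultModel: string // illustrative default model id
}

const PROVIDERS: Record<ProviderName, ProviderConfig> = {
  openai:    { pkg: "@ai-sdk/openai",    defaultModel: "gpt-4-turbo" },
  anthropic: { pkg: "@ai-sdk/anthropic", defaultModel: "claude-3-sonnet-20240229" },
  google:    { pkg: "@ai-sdk/google",    defaultModel: "gemini-1.5-pro" },
  mistral:   { pkg: "@ai-sdk/mistral",   defaultModel: "mistral-large-latest" },
}

// Resolve a provider name (e.g. read from an env var) to its config.
function resolveProvider(name: string): ProviderConfig {
  if (!(name in PROVIDERS)) {
    throw new Error(`Unknown AI provider: ${name}`)
  }
  return PROVIDERS[name as ProviderName]
}
```

In application code you would import the matching factory (e.g. `openai` from `@ai-sdk/openai`) and pass `factory(config.defaultModel)` to `streamText` or `generateText`; keeping the choice in one table is what makes the provider swap a one-line change.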
It handles streaming responses, tool/function calling, and provides React hooks that make building chat interfaces straightforward. The SDK is actively maintained by Vercel and has excellent TypeScript support.
Key Features
What the AI SDK provides:
Multi-Provider
One API, many models. OpenAI, Anthropic, Google, Mistral — switch providers easily.
Streaming
Real-time responses. Stream tokens as they're generated for responsive UIs.
Tool Use
Function calling. Let AI call your functions and use the results.
Catalyst Integration
Catalyst includes AI infrastructure ready to use:
lib/ai/ - AI utilities, provider config, and helpers
app/(app)/app/ai/chat/ - example chat interface with streaming
app/(app)/app/ai/settings/ - AI settings and provider configuration
app/api/ai/ - API routes for AI endpoints
Quick Start
Server-side streaming
```ts
// app/api/chat/route.ts
import { streamText } from "ai"
import { openai } from "@ai-sdk/openai"

export async function POST(req: Request) {
  const { messages } = await req.json()

  const result = streamText({
    model: openai("gpt-4-turbo"),
    messages,
  })

  return result.toDataStreamResponse()
}
```
React chat hook
```tsx
"use client"

import { useChat } from "@ai-sdk/react"

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat()

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <input value={input} onChange={handleInputChange} disabled={isLoading} />
    </form>
  )
}
```
Tool calling
```ts
import { streamText, tool } from "ai"
import { openai } from "@ai-sdk/openai"
import { z } from "zod"

const result = streamText({
  model: openai("gpt-4-turbo"),
  messages,
  tools: {
    getWeather: tool({
      description: "Get weather for a location",
      parameters: z.object({
        location: z.string().describe("City name"),
      }),
      execute: async ({ location }) => {
        // Call your weather API here; static values for illustration
        return { temperature: 72, condition: "sunny" }
      },
    }),
  },
})
```
Supported Providers
Catalyst includes SDK packages for these providers:
- OpenAI: GPT-4, GPT-3.5, etc. (OPENAI_API_KEY)
- Anthropic: Claude 3, Claude 2 (ANTHROPIC_API_KEY)
- Google: Gemini Pro, Gemini Flash (GOOGLE_GENERATIVE_AI_API_KEY)
- Mistral: Mistral Large, Medium, Small (MISTRAL_API_KEY)

Set the appropriate API key in your environment to use each provider.
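Because each provider is enabled simply by the presence of its key, you can detect at runtime which providers are usable. A minimal sketch (the helper name is hypothetical, not part of Catalyst; the env var names match the list above):

```typescript
// Hypothetical helper: list providers whose API key is set in the environment.
const PROVIDER_KEYS: Record<string, string> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GOOGLE_GENERATIVE_AI_API_KEY",
  mistral: "MISTRAL_API_KEY",
}

function configuredProviders(
  env: Record<string, string | undefined> = process.env
): string[] {
  return Object.entries(PROVIDER_KEYS)
    .filter(([, key]) => Boolean(env[key])) // keep providers with a non-empty key
    .map(([name]) => name)
}
```

Taking the environment as a parameter keeps the helper easy to test; in app code you would call it with no arguments.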
Configuration
Environment Variables
```bash
# .env.local
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_GENERATIVE_AI_API_KEY=...
MISTRAL_API_KEY=...

# Enable AI features
NEXT_PUBLIC_AI_ENABLED=true
```
The NEXT_PUBLIC_AI_ENABLED flag controls whether AI routes appear in the app navigation.
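Catalyst's actual helper lives in lib/ai/; the check itself can be as small as reading the flag. A sketch, assuming the flag is compared against the literal string "true":

```typescript
// Sketch of an isAIEnabled-style check. NEXT_PUBLIC_* variables are inlined
// at build time by Next.js, so this works on both server and client.
function isAIEnabled(
  flag: string | undefined = process.env.NEXT_PUBLIC_AI_ENABLED
): boolean {
  return flag === "true"
}
```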
Learn More
For AI Agents
Key rules:
- Use streamText for server-side streaming
- Use the useChat hook for React chat interfaces
- Define tools with Zod schemas for type safety
- Check isAIEnabled() before rendering AI features
- API routes go in app/api/ai/
- Keep API keys in environment variables, never in code
Next Steps
Related packages: