# Vercel AI SDK Patterns for Google Antigravity
Master AI application development with the Vercel AI SDK in Google Antigravity IDE. This comprehensive guide covers streaming responses, tool calling, structured outputs, multi-modal inputs, and provider-agnostic patterns for building intelligent applications.
## Configuration
Configure your Antigravity environment for AI SDK:
```typescript
// .antigravity/ai-sdk.ts
export const aiConfig = {
  providers: ["openai", "anthropic", "google"],
  streaming: true,
  features: {
    toolCalling: true,
    structuredOutput: true,
    multiModal: true
  }
};
```
## Basic Chat Implementation
Create streaming chat interfaces:
```typescript
// app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";
export async function POST(request: Request) {
  const { messages } = await request.json();

  const result = await streamText({
    model: openai("gpt-4-turbo"),
    system: "You are a helpful assistant.",
    messages,
    maxTokens: 1000,
    temperature: 0.7
  });

  return result.toDataStreamResponse();
}

// components/Chat.tsx
"use client";
import { useChat } from "ai/react";

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat({
    api: "/api/chat"
  });

  return (
    <div className="chat-container">
      <div className="messages">
        {messages.map((m) => (
          <div key={m.id} className={`message ${m.role}`}>
            {m.content}
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
          disabled={isLoading}
        />
        <button type="submit" disabled={isLoading}>
          Send
        </button>
      </form>
    </div>
  );
}
```
## Tool Calling Pattern
Implement AI tools for function calling:
```typescript
import { openai } from "@ai-sdk/openai";
import { streamText, tool } from "ai";
import { z } from "zod";
// fetchWeather and searchWeb are placeholder helpers assumed to be
// implemented elsewhere in your project.
const weatherTool = tool({
  description: "Get the current weather for a location",
  parameters: z.object({
    location: z.string().describe("The city and state"),
    unit: z.enum(["celsius", "fahrenheit"]).default("celsius")
  }),
  execute: async ({ location, unit }) => {
    const weather = await fetchWeather(location);
    return {
      temperature: unit === "celsius" ? weather.tempC : weather.tempF,
      condition: weather.condition,
      humidity: weather.humidity
    };
  }
});

const searchTool = tool({
  description: "Search the web for information",
  parameters: z.object({
    query: z.string().describe("The search query")
  }),
  execute: async ({ query }) => {
    const results = await searchWeb(query);
    return results.slice(0, 5);
  }
});

export async function POST(request: Request) {
  const { messages } = await request.json();

  const result = await streamText({
    model: openai("gpt-4-turbo"),
    messages,
    tools: {
      weather: weatherTool,
      search: searchTool
    },
    maxSteps: 5 // allow the model to chain tool calls before answering
  });

  return result.toDataStreamResponse();
}
```
## Structured Output
Generate type-safe structured data:
```typescript
import { openai } from "@ai-sdk/openai";
import { generateObject } from "ai";
import { z } from "zod";
const recipeSchema = z.object({
  name: z.string(),
  description: z.string(),
  prepTime: z.number().describe("Preparation time in minutes"),
  cookTime: z.number().describe("Cooking time in minutes"),
  servings: z.number(),
  ingredients: z.array(z.object({
    item: z.string(),
    amount: z.string(),
    unit: z.string().optional()
  })),
  instructions: z.array(z.string()),
  nutritionInfo: z.object({
    calories: z.number(),
    protein: z.number(),
    carbs: z.number(),
    fat: z.number()
  }).optional()
});

export async function POST(request: Request) {
  const { prompt } = await request.json();

  const { object } = await generateObject({
    model: openai("gpt-4-turbo"),
    schema: recipeSchema,
    prompt: `Create a detailed recipe for: ${prompt}`
  });

  return Response.json(object);
}
```
## Multi-Modal Input
Handle images and text together:
```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";
export async function POST(request: Request) {
  const formData = await request.formData();
  const image = formData.get("image") as File;
  const prompt = formData.get("prompt") as string;

  const imageBuffer = await image.arrayBuffer();
  const base64Image = Buffer.from(imageBuffer).toString("base64");

  const { text } = await generateText({
    model: openai("gpt-4o"), // a vision-capable model
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: prompt },
          {
            type: "image",
            image: `data:${image.type};base64,${base64Image}`
          }
        ]
      }
    ]
  });

  return Response.json({ analysis: text });
}
```
## Provider Switching
Support multiple AI providers:
```typescript
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { google } from "@ai-sdk/google";
import { streamText } from "ai";
const providers = {
  openai: openai("gpt-4-turbo"),
  anthropic: anthropic("claude-3-sonnet-20240229"),
  google: google("gemini-pro")
};

export async function POST(request: Request) {
  const { messages, provider = "openai" } = await request.json();
  const model = providers[provider as keyof typeof providers];

  const result = await streamText({
    model,
    messages,
    maxTokens: 1000
  });

  return result.toDataStreamResponse();
}
```
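A provider map like the one above also makes graceful degradation straightforward: if the preferred provider fails, retry with the next one. The sketch below is illustrative; `withFallback` is a hypothetical helper, not an AI SDK API.

```typescript
// Try each attempt in order, returning the first success.
// If every attempt fails, rethrow the last error.
async function withFallback<T>(attempts: Array<() => Promise<T>>): Promise<T> {
  let lastError: unknown;
  for (const attempt of attempts) {
    try {
      return await attempt();
    } catch (err) {
      lastError = err; // remember the failure and move on
    }
  }
  throw lastError;
}

// Hypothetical usage with the providers map from the route above:
// const result = await withFallback([
//   () => streamText({ model: providers.openai, messages }),
//   () => streamText({ model: providers.anthropic, messages })
// ]);
```

Because each attempt is a thunk, no request is sent to a fallback provider unless the one before it actually failed.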
## Best Practices
Follow these guidelines for AI SDK development:
1. **Stream responses** - Better user experience
2. **Use structured output** - Type-safe AI responses
3. **Implement tools** - Extend AI capabilities
4. **Handle errors** - Graceful degradation
5. **Rate limit** - Protect API costs
6. **Cache responses** - Reduce redundant calls
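Rate limiting and caching (items 5 and 6) can be handled with small in-process helpers in front of the model call. This is a minimal sketch: `TokenBucket` and `ResponseCache` are hypothetical names, not part of the AI SDK, and a production deployment would typically back these with a shared store such as Redis.

```typescript
// Token bucket: allows bursts up to `capacity`, refilling at a steady rate.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,
    private refillPerSecond: number
  ) {
    this.tokens = capacity;
    this.lastRefill = Date.now();
  }

  // Refill based on elapsed time, then try to take one token.
  tryConsume(): boolean {
    const now = Date.now();
    const elapsedSeconds = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(
      this.capacity,
      this.tokens + elapsedSeconds * this.refillPerSecond
    );
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// TTL cache for completed (non-streaming) responses, keyed by prompt.
class ResponseCache<T> {
  private store = new Map<string, { value: T; expires: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry || Date.now() > entry.expires) {
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
  }
}

// Hypothetical wiring in a route handler:
// const limiter = new ResponseCache — no; limiter below, cache for results.
const limiter = new TokenBucket(10, 1);       // 10-request burst, 1/sec sustained
const recipeCache = new ResponseCache<string>(5 * 60_000); // 5-minute TTL
// if (!limiter.tryConsume()) return new Response("Too Many Requests", { status: 429 });
```

A checked-in-process limiter protects a single server instance; behind a load balancer, move the bucket state into shared storage so all instances draw from one budget.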
Google Antigravity IDE provides intelligent AI SDK suggestions and streaming pattern assistance.