Google Antigravity Directory

The #1 directory for Google Antigravity prompts, rules, workflows & MCP servers. Optimized for Gemini 3 agentic development.


© 2026 Antigravity AI Directory. All rights reserved.


This website is not affiliated with, endorsed by, or associated with Google LLC. "Google" and "Gemini" are trademarks of Google LLC.


OpenAI Assistants API Patterns

Build AI assistants with the OpenAI Assistants API for Google Antigravity projects, including threads, tools, and file handling.

openai · assistants · ai · chatbot · api
by Antigravity Team
⭐ 0 Stars
.antigravity
# OpenAI Assistants API Patterns for Google Antigravity

Build intelligent AI assistants with the OpenAI Assistants API in your Google Antigravity IDE projects. This guide covers assistant creation, threads, tool usage, and file handling, optimized for Gemini 3 agentic development.

## Assistant Configuration

Set up the OpenAI client and create assistants:

```typescript
// lib/openai.ts
import OpenAI from 'openai';

export const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Assistant configuration
export const ASSISTANT_CONFIG = {
  name: 'Antigravity Code Assistant',
  instructions: `You are an expert coding assistant for Google Antigravity IDE.
    Your role is to help developers write better code, debug issues, and learn best practices.
    
    Guidelines:
    - Provide clear, concise explanations
    - Include code examples with proper TypeScript types
    - Reference best practices and design patterns
    - Suggest improvements when appropriate
    - Be honest about limitations`,
  model: 'gpt-4-turbo-preview',
  tools: [
    { type: 'code_interpreter' },
    { type: 'file_search' }, // renamed from 'retrieval' in Assistants API v2
    {
      type: 'function',
      function: {
        name: 'search_prompts',
        description: 'Search the Antigravity prompt library for relevant prompts',
        parameters: {
          type: 'object',
          properties: {
            query: {
              type: 'string',
              description: 'The search query',
            },
            category: {
              type: 'string',
              description: 'Optional category filter',
            },
          },
          required: ['query'],
        },
      },
    },
  ],
} satisfies OpenAI.Beta.AssistantCreateParams;

// Create or retrieve assistant
export async function getOrCreateAssistant(): Promise<OpenAI.Beta.Assistant> {
  const assistants = await openai.beta.assistants.list({ limit: 100 });
  
  const existing = assistants.data.find(
    (a) => a.name === ASSISTANT_CONFIG.name
  );
  
  if (existing) {
    return existing;
  }
  
  return openai.beta.assistants.create(ASSISTANT_CONFIG);
}
```

## Thread Management

Handle conversation threads:

```typescript
// lib/threads.ts
import { openai } from './openai';

// Create a new thread
export async function createThread(metadata?: Record<string, string>) {
  return openai.beta.threads.create({
    metadata,
  });
}

// Add a message to a thread
export async function addMessage(
  threadId: string,
  content: string,
  fileIds?: string[]
) {
  return openai.beta.threads.messages.create(threadId, {
    role: 'user',
    content,
    attachments: fileIds?.map((fileId) => ({
      file_id: fileId,
      tools: [{ type: 'code_interpreter' }],
    })),
  });
}

// Get thread messages
export async function getMessages(threadId: string, limit: number = 20) {
  const response = await openai.beta.threads.messages.list(threadId, {
    limit,
    order: 'desc',
  });
  
  return response.data.reverse();
}

// Delete a thread
export async function deleteThread(threadId: string) {
  return openai.beta.threads.del(threadId);
}
```
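
Messages returned by the API carry their text inside content parts rather than as a plain string. A small display helper can flatten them; this is a sketch with local types standing in for the SDK's (in real code you would pass `message.content` from the SDK directly):

```typescript
// lib/format.ts
// Flatten a thread message's content parts into one display string.
// Assumes the Assistants API v2 part shape { type: 'text', text: { value } };
// non-text parts (e.g. image_file) become a short placeholder.

interface TextPart {
  type: 'text';
  text: { value: string };
}

interface OtherPart {
  type: string;
}

type ContentPart = TextPart | OtherPart;

export function extractText(content: ContentPart[]): string {
  return content
    .map((part) =>
      part.type === 'text' ? (part as TextPart).text.value : `[${part.type}]`
    )
    .join('\n');
}
```

This keeps rendering code out of the thread helpers above and degrades gracefully when the assistant returns images or other non-text parts.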

## Running Assistants

Execute assistant runs with streaming:

```typescript
// lib/runs.ts
import OpenAI from 'openai';
import { openai, getOrCreateAssistant } from './openai';
import { searchPrompts } from './prompts';

interface RunOptions {
  threadId: string;
  onMessage?: (message: string) => void;
  onToolCall?: (name: string, args: unknown) => void;
  onComplete?: (messages: OpenAI.Beta.Threads.Message[]) => void;
  onError?: (error: Error) => void;
}

export async function runAssistant({
  threadId,
  onMessage,
  onToolCall,
  onComplete,
  onError,
}: RunOptions) {
  try {
    const assistant = await getOrCreateAssistant();
    
    const stream = openai.beta.threads.runs.stream(threadId, {
      assistant_id: assistant.id,
    });

    // Handle streaming events
    stream
      .on('textDelta', (delta) => {
        if (delta.value) {
          onMessage?.(delta.value);
        }
      })
      .on('toolCallCreated', (toolCall) => {
        onToolCall?.(toolCall.type, toolCall);
      })
      .on('event', async (event) => {
        // Handle tool calls that require action
        if (
          event.event === 'thread.run.requires_action' &&
          event.data.required_action?.type === 'submit_tool_outputs'
        ) {
          const toolCalls = event.data.required_action.submit_tool_outputs.tool_calls;
          const toolOutputs = await processToolCalls(toolCalls);
          
          await openai.beta.threads.runs.submitToolOutputsStream(
            threadId,
            event.data.id,
            { tool_outputs: toolOutputs }
          );
        }
      })
      .on('error', (error) => {
        onError?.(error);
      });

    // Wait for completion
    const finalRun = await stream.finalRun();
    
    if (finalRun.status === 'completed') {
      const messages = await openai.beta.threads.messages.list(threadId);
      onComplete?.(messages.data);
    }

    return finalRun;
  } catch (error) {
    onError?.(error as Error);
    throw error;
  }
}

// Process tool calls
async function processToolCalls(
  toolCalls: OpenAI.Beta.Threads.Runs.RequiredActionFunctionToolCall[]
): Promise<OpenAI.Beta.Threads.Runs.RunSubmitToolOutputsParams.ToolOutput[]> {
  return Promise.all(
    toolCalls.map(async (toolCall) => {
      const args = JSON.parse(toolCall.function.arguments);
      let output: string;

      switch (toolCall.function.name) {
        case 'search_prompts': {
          const results = await searchPrompts(args.query, args.category);
          output = JSON.stringify(results);
          break;
        }
        default:
          output = JSON.stringify({ error: 'Unknown function' });
      }

      return {
        tool_call_id: toolCall.id,
        output,
      };
    })
  );
}
```
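
Runs can fail transiently (rate limits, brief outages), so it is worth wrapping calls like `runAssistant` in a retry helper. This is a generic sketch, not part of the SDK; the attempt count and delays are illustrative:

```typescript
// lib/retry.ts
// Generic retry with exponential backoff for transient API errors.

export async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;

  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      if (attempt < maxAttempts - 1) {
        // Exponential backoff: 500ms, 1000ms, 2000ms, ...
        const delay = baseDelayMs * 2 ** attempt;
        await new Promise((resolve) => setTimeout(resolve, delay));
      }
    }
  }

  throw lastError;
}
```

Usage: `const run = await withRetry(() => runAssistant({ threadId }));`. In production you would likely retry only on specific status codes (e.g. 429) rather than on every error.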

## File Handling

Upload and manage files for assistants:

```typescript
// lib/files.ts
import { toFile } from 'openai';
import { openai } from './openai';

// Upload a file for the assistant
export async function uploadFile(
  file: File | Buffer,
  filename: string,
  purpose: 'assistants' | 'vision' = 'assistants'
) {
  // Raw Buffers need wrapping before upload; toFile ships with the SDK
  const fileObject = Buffer.isBuffer(file)
    ? await toFile(file, filename)
    : file;

  return openai.files.create({
    file: fileObject,
    purpose,
  });
}

// Upload code files for analysis
export async function uploadCodeFiles(files: { name: string; content: string }[]) {
  const uploadedFiles = await Promise.all(
    files.map(async (file) => {
      const buffer = Buffer.from(file.content, 'utf-8');
      return uploadFile(buffer, file.name);
    })
  );

  return uploadedFiles.map((f) => f.id);
}

// Delete a file
export async function deleteFile(fileId: string) {
  return openai.files.del(fileId);
}

// List files
export async function listFiles(purpose?: 'assistants' | 'vision') {
  return openai.files.list({ purpose });
}

// Retrieve file content
export async function getFileContent(fileId: string) {
  const response = await openai.files.content(fileId);
  return response.text();
}
```
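
Before calling `uploadCodeFiles`, it can help to screen out files the assistant cannot use. This pre-upload filter is a hypothetical addition; the extension list and size cap are illustrative, not API requirements:

```typescript
// lib/upload-filter.ts
// Hypothetical pre-upload filter: keep only text-like source files under a
// size cap, so binaries and oversized blobs never reach the Files API.

const UPLOADABLE_EXTENSIONS = ['.ts', '.tsx', '.js', '.py', '.md', '.json'];
const MAX_FILE_BYTES = 512 * 1024; // 512 KB, an illustrative cap

export function filterUploadable(
  files: { name: string; content: string }[]
): { name: string; content: string }[] {
  return files.filter((file) => {
    const hasAllowedExt = UPLOADABLE_EXTENSIONS.some((ext) =>
      file.name.endsWith(ext)
    );
    const sizeOk = Buffer.byteLength(file.content, 'utf-8') <= MAX_FILE_BYTES;
    return hasAllowedExt && sizeOk;
  });
}
```

Chained with the helpers above: `const ids = await uploadCodeFiles(filterUploadable(files));`.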

## API Route Integration

Create API routes for the assistant:

```typescript
// app/api/assistant/chat/route.ts
import { NextRequest } from 'next/server';
import { createThread, addMessage, getMessages } from '@/lib/threads';
import { runAssistant } from '@/lib/runs';

export async function POST(request: NextRequest) {
  const { message, threadId: existingThreadId } = await request.json();

  // Create or use existing thread
  const threadId = existingThreadId || (await createThread()).id;

  // Add user message
  await addMessage(threadId, message);

  // Create streaming response
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      // Write one SSE event; the trailing blank line terminates it
      const send = (payload: object) => {
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(payload)}\n\n`));
      };

      await runAssistant({
        threadId,
        onMessage: (text) => send({ type: 'text', content: text }),
        onToolCall: (name, args) => send({ type: 'tool', name, args }),
        onComplete: () => {
          send({ type: 'done', threadId });
          controller.close();
        },
        onError: (error) => {
          send({ type: 'error', message: error.message });
          controller.close();
        },
      });
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
  });
}

// GET messages from a thread
export async function GET(request: NextRequest) {
  const threadId = request.nextUrl.searchParams.get('threadId');
  
  if (!threadId) {
    return Response.json({ error: 'Thread ID required' }, { status: 400 });
  }

  const messages = await getMessages(threadId);
  return Response.json({ messages });
}
```

## Best Practices

1. **Reuse assistants** instead of creating new ones for each user
2. **Use streaming** for responsive user experiences
3. **Implement tool functions** for extended capabilities
4. **Handle errors gracefully** with proper fallbacks
5. **Clean up old threads** to manage storage
6. **Use file attachments** for code analysis tasks
7. **Monitor usage** to control API costs
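
On the client, the streaming route's `data:` events can be consumed with `fetch` and a small parser. A sketch (the `/api/assistant/chat` path matches the route in this guide; the rest is illustrative):

```typescript
// Parse one SSE line of the form "data: {...}" into an event object,
// or return null for blank lines and non-data fields.
export function parseSSELine(line: string): Record<string, unknown> | null {
  if (!line.startsWith('data: ')) return null;
  try {
    return JSON.parse(line.slice('data: '.length));
  } catch {
    return null;
  }
}

// Consume the streaming chat route, invoking a callback per event.
export async function streamChat(
  message: string,
  threadId: string | undefined,
  onEvent: (event: Record<string, unknown>) => void
) {
  const response = await fetch('/api/assistant/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message, threadId }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Keep any trailing partial line buffered until the next chunk arrives
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? '';

    for (const line of lines) {
      const event = parseSSELine(line);
      if (event) onEvent(event);
    }
  }
}
```

For a simpler client you could use the browser's `EventSource`, but it only supports GET, so a POST body requires the manual reader shown here.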

When to Use This Prompt

This prompt is ideal for developers working on:

  • OpenAI Assistants API applications requiring modern best practices and solid performance
  • Projects that need production-ready OpenAI integration code with proper error handling
  • Teams looking to standardize their OpenAI development workflow
  • Developers wanting to learn industry-standard OpenAI Assistants API patterns and techniques

By using this prompt, you can save hours of manual coding and ensure best practices are followed from the start. It is particularly valuable for teams that want to maintain consistency across their OpenAI integrations.

How to Use

  1. Copy the prompt - Click the copy button above to copy the entire prompt to your clipboard
  2. Paste into your AI assistant - Use with Claude, ChatGPT, Cursor, or any AI coding tool
  3. Customize as needed - Adjust the prompt based on your specific requirements
  4. Review the output - Always review generated code for security and correctness
💡 Pro Tip: For best results, provide context about your project structure and any specific constraints or preferences you have.

Best Practices

  • ✓ Always review generated code for security vulnerabilities before deploying
  • ✓ Test the generated code in a development environment first
  • ✓ Customize the prompt output to match your project's coding standards
  • ✓ Keep your AI assistant's context window in mind for complex requirements
  • ✓ Version control your prompts alongside your code for reproducibility

Frequently Asked Questions

Can I use this OpenAI prompt commercially?

Yes! All prompts on Antigravity AI Directory are free to use for both personal and commercial projects. No attribution required, though it's always appreciated.

Which AI assistants work best with this prompt?

This prompt works excellently with Claude, ChatGPT, Cursor, GitHub Copilot, and other modern AI coding assistants. For best results, use models with large context windows.

How do I customize this prompt for my specific needs?

You can modify the prompt by adding specific requirements, constraints, or preferences. For OpenAI projects, consider mentioning your framework version, coding style, and any specific libraries you're using.
