
LiteLLM Gateway MCP Server

Call 100+ LLM APIs via OpenAI format

litellm · gateway · multi-provider · cost-tracking

About

## LiteLLM Gateway MCP Server: Universal LLM Proxy

The **LiteLLM Gateway MCP Server** integrates LiteLLM's unified API gateway into Google Antigravity. The proxy exposes a single OpenAI-compatible interface to 100+ LLM providers, simplifying multi-model applications with load balancing, fallbacks, and cost tracking.

### Why LiteLLM MCP?

LiteLLM unifies LLM access:

- **100+ Providers**: OpenAI, Anthropic, Azure, and more
- **OpenAI Compatible**: Same API for all providers
- **Load Balancing**: Distribute requests across models
- **Fallbacks**: Auto-retry on failures
- **Cost Tracking**: Monitor spending per key (see the cost sketch after this section)

### Key Features

#### 1. Unified API

```python
from litellm import completion

# Same API works for any provider
response = completion(
    model="gpt-4-turbo",  # or claude-3-opus, gemini-pro, etc.
    messages=[{"role": "user", "content": "Hello!"}]
)

# Switch models without code changes
response = completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Hello!"}]
)
```

#### 2. Load Balancing

```python
from litellm import Router

router = Router(
    model_list=[
        {"model_name": "gpt-4", "litellm_params": {"model": "gpt-4-turbo"}},
        {"model_name": "gpt-4", "litellm_params": {"model": "azure/gpt-4"}},
    ],
    routing_strategy="least-busy"
)

prompt = "Hello!"  # placeholder prompt
response = router.completion(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}]
)
```

#### 3. Fallbacks

```python
from litellm import completion

prompt = "Hello!"  # placeholder prompt
response = completion(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": prompt}],
    fallbacks=["claude-3-opus", "gemini-pro"],
    num_retries=3
)
```

### Configuration

```json
{
  "mcpServers": {
    "litellm-gateway": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-litellm"],
      "env": {
        "OPENAI_API_KEY": "your-key",
        "ANTHROPIC_API_KEY": "your-key",
        "LITELLM_MASTER_KEY": "sk-master"
      }
    }
  }
}
```

### Use Cases

**Multi-Provider**: Use the best model for each task without API changes.

**Reliability**: Automatic failover ensures consistent uptime.

**Cost Optimization**: Route to cheaper models when appropriate.

The LiteLLM Gateway MCP Server provides unified LLM access for Antigravity.
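
As a complement to the cost-tracking feature above, per-call spend can be estimated client-side. This is a minimal sketch, assuming litellm's `completion_cost` helper, which applies the library's per-model pricing table to the token usage recorded on a response; the model name and prompt are placeholders:

```python
from litellm import completion, completion_cost

response = completion(
    model="gpt-4-turbo",  # placeholder model
    messages=[{"role": "user", "content": "Hello!"}]
)

# completion_cost reads the token usage on the response and applies
# litellm's per-model pricing table to estimate spend in USD
cost = completion_cost(completion_response=response)
print(f"Estimated cost: ${cost:.6f}")
```

Per-key spend reports, as opposed to per-call estimates, are handled by the gateway itself once requests flow through the proxy.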

Installation

Configuration
{
  "mcpServers": {
    "litellm-gateway": {
      "command": "uvx",
      "args": [
        "litellm-mcp"
      ],
      "env": {
        "LITELLM_API_KEY": "YOUR_KEY"
      }
    }
  }
}
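
With the server entry above in place, the gateway can also be exercised outside Antigravity with any OpenAI-compatible client. This is a minimal sketch, assuming a LiteLLM proxy listening on its default local port (4000) and authenticating with a master key; both values are assumptions, not part of the configuration above:

```python
from openai import OpenAI

# base_url and api_key are assumptions: LiteLLM's proxy defaults to
# http://localhost:4000 and accepts the gateway's master key
client = OpenAI(
    api_key="sk-master",
    base_url="http://localhost:4000"
)

response = client.chat.completions.create(
    model="gpt-4-turbo",  # any model routed by the gateway
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
```

Because the proxy speaks the OpenAI wire format, the same client code works regardless of which upstream provider the gateway routes to.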

How to Use

  1. 100+ LLM providers supported
  2. Cost tracking and guardrails (see the budget sketch below)
  3. OpenAI-compatible format
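
For the guardrails mentioned in item 2, a process-level budget cap is one option. This is a minimal sketch, assuming litellm's `max_budget` setting and `BudgetExceededError` exception; the dollar amount is an arbitrary placeholder:

```python
import litellm
from litellm import completion, BudgetExceededError

# Cap total spend for this process; calls beyond the cap raise an error
litellm.max_budget = 0.05  # USD, placeholder value

try:
    response = completion(
        model="gpt-4-turbo",  # placeholder model
        messages=[{"role": "user", "content": "Summarize MCP in one sentence."}]
    )
except BudgetExceededError as err:
    print(f"Budget exceeded: {err}")
```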

Related MCP Servers

🧰 Toolhouse MCP

Universal AI tool platform that equips your AI with production-ready capabilities. Execute code, browse the web, manage files, send emails, and more through a unified MCP interface.

🔨 Smithery Registry MCP

The MCP server registry and discovery platform. Browse, search, and install MCP servers from the community. Find the perfect integrations for your AI development workflow.

🔍 MCP Inspector

Official debugging and testing tool for MCP servers. Inspect server capabilities, test tool calls, validate responses, and debug protocol communication in real-time.
