Call 100+ LLM APIs via OpenAI format
## LiteLLM Gateway MCP Server: Universal LLM Proxy

The **LiteLLM Gateway MCP Server** integrates LiteLLM's unified API gateway into Google Antigravity. The proxy exposes a single OpenAI-compatible interface to 100+ LLM providers, simplifying multi-model applications with load balancing, fallbacks, and cost tracking.

### Why LiteLLM MCP?

LiteLLM unifies LLM access:

- **100+ Providers**: OpenAI, Anthropic, Azure, and more
- **OpenAI Compatible**: Same API for all providers
- **Load Balancing**: Distribute requests across models
- **Fallbacks**: Auto-retry on failures
- **Cost Tracking**: Monitor spending per key

### Key Features

#### 1. Unified API

```python
from litellm import completion

# The same API works for any provider
response = completion(
    model="gpt-4-turbo",  # or claude-3-opus, gemini-pro, etc.
    messages=[{"role": "user", "content": "Hello!"}]
)

# Switch models without code changes
response = completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Hello!"}]
)
```

#### 2. Load Balancing

```python
from litellm import Router

router = Router(
    model_list=[
        {"model_name": "gpt-4", "litellm_params": {"model": "gpt-4-turbo"}},
        {"model_name": "gpt-4", "litellm_params": {"model": "azure/gpt-4"}},
    ],
    routing_strategy="least-busy"
)

prompt = "Summarize the latest deployment logs."
response = router.completion(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}]
)
```

#### 3. Fallbacks

```python
# Reuses `prompt` from the example above
response = completion(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": prompt}],
    fallbacks=["claude-3-opus", "gemini-pro"],
    num_retries=3
)
```

### Configuration

```json
{
  "mcpServers": {
    "litellm-gateway": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-litellm"],
      "env": {
        "OPENAI_API_KEY": "your-key",
        "ANTHROPIC_API_KEY": "your-key",
        "LITELLM_MASTER_KEY": "sk-master"
      }
    }
  }
}
```

### Use Cases

**Multi-Provider**: Use the best model for each task without API changes.

**Reliability**: Automatic failover ensures consistent uptime.

**Cost Optimization**: Route requests to cheaper models when appropriate.

The LiteLLM Gateway MCP Server provides unified LLM access for Antigravity.
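Because the gateway speaks the OpenAI wire format, any OpenAI-compatible client can also talk to it directly. The sketch below is illustrative, not part of the server itself: it assumes a LiteLLM proxy reachable at its default local address (`http://localhost:4000`) and reuses the `sk-master` key from the configuration above; adjust both for your deployment.

```python
from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM gateway instead of api.openai.com.
# Base URL and key are assumptions -- change them to match your proxy deployment.
client = OpenAI(
    base_url="http://localhost:4000",
    api_key="sk-master",  # LITELLM_MASTER_KEY from the configuration above
)

response = client.chat.completions.create(
    model="gpt-4",  # any model name the gateway routes
    messages=[{"role": "user", "content": "Hello from Antigravity!"}],
)
print(response.choices[0].message.content)
```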
An alternative configuration launches the server with `uvx` instead of `npx`:

```json
{
  "mcpServers": {
    "litellm": {
      "command": "uvx",
      "args": ["litellm-mcp"],
      "env": {
        "LITELLM_API_KEY": "YOUR_KEY"
      }
    }
  }
}
```
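For the per-key cost tracking mentioned in the feature list, spend can also be inspected client-side. A minimal sketch using litellm's `completion_cost` helper; the model and prompt are illustrative:

```python
from litellm import completion, completion_cost

# Any provider/model supported by LiteLLM works here
response = completion(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)

# completion_cost derives the USD cost from the response's token usage
cost = completion_cost(completion_response=response)
print(f"Request cost: ${cost:.6f}")
```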