Google Antigravity Directory

The #1 directory for Google Antigravity prompts, rules, workflows & MCP servers. Optimized for Gemini 3 agentic development.


© 2026 Antigravity AI Directory. All rights reserved.


This website is not affiliated with, endorsed by, or associated with Google LLC. "Google" and "Gemini" are trademarks of Google LLC.


Ollama Local LLM MCP Server

Run local LLMs with Ollama

Tags: ollama, local-llm, privacy

About

## Ollama MCP Server: Local LLM Inference

The **Ollama MCP Server** integrates Ollama's local model runner into Google Antigravity. It lets you run large language models entirely on your own machine, providing privacy, offline capability, and zero API costs.

### Why Ollama MCP?

Ollama makes local AI accessible:

- **Privacy First**: Data never leaves your machine
- **Offline Ready**: Work without an internet connection
- **Zero Cost**: No API fees after the initial model download
- **Easy Setup**: One command to get started
- **Model Variety**: Llama, Mistral, Gemma, and more

### Key Features

#### 1. Model Management

```bash
# Download models
ollama pull llama3.1:70b
ollama pull codellama:34b
ollama pull mistral:7b

# List installed models
ollama list

# Run an interactive chat
ollama run llama3.1
```

#### 2. API Access

```python
import ollama

# Generate a completion
response = ollama.generate(
    model="llama3.1",
    prompt="Explain microservices architecture"
)
print(response["response"])

# Chat interface
response = ollama.chat(
    model="llama3.1",
    messages=[
        {"role": "user", "content": "Write a Python quicksort function"}
    ]
)
```

#### 3. Streaming

```python
# Stream responses for better UX
prompt = "Explain microservices architecture"

stream = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": prompt}],
    stream=True
)

for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
```

### Configuration

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-ollama"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434",
        "OLLAMA_MODELS": "~/.ollama/models"
      }
    }
  }
}
```

### Use Cases

**Private Development**: Work with AI without sending code to external services.

**Offline Work**: Continue AI-assisted development without an internet connection.

**Cost Control**: Eliminate API costs for high-volume usage.

The Ollama MCP Server brings private, local AI to Antigravity development.
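The streaming example above uses the `ollama` Python client, which under the hood talks to Ollama's local REST API and receives newline-delimited JSON chunks. A minimal sketch of how such a stream could be reassembled without the client library (the `assemble_stream` helper and the sample chunk shapes are illustrative assumptions, not part of any official API):

```python
import json

def assemble_stream(ndjson_lines):
    """Join the "response" fields of newline-delimited JSON stream
    chunks into one string (hypothetical helper; chunk shape assumed)."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):  # final chunk signals end of stream
            break
    return "".join(parts)

# Sample chunks in the assumed streamed-response shape
sample = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world", "done": false}',
    '{"response": "!", "done": true}',
]
print(assemble_stream(sample))  # Hello, world!
```

Buffering chunks like this is useful when you want the full completion (e.g. to write to a file) but still want the early-cancellation behavior streaming provides.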

Installation

Configuration
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": [
        "-y",
        "ollama-mcp"
      ],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
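A quick way to catch config mistakes (such as accidentally nesting `mcpServers` inside a server entry) before Antigravity loads the file is to check the JSON shape programmatically. A minimal sketch, assuming only the key names used above; the validator itself is hypothetical:

```python
import json

REQUIRED_KEYS = {"command", "args"}  # per-server fields used in the config above

def valid_servers(text):
    """Parse an MCP config and return the names of server entries
    that contain the required fields (hypothetical helper)."""
    cfg = json.loads(text)  # raises ValueError on malformed JSON
    servers = cfg.get("mcpServers", {})
    return [name for name, entry in servers.items()
            if REQUIRED_KEYS <= set(entry)]

config = '''
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"],
      "env": {"OLLAMA_HOST": "http://localhost:11434"}
    }
  }
}
'''
print(valid_servers(config))  # ['ollama']
```

An entry that nests a second `mcpServers` block instead of `command`/`args` would simply not appear in the returned list, flagging the mistake.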

How to Use

  1. Install Ollama locally
  2. Download models with `ollama pull`
  3. No API key needed
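After step 2, `ollama list` prints a table of installed models. A hedged sketch of extracting model names from that tabular output, e.g. to verify a required model is present before starting the server (the exact column layout shown is illustrative):

```python
def installed_models(list_output):
    """Parse model names from `ollama list`-style output: a header
    row followed by one model per line (column layout illustrative)."""
    lines = list_output.strip().splitlines()
    return [line.split()[0] for line in lines[1:] if line.strip()]

# Sample output in the assumed layout
sample = """NAME            ID          SIZE    MODIFIED
llama3.1:70b    a1b2c3d4    39 GB   2 days ago
mistral:7b      e5f6a7b8    4.1 GB  5 hours ago"""

print(installed_models(sample))  # ['llama3.1:70b', 'mistral:7b']
```

In a setup script you might pair this with `subprocess.run(["ollama", "list"], capture_output=True)` and pull a missing model automatically.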

Related MCP Servers

🧰

Toolhouse MCP

Universal AI tool platform that equips your AI with production-ready capabilities. Execute code, browse the web, manage files, send emails, and more through a unified MCP interface.

🔨

Smithery Registry MCP

The MCP server registry and discovery platform. Browse, search, and install MCP servers from the community. Find the perfect integrations for your AI development workflow.

🔍

MCP Inspector

Official debugging and testing tool for MCP servers. Inspect server capabilities, test tool calls, validate responses, and debug protocol communication in real-time.
