Run local LLMs with Ollama
## Ollama MCP Server: Local LLM Inference

The **Ollama MCP Server** integrates Ollama's local model runner into Google Antigravity. This tool enables running large language models locally on your machine, providing privacy, offline capability, and zero API costs.

### Why Ollama MCP?

Ollama makes local AI accessible:

- **Privacy First**: Data never leaves your machine
- **Offline Ready**: Work without an internet connection
- **Zero Cost**: No API fees after download
- **Easy Setup**: One command to get started
- **Model Variety**: Llama, Mistral, Gemma, and more

### Key Features

#### 1. Model Management

```bash
# Download models
ollama pull llama3.1:70b
ollama pull codellama:34b
ollama pull mistral:7b

# List installed models
ollama list

# Run an interactive chat
ollama run llama3.1
```

#### 2. API Access

```python
import ollama

# Generate a completion
response = ollama.generate(
    model="llama3.1",
    prompt="Explain microservices architecture"
)
print(response["response"])

# Chat interface
response = ollama.chat(
    model="llama3.1",
    messages=[
        {"role": "user", "content": "Write a Python quicksort function"}
    ]
)
print(response["message"]["content"])
```

#### 3. Streaming

```python
import ollama

# Stream responses for better UX
prompt = "Explain microservices architecture"
stream = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": prompt}],
    stream=True
)

for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
```

### Configuration

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-ollama"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434",
        "OLLAMA_MODELS": "~/.ollama/models"
      }
    }
  }
}
```

### Use Cases

**Private Development**: Work with AI without sending code to external services.

**Offline Work**: Continue AI-assisted development without internet access.

**Cost Control**: Eliminate API costs for high-volume usage.

The Ollama MCP Server brings private, local AI to Antigravity development.
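The chat examples above run with Ollama's default sampling settings; per-request behavior can be tuned through the `options` field of the same API. A minimal sketch, assuming the `ollama` Python package used above; the model name, prompts, and option values are illustrative, not recommendations:

```python
import ollama

# Hypothetical code-review prompt; any system/user pair works the same way.
response = ollama.chat(
    model="llama3.1",
    messages=[
        {"role": "system", "content": "You are a concise code reviewer."},
        {"role": "user", "content": "Review this function for bugs: ..."},
    ],
    options={
        "temperature": 0.2,  # lower values make output more deterministic
        "num_ctx": 8192,     # context window size, in tokens
    },
)
print(response["message"]["content"])
```

Because everything runs locally, experimenting with these knobs costs nothing but compute time.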
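After wiring up the configuration, it helps to confirm that the daemon behind `OLLAMA_HOST` is actually reachable. A quick sanity check using only the Python standard library, assuming the default host from the configuration above:

```python
import json
import urllib.request

# GET /api/tags lists the models installed in the local Ollama daemon.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = [m["name"] for m in json.load(resp)["models"]]

print("Installed models:", ", ".join(models) if models else "(none)")
```

If this fails with a connection error, start the daemon with `ollama serve` (or launch the desktop app) and retry.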