Hugging Face MCP Server

Official Hugging Face MCP server

huggingface, ml, models, datasets, ai

About

## Hugging Face MCP Server: ML Model Hub

The **Hugging Face MCP Server** integrates the world's largest ML model repository into Google Antigravity. This platform hosts hundreds of thousands of models, datasets, and Spaces, making it the go-to destination for machine learning resources.

### Why Hugging Face MCP?

Hugging Face is the GitHub of ML:

- **Massive Repository**: 500K+ models available
- **Transformers**: Industry-standard library
- **Datasets**: Thousands of datasets ready to use
- **Spaces**: Host ML demos instantly
- **Community**: Vibrant ML community

### Key Features

#### 1. Model Inference

```python
from huggingface_hub import InferenceClient

client = InferenceClient(token="your-token")

# Text generation
response = client.text_generation(
    "Explain quantum computing:",
    model="meta-llama/Llama-3.1-70B-Instruct",
    max_new_tokens=500,
)

# Embeddings
embeddings = client.feature_extraction(
    "Machine learning is fascinating",
    model="sentence-transformers/all-MiniLM-L6-v2",
)
```

#### 2. Model Downloads

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download and cache models locally
model_name = "microsoft/DialoGPT-medium"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Use locally
inputs = tokenizer.encode("Hello, how are you?", return_tensors="pt")
outputs = model.generate(
    inputs,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,  # DialoGPT has no pad token
)
```

#### 3. Dataset Access

```python
from datasets import load_dataset

# Load datasets easily
dataset = load_dataset("squad", split="train")

for example in dataset.select(range(5)):
    print(f"Q: {example['question']}")
    print(f"A: {example['answers']['text'][0]}")
```

### Configuration

```json
{
  "mcpServers": {
    "huggingface": {
      "command": "uvx",
      "args": ["hf-mcp-server"],
      "env": {
        "HF_TOKEN": "hf_your-token",
        "HF_HOME": "~/.cache/huggingface"
      }
    }
  }
}
```

### Use Cases

**Model Discovery**: Find and test models for any ML task.

**Rapid Prototyping**: Download pre-trained models for quick experimentation.

**Dataset Preparation**: Access and preprocess datasets for training.

The Hugging Face MCP Server brings the ML ecosystem to Antigravity.
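The Model Discovery use case above can be reproduced directly with the `huggingface_hub` search API. Below is a minimal sketch, assuming the `huggingface_hub` package is installed; the query strings are illustrative placeholders, and a token is only needed for private or gated repositories:

```python
from huggingface_hub import HfApi

api = HfApi()  # pass token="hf_your-token" for private or gated repos

# Find popular models matching a query (placeholder search term)
for model in api.list_models(search="llama", sort="downloads", direction=-1, limit=5):
    print(model.id)

# The same pattern works for datasets
for dataset in api.list_datasets(search="question answering", limit=5):
    print(dataset.id)
```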

Installation

Configuration
{
  "mcpServers": {
    "huggingface": {
      "command": "uvx",
      "args": [
        "hf-mcp-server"
      ],
      "env": {
        "HF_TOKEN": "YOUR_HUGGINGFACE_TOKEN"
      }
    }
  }
}
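Outside of Antigravity, the same configuration can be exercised with the official `mcp` Python SDK to confirm that the server starts and to see which tools it exposes. A minimal sketch, assuming the `mcp` package is installed, `uvx` is on your PATH, and `HF_TOKEN` is exported in your shell; the tool list depends on the server version:

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirror the Antigravity configuration above; HF_TOKEN (and PATH) are
# inherited from the current environment, so export the token first.
server_params = StdioServerParameters(
    command="uvx",
    args=["hf-mcp-server"],
    env=dict(os.environ),
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())
```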

How to Use

  1. Get a token from huggingface.co/settings/tokens
  2. Search models, datasets, Spaces, and papers
  3. Run community tools via Gradio Spaces (see the sketch below)
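For step 3, community Spaces that expose a Gradio API can also be called directly with the `gradio_client` package. A minimal sketch; the Space name, arguments, and endpoint are placeholders to replace with a real Space:

```python
from gradio_client import Client

# Connect to a community Space (placeholder name; substitute a real Space)
client = Client("some-user/some-demo-space")

# Inspect the endpoints the Space exposes
client.view_api()

# Call an endpoint (arguments and api_name depend on the Space)
result = client.predict("Hello from Antigravity", api_name="/predict")
print(result)
```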

Related MCP Servers

🧰 Toolhouse MCP

Universal AI tool platform that equips your AI with production-ready capabilities. Execute code, browse the web, manage files, send emails, and more through a unified MCP interface.

🔨 Smithery Registry MCP

The MCP server registry and discovery platform. Browse, search, and install MCP servers from the community. Find the perfect integrations for your AI development workflow.

🔍 MCP Inspector

Official debugging and testing tool for MCP servers. Inspect server capabilities, test tool calls, validate responses, and debug protocol communication in real-time.
