# Official Hugging Face MCP Server
## Hugging Face MCP Server: ML Model Hub

The **Hugging Face MCP Server** integrates the world's largest ML model repository into Google Antigravity. The platform hosts hundreds of thousands of models, datasets, and Spaces, making it the go-to destination for machine learning resources.

### Why Hugging Face MCP?

Hugging Face is the GitHub of ML:

- **Massive Repository**: 500K+ models available
- **Transformers**: Industry-standard library
- **Datasets**: Thousands of datasets ready to use
- **Spaces**: Host ML demos instantly
- **Community**: Vibrant ML community

### Key Features

#### 1. Model Inference

```python
from huggingface_hub import InferenceClient

client = InferenceClient(token="your-token")

# Text generation
response = client.text_generation(
    model="meta-llama/Llama-3.1-70B-Instruct",
    prompt="Explain quantum computing:",
    max_new_tokens=500
)

# Embeddings
embeddings = client.feature_extraction(
    "Machine learning is fascinating",
    model="sentence-transformers/all-MiniLM-L6-v2"
)
```

#### 2. Model Downloads

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download and cache models locally
model_name = "microsoft/DialoGPT-medium"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Use locally
inputs = tokenizer.encode("Hello, how are you?", return_tensors="pt")
outputs = model.generate(inputs, max_length=100)
```

#### 3. Dataset Access

```python
from datasets import load_dataset

# Load datasets easily
dataset = load_dataset("squad", split="train")

for example in dataset.take(5):
    print(f"Q: {example['question']}")
    print(f"A: {example['answers']['text'][0]}")
```

### Configuration

Add the official server (run via `uvx`) to your Antigravity MCP configuration:

```json
{
  "mcpServers": {
    "huggingface": {
      "command": "uvx",
      "args": ["hf-mcp-server"],
      "env": {
        "HF_TOKEN": "hf_your-token",
        "HF_HOME": "~/.cache/huggingface"
      }
    }
  }
}
```

### Use Cases

**Model Discovery**: Find and test models for any ML task (see the discovery sketch below).

**Rapid Prototyping**: Download pre-trained models for quick experimentation.

**Dataset Preparation**: Access and preprocess datasets for training.

The Hugging Face MCP Server brings the ML ecosystem to Antigravity.
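For the model discovery use case, the Hub can also be queried directly from Python. The snippet below is a minimal sketch using `huggingface_hub`'s `HfApi.list_models` (a library call, not an MCP tool); the `text-classification` task filter and the limit of five results are illustrative choices, not requirements.

```python
from huggingface_hub import HfApi

api = HfApi()  # anonymous access works for public models; pass token=... for gated/private repos

# Ask the Hub for the most-downloaded models for a given task.
# The task and limit values here are illustrative.
for model in api.list_models(task="text-classification", sort="downloads", limit=5):
    print(model.id)
```

Inside Antigravity the agent would normally route this kind of lookup through the MCP server's tools, but a small script like this is handy for quick checks outside the editor.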