Data framework for LLM applications.
## LlamaIndex MCP Server: Data Framework for LLM Applications

The **LlamaIndex MCP Server** integrates the leading data framework for LLM applications into Google Antigravity. LlamaIndex excels at connecting LLMs to any data source, making it essential for building production RAG and agent applications.

### Why LlamaIndex MCP?

LlamaIndex specializes in data-aware LLM applications:

- **Data Connectors**: 100+ integrations for data sources
- **Advanced Indexing**: Multiple index types for different use cases
- **Query Engines**: Sophisticated retrieval strategies
- **Agent Framework**: Build autonomous AI agents
- **Antigravity Native**: AI-assisted data integration

### Key Features

#### 1. Simple Document Indexing

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Load documents from the ./data directory
documents = SimpleDirectoryReader("./data").load_data()

# Build a searchable vector index over the documents
index = VectorStoreIndex.from_documents(documents)

# Query the index
query_engine = index.as_query_engine()
response = query_engine.query("What are the main topics?")
print(response)
```

#### 2. Advanced Retrieval

```python
from llama_index.core.retrievers import VectorIndexRetriever
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.postprocessor import SimilarityPostprocessor

# Retrieve the top 10 candidates, then drop nodes below the similarity cutoff
retriever = VectorIndexRetriever(index=index, similarity_top_k=10)
postprocessor = SimilarityPostprocessor(similarity_cutoff=0.7)

query_engine = RetrieverQueryEngine(
    retriever=retriever,
    node_postprocessors=[postprocessor],
)
```

#### 3. Agent Building

```python
from llama_index.agent.openai import OpenAIAgent
from llama_index.core.tools import QueryEngineTool

# Wrap existing query engines (docs_engine and api_engine, built earlier) as tools
tools = [
    QueryEngineTool.from_defaults(
        query_engine=docs_engine,
        name="documentation",
        description="Search product documentation",
    ),
    QueryEngineTool.from_defaults(
        query_engine=api_engine,
        name="api_reference",
        description="Search API reference",
    ),
]

# Build an agent that can call the tools to answer questions
agent = OpenAIAgent.from_tools(tools, verbose=True)
response = agent.chat("How do I authenticate API requests?")
```

### Configuration

```json
{
  "mcpServers": {
    "llamaindex": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-llamaindex"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "LLAMA_CLOUD_API_KEY": "your-llamacloud-key"
      }
    }
  }
}
```

### Use Cases

**Knowledge Bases**: Build question-answering systems over private documents and databases.

**Data Agents**: Create agents that can query multiple data sources to answer complex questions.

**Semantic Search**: Implement advanced search over unstructured data with hybrid retrieval.

The LlamaIndex MCP Server enables sophisticated data-aware LLM applications in Antigravity.
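To make the `similarity_cutoff=0.7` in the Advanced Retrieval example concrete: the postprocessor drops retrieved nodes whose cosine similarity to the query embedding falls below the threshold. A minimal stdlib sketch of that filtering step (the `filter_by_similarity` helper and `(text, embedding)` node shape are illustrative assumptions, not LlamaIndex API):

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def filter_by_similarity(query_vec, nodes, cutoff=0.7):
    # nodes: list of (text, embedding) pairs; keep only those at or above the cutoff
    scored = [(text, cosine_similarity(query_vec, emb)) for text, emb in nodes]
    return [(text, score) for text, score in scored if score >= cutoff]
```

Raising the cutoff trades recall for precision: fewer, more relevant nodes reach the LLM.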
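Conceptually, the agent in the Agent Building example routes each question to whichever named tool fits, which is why each `QueryEngineTool` carries a `name` and `description`. A toy stdlib sketch of name-based tool dispatch (the `Tool` dataclass and `dispatch` function are hypothetical, not the LlamaIndex agent internals):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[str], str]

def dispatch(tools, tool_name, query):
    # Look up the tool the model selected by name and run the query against it
    registry = {t.name: t for t in tools}
    return registry[tool_name].run(query)
```

In the real agent, the LLM chooses the tool name based on the descriptions; here the choice is passed in directly.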