★ 2 stars
Python
🤖 AI/LLM
Updated 1d ago
MCP RAG server — local embeddings, your docs never leave your machine. Private knowledge base + web search for Claude, Cursor, and Ollama. Drop your docs, connect your AI client, done.
View on GitHub →
Quick Install
Copy the config for your editor. Some servers may need additional setup — check the README.
Claude Desktop
Claude Code
Cursor
Add to claude_desktop_config.json:
{
  "mcpServers": {
    "mcp-rag-server": {
      "command": "uvx",
      "args": ["mcp-rag-server"]
    }
  }
}
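If you already have other servers registered, merge the entry instead of pasting the whole file. A minimal sketch with the standard library, assuming a config file in the current directory (on macOS the real file usually lives under `~/Library/Application Support/Claude/claude_desktop_config.json`):

```python
import json
from pathlib import Path

# Hypothetical path for illustration; point this at your real
# claude_desktop_config.json.
config_path = Path("claude_desktop_config.json")

# Load the existing config (or start fresh) so other servers are preserved.
config = json.loads(config_path.read_text()) if config_path.exists() else {}

# Merge in the mcp-rag-server entry from the snippet above.
config.setdefault("mcpServers", {})["mcp-rag-server"] = {
    "command": "uvx",
    "args": ["mcp-rag-server"],
}

config_path.write_text(json.dumps(config, indent=2))
```

Using `setdefault` keeps any servers already listed under `mcpServers` intact.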
Run in terminal:
claude mcp add mcp-rag-server uvx mcp-rag-server
Add to .cursor/mcp.json:
{
  "mcpServers": {
    "mcp-rag-server": {
      "command": "uvx",
      "args": ["mcp-rag-server"]
    }
  }
}
Or install with pip: pip install mcp-rag-server
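Whichever client you use, the config above just tells it how to launch the server: the client spawns `uvx mcp-rag-server` and speaks JSON-RPC 2.0 over the process's stdin/stdout. A sketch of the first message a client sends over that pipe, with illustrative field values not taken from this project:

```python
import json

# The MCP handshake starts with an `initialize` request; clientInfo and
# protocolVersion here are placeholders, not values from mcp-rag-server.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

# Over the stdio transport, messages travel as newline-delimited JSON.
wire_message = json.dumps(initialize) + "\n"
```

This is why "drop your docs, connect your AI client, done" works across Claude, Cursor, and Ollama: every client uses the same launch-and-handshake contract.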
Topics
claude cursor knowledge-base llm local-embeddings local-llm mcp mcp-server ollama privacy python qdrant rag retrieval-augmented-generation self-hosted