mcp_ollama_docker

rishivardhan/mcp_ollama_docker
This is a **local Docker-based AI file assistant** that combines:

- **Ollama** (local LLM runtime) running `qwen2.5-coder:3b`
- **FastAPI** server exposing file tools and AI capabilities
- **MCP-style tools** for reading, searching, editing, and managing files
- **Internet search integration** for fresh information
- **Safety mechanisms**
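To make the "MCP-style tools" idea concrete, here is a minimal sketch of what file read/search tools of this kind typically look like. The function names and signatures below are illustrative assumptions, not this project's actual API:

```python
# Illustrative sketch only: minimal "MCP-style" file tools similar in
# spirit to the read/search tools this server exposes. Names and
# signatures are assumptions, not the project's real interface.
from pathlib import Path

def read_file(path: str, max_bytes: int = 65536) -> str:
    """Return a file's text, capped so huge files don't flood the model."""
    data = Path(path).read_bytes()[:max_bytes]
    return data.decode("utf-8", errors="replace")

def search_files(root: str, needle: str) -> list[str]:
    """Return paths of files under `root` whose contents contain `needle`."""
    hits = []
    for p in Path(root).rglob("*"):
        if p.is_file():
            try:
                if needle in p.read_text(errors="ignore"):
                    hits.append(str(p))
            except OSError:
                continue  # skip unreadable files rather than failing
    return hits
```

In a real MCP server these functions would be registered as tools with JSON schemas so the model can call them; the safety layer would additionally restrict which paths are reachable.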

Quick Install

Copy the config for your editor. Some servers may need additional setup — check the README.

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "mcp_ollama_docker": {
      "command": "uvx",
      "args": [
        "mcp-ollama-docker"
      ]
    }
  }
}

Or install with pip: pip install mcp-ollama-docker
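Once installed, the server talks to a locally running Ollama instance. The sketch below shows how such a server might call Ollama's standard `/api/generate` HTTP endpoint (served at `http://localhost:11434` by default); the helper functions are illustrative, not this project's code:

```python
# Sketch of calling the local Ollama API. /api/generate is Ollama's
# standard endpoint; the wrapper functions here are assumptions made
# for illustration, not code from this repository.
import json
import urllib.request

def build_generate_request(prompt: str, model: str = "qwen2.5-coder:3b") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str, host: str = "http://localhost:11434") -> str:
    """Send a prompt to a locally running Ollama and return the reply text."""
    body = json.dumps(build_generate_request(prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `stream` set to `False`, Ollama returns a single JSON object whose `response` field holds the full completion, which keeps the client simple.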