mcp-client-for-ollama

jonigl/mcp-client-for-ollama
★ 563 stars · Python · AI/LLM · Updated 1 month ago
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop confirmation, thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.

Quick Install

Copy the config for your editor. Some servers may need additional setup — check the README.

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "mcp-client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}
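Before pasting the snippet into claude_desktop_config.json, it can be worth sanity-checking that the merged file is still valid JSON and that the entry has the expected shape. A minimal check (the structure below mirrors the snippet above; nothing beyond it is assumed):

```python
import json

# The config fragment from above, as it would appear in claude_desktop_config.json
config_text = """
{
  "mcpServers": {
    "mcp-client-for-ollama": {
      "command": "uvx",
      "args": ["mcp-client-for-ollama"]
    }
  }
}
"""

# json.loads raises ValueError on malformed JSON (e.g. a trailing comma)
parsed = json.loads(config_text)

# Verify the entry points at the uvx launcher with the package name as its argument
entry = parsed["mcpServers"]["mcp-client-for-ollama"]
assert entry["command"] == "uvx"
assert entry["args"] == ["mcp-client-for-ollama"]
```

A trailing comma after the last entry is the most common reason Claude Desktop silently ignores a hand-edited config, and this check catches it immediately.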

Or install with pip: pip install mcp-client-for-ollama

Topics

agentic-ai, ai, command-line-tool, generative-ai, linux, llm, local-llm, macos, mcp, mcp-client, mcp-server, model-context-protocol, ollama, open-source, pypi-package