vscode-local-llm-

movieonlyemail4/vscode-local-llm-
Run local AI models in VS Code with automatic model detection, server startup, and a built-in MCP endpoint; no cloud service or manual setup required.

Quick Install

Copy the config for your editor. Some servers may need additional setup; check the README.

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "vscode-local-llm-": {
      "command": "npx",
      "args": [
        "-y",
        "movieonlyemail4/vscode-local-llm-"
      ]
    }
  }
}
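If you already have other servers in claude_desktop_config.json, the new entry has to be merged under the existing "mcpServers" key rather than overwriting it. A minimal Python sketch of that merge is below; the merge_server helper is hypothetical, and the entry mirrors the snippet above.

```python
import json

# The server entry from the config snippet above.
entry = {
    "vscode-local-llm-": {
        "command": "npx",
        "args": ["-y", "movieonlyemail4/vscode-local-llm-"],
    }
}

def merge_server(config: dict, servers: dict) -> dict:
    """Merge MCP server entries into a config without clobbering existing ones."""
    merged = dict(config)
    merged.setdefault("mcpServers", {})
    merged["mcpServers"] = {**merged["mcpServers"], **servers}
    return merged

# Example: an existing config with one server already registered.
existing = {"mcpServers": {"some-other-server": {"command": "uvx", "args": ["example"]}}}
updated = merge_server(existing, entry)
print(json.dumps(updated, indent=2))
```

After merging, both the existing server and vscode-local-llm- appear under "mcpServers"; restart Claude Desktop for the change to take effect.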

Topics

ai, chat, code, code-assistant, gguf, grammar-checker, llama, llama-cpp, llamacpp, llm, llm-server, mcp, ollama, ollama-api, open-source