token-proctor

navintkr/token-proctor
★ 0 stars TypeScript 💻 Code/Dev Tools Updated today
VS Code extension + MCP server that validates prompts, routes to the cheapest LLM, and projects token × turn cost before the call. Cuts Copilot premium-request burn on agent loops.
View on GitHub →

Quick Install

Copy the config for your editor. Some servers may need additional setup — check the README.

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "token-proctor": {
      "command": "npx",
      "args": [
        "-y",
        "navintkr/token-proctor"
      ]
    }
  }
}
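Since the project also ships as a VS Code extension, you may want the server in VS Code rather than Claude Desktop. Assuming the same `npx` launch works there, a `.vscode/mcp.json` entry would follow VS Code's MCP schema (which uses a `servers` key instead of `mcpServers`) — check the README for the project's official instructions:

```json
{
  "servers": {
    "token-proctor": {
      "command": "npx",
      "args": [
        "-y",
        "navintkr/token-proctor"
      ]
    }
  }
}
```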

README Excerpt

> Pick the right model. Validate the prompt. See the real cost — tokens **and** turns. Before you spend a premium request.
>
> [Watch the demo (MP4)](https://github.com/navintkr/token-proctor/raw/main/docs/token-proctor.mp4)

Tools (9)

analyze_prompt · balanced · estimate_cost · get_policy · list_models · recommend_model · redact_text · turns · validate_prompt
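The `estimate_cost` and `turns` tools hint at the server's core math: project cost as tokens per turn × expected turns × model price before spending a request. A minimal TypeScript sketch of that idea — the model names, prices, and function are illustrative assumptions, not the server's actual tool API or pricing data:

```typescript
// Illustrative price table: assumed USD per 1M input tokens,
// NOT token-proctor's real model catalog.
const pricePerMTokUSD: Record<string, number> = {
  "small-model": 0.15,
  "large-model": 3.0,
};

// Project total cost for an agent loop: tokens per turn × turns × unit price.
function projectCost(tokensPerTurn: number, turns: number, model: string): number {
  const price = pricePerMTokUSD[model];
  if (price === undefined) throw new Error(`unknown model: ${model}`);
  return (tokensPerTurn * turns * price) / 1_000_000;
}

// A 4,000-token prompt iterated over 10 agent turns on the large model:
console.log(projectCost(4000, 10, "large-model").toFixed(2)); // "0.12"
```

Seeing that the same loop costs 20× less on the small model is the kind of pre-call comparison `recommend_model` appears to automate.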

Topics

ai-agents · copilot · cost-optimization · github-copilot · github-copilot-cost · github-copilot-token-control · llm · llm-cost · mcp · model-routing · token-counting · vscode-extension