mcp-evals

mclenhard/mcp-evals
A Node.js package and GitHub Action for evaluating MCP (Model Context Protocol) tool implementations using LLM-based scoring. This helps ensure your MCP server's tools are working correctly and performing well.

Quick Install

Copy the configuration below into your editor's MCP settings. Some servers need additional setup (API keys, environment variables) — check the project README.

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "mcp-evals": {
      "command": "npx",
      "args": [
        "-y",
        "mclenhard/mcp-evals"
      ]
    }
  }
}
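The package's core idea is LLM-based scoring: a grading model rates each tool's output and explains its rating. As a rough illustration only — the names below are hypothetical and not the actual mcp-evals API (see the project README for the real interfaces) — an eval pairs a tool's output with a grader that returns a score and reasoning:

```typescript
// Hypothetical sketch of an LLM-scored eval; types and names are
// illustrative, not the real mcp-evals API.
interface EvalResult {
  score: number;     // e.g. 1-5 rating from the grading model
  reasoning: string; // why the grader assigned that score
}

// In the real package an LLM grades the tool output; this stub grader
// simply checks that the output mentions an expected keyword.
function gradeOutput(output: string, expectedKeyword: string): EvalResult {
  const found = output.toLowerCase().includes(expectedKeyword.toLowerCase());
  return {
    score: found ? 5 : 1,
    reasoning: found
      ? `Output mentions "${expectedKeyword}".`
      : `Output is missing "${expectedKeyword}".`,
  };
}

const result = gradeOutput("The weather in Paris is 18°C and sunny.", "Paris");
console.log(result.score); // 5
```

In the actual package, the grader is an LLM call rather than a keyword check, so scores reflect semantic quality of the tool's response, not exact string matches.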

Topics

ai, evals, mcp