bifrost

maximhq/bifrost
★ 2,919 stars · Go · 🤖 AI/LLM · Updated 1 mo ago
Fast enterprise AI gateway (benchmarked at 50x faster than LiteLLM) with an adaptive load balancer, cluster mode, guardrails, support for 1,000+ models, and <100 µs overhead at 5k RPS.

Quick Install

Copy the config for your editor. Some servers may need additional setup; check the README.

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "bifrost": {
      "command": "go",
      "args": [
        "run",
        "github.com/maximhq/bifrost@latest"
      ]
    }
  }
}

Topics

ai-gateway · gateway · gateway-services · generative-ai · guardrails · llm · llm-cost · llm-gateway · llm-observability · llmops · load-balancing · mcp-client · mcp-gateway · mcp-server · model-router