★ 2,919 stars
Go
🤖 AI/LLM
Updated 1mo ago
Fastest enterprise AI gateway (50x faster than LiteLLM), with an adaptive load balancer, cluster mode, guardrails, support for 1,000+ models, and <100 µs overhead at 5k RPS.
Quick Install
Copy the config for your editor. Some servers may need additional setup — check the README.
Claude Desktop
Add to claude_desktop_config.json:
{
  "mcpServers": {
    "bifrost": {
      "command": "go",
      "args": [
        "run",
        "github.com/maximhq/bifrost@latest"
      ]
    }
  }
}
Claude Code
Run in terminal:
claude mcp add bifrost go run github.com/maximhq/bifrost@latest
Cursor
Add to .cursor/mcp.json:
{
  "mcpServers": {
    "bifrost": {
      "command": "go",
      "args": [
        "run",
        "github.com/maximhq/bifrost@latest"
      ]
    }
  }
}
Topics
ai-gateway gateway gateway-services generative-ai guardrails llm llm-cost llm-gateway llm-observability llmops load-balancing mcp-client mcp-gateway mcp-server model-router