protodex: MCP Server Index
bifrost
maximhq/bifrost
★ 2,919 stars
Go
🤖 AI/LLM
Updated today
Fastest enterprise AI gateway (50x faster than LiteLLM) with adaptive load balancing, cluster mode, guardrails, support for 1,000+ models, and <100 µs overhead at 5k RPS.
View on GitHub →
Topics
ai-gateway
gateway
gateway-services
generative-ai
guardrails
llm
llm-cost
llm-gateway
llm-observability
llmops
load-balancing
mcp-client
mcp-gateway
mcp-server
model-router
Related AI/LLM Servers
gemini-cli
★ 97.7K
An open-source AI agent that brings the power of Gemini directly into your terminal.
context7
★ 49.0K
Context7 Platform: up-to-date code documentation for LLMs and AI code editors
LocalAI
★ 43.6K
The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement, ...
LibreChat
★ 34.6K
Enhanced ChatGPT Clone: Features Agents, MCP, DeepSeek, Anthropic, AWS, OpenAI, Responses API, Azure, Groq, o1,...