bifrost

maximhq/bifrost
★ 2,919 stars · Go · 🤖 AI/LLM · Updated today
The fastest enterprise AI gateway (50x faster than LiteLLM), with an adaptive load balancer, cluster mode, guardrails, support for 1000+ models, and <100 µs overhead at 5k RPS.

Topics

ai-gateway · gateway · gateway-services · generative-ai · guardrails · llm · llm-cost · llm-gateway · llm-observability · llmops · load-balancing · mcp-client · mcp-gateway · mcp-server · model-router