★ 441 stars
Rust
🤖 AI/LLM
Updated today
Enterprise AI bastion host for secure AI API and MCP access, with unified proxying, RBAC, audit logs, rate limiting, and cost tracking across OpenAI, Anthropic, Gemini, and self-hosted LLMs.
View on GitHub →
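As a rough sketch of what "unified proxying" means in practice: clients keep speaking an OpenAI-compatible API, but point at the gateway's base URL instead of the provider. The endpoint path, port, and key format below are illustrative assumptions, not taken from the project; ThinkWatch's actual endpoints and auth scheme are documented in its README.

```python
import json
import urllib.request

# Hypothetical gateway address and API key (assumed for illustration).
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"
API_KEY = "tw-example-key"

payload = {
    "model": "gpt-4o",  # the gateway routes the model name to the right provider
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    GATEWAY_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send it; the gateway applies RBAC,
# rate limiting, audit logging, and cost tracking before forwarding.
print(req.full_url)
```

The point of the bastion-host pattern is that provider credentials live only on the gateway; callers authenticate with gateway-issued keys and never see the upstream secrets.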
Quick Install
Copy the configuration for your client. Some servers need additional setup; check the README.
Claude Desktop
Claude Code
Cursor
Add to claude_desktop_config.json:
{
  "mcpServers": {
    "thinkwatch": {
      "command": "cargo",
      "args": ["run", "--", "ThinkWatch"]
    }
  }
}
Run in terminal:
claude mcp add thinkwatch cargo run -- ThinkWatch
Add to .cursor/mcp.json:
{
  "mcpServers": {
    "thinkwatch": {
      "command": "cargo",
      "args": ["run", "--", "ThinkWatch"]
    }
  }
}
README Excerpt
[ThinkWatch logo, followed by Rust and React badges]
Tools (6)
api_key mcp_server output_multiplier provider team user
Topics
ai ai-gateway ai-security ai-tools mcp mcp-gateway mcp-security mcp-server security