★ 97 stars · Rust · 🤖 AI/LLM · Updated today
Hook-based token compressor for 5 AI CLI hosts (Claude Code, Copilot CLI, OpenCode, Gemini CLI, Codex CLI). Up to 95% bash compression, signature-mode for code reads, cross-call dedup, MCP server, self-teaching protocol. Zero runtime deps.
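The cross-call dedup mentioned above can be pictured with a minimal sketch: cache a hash of each payload the model has already seen, and collapse repeats to a tiny reference marker. This is illustrative only; `DedupCache` and its API are hypothetical and not squeez's actual types.

```rust
use std::collections::hash_map::{DefaultHasher, Entry};
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

/// Cross-call dedup sketch (illustrative, not squeez's real code):
/// the first time a payload is seen it passes through verbatim; any
/// repeat collapses to a short reference marker, saving tokens.
struct DedupCache {
    seen: HashMap<u64, usize>, // content hash -> id of first occurrence
}

impl DedupCache {
    fn new() -> Self {
        DedupCache { seen: HashMap::new() }
    }

    fn compress(&mut self, content: &str) -> String {
        let mut h = DefaultHasher::new();
        content.hash(&mut h);
        let next_id = self.seen.len();
        match self.seen.entry(h.finish()) {
            Entry::Occupied(e) => format!("[dup of #{}]", e.get()),
            Entry::Vacant(v) => {
                v.insert(next_id);
                content.to_string()
            }
        }
    }
}

fn main() {
    let mut cache = DedupCache::new();
    let src = "fn add(a: i32, b: i32) -> i32 { a + b }";
    assert_eq!(cache.compress(src), src);           // first read: full text
    assert_eq!(cache.compress(src), "[dup of #0]"); // repeat: marker only
    println!("ok");
}
```

A real implementation would also have to decide when cached entries expire, since a file can change between reads.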
Quick Install
Copy the config that matches your client. Some servers need additional setup; check the README.
Add to claude_desktop_config.json:
{
  "mcpServers": {
    "squeez": {
      "command": "cargo",
      "args": ["run", "--", "squeez"]
    }
  }
}
Run in terminal:
claude mcp add squeez cargo run -- squeez
Add to .cursor/mcp.json:
{
  "mcpServers": {
    "squeez": {
      "command": "cargo",
      "args": ["run", "--", "squeez"]
    }
  }
}
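The signature-mode for code reads described above can also be sketched: instead of returning a whole file, return only the function signatures so a read costs far fewer tokens. The line-based filter below is a naive illustration under that assumption; squeez's actual extraction is presumably syntax-aware, and `extract_signatures` is a hypothetical name.

```rust
/// Naive signature-mode sketch (illustrative, not squeez's real code):
/// keep top-level `fn` signature lines, drop everything else,
/// so the model sees an API outline instead of full bodies.
fn extract_signatures(src: &str) -> String {
    src.lines()
        .map(str::trim_start)
        .filter(|l| l.starts_with("fn ") || l.starts_with("pub fn "))
        .map(|l| l.trim_end_matches('{').trim_end())
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let src = "pub fn parse(input: &str) -> Ast {\n    // many lines...\n}\n\nfn helper(x: u32) -> u32 {\n    x + 1\n}\n";
    let sigs = extract_signatures(src);
    assert_eq!(sigs, "pub fn parse(input: &str) -> Ast\nfn helper(x: u32) -> u32");
    println!("{sigs}");
}
```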
Topics
ai-cli, bash-hook, claude-code, codex-cli, context-engineering, context-window, copilot-cli, gemini-cli, llm, llm-tools, mcp-server, opencode, rust, session-memory, signature-extraction