Python · MIT · v0.1.22
Drop agentcents between your agent and any LLM provider. It tracks every call, enforces budgets, caches responses, and tells you exactly where your money is going.
No API keys · No accounts · No code changes required
## How it works
1. Install once. Works with Python ≥ 3.9.
2. Start the local proxy, once per session.
3. Change one header in your client. No other code changes.
4. See exactly where every cent went.
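The header-based routing in step 3 can be sketched in a few lines. The `X-Agentcents-*` header names match the Quick start; the routing logic itself is an illustrative assumption, not agentcents' actual implementation:

```python
def resolve_upstream(headers: dict, path: str) -> tuple[str, dict]:
    """Sketch: route a request based on the X-Agentcents-Target header.

    The target header names the real provider; the proxy forwards the
    request there, stripping its own X-Agentcents-* headers so the
    upstream never sees them. (Illustrative assumption, not the real code.)
    """
    target = headers["X-Agentcents-Target"].rstrip("/")
    forwarded = {
        k: v for k, v in headers.items()
        if not k.startswith("X-Agentcents-")
    }
    return target + path, forwarded


url, fwd = resolve_upstream(
    {
        "X-Agentcents-Target": "https://api.anthropic.com",
        "X-Agentcents-Tag": "my-agent",
        "content-type": "application/json",
    },
    "/v1/messages",
)
# url == "https://api.anthropic.com/v1/messages";
# fwd keeps only the content-type header
```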
## Quick start
```python
# Before
import anthropic

client = anthropic.Anthropic()

# After — point at the proxy
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:8082",
    default_headers={
        "X-Agentcents-Target": "https://api.anthropic.com",
        "X-Agentcents-Tag": "my-agent",  # optional
    },
)

# Everything else stays the same
response = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Hello"}],
)
```
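Because identical requests are cached, a repeated prompt need not hit the provider (or your budget) twice. A minimal sketch of how such a cache could be keyed; the hashing scheme here is an assumption, not agentcents' actual implementation:

```python
import hashlib
import json

_cache: dict = {}

def cache_key(model: str, messages: list) -> str:
    # Canonical JSON so key ordering inside messages doesn't change the hash.
    blob = json.dumps({"model": model, "messages": messages}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

def cached_call(model: str, messages: list, send):
    """Return a cached response when this exact request was seen before."""
    key = cache_key(model, messages)
    if key not in _cache:
        _cache[key] = send(model, messages)  # only runs on a cache miss
    return _cache[key]

# Demonstration with a stand-in for the real provider call:
calls = []
def fake_send(model, messages):
    calls.append(model)
    return {"text": "Hi!"}

msgs = [{"role": "user", "content": "Hello"}]
cached_call("claude-opus-4-6", msgs, fake_send)
cached_call("claude-opus-4-6", msgs, fake_send)
# fake_send ran once; the second call was served from cache
```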
## CLI reporting
## Pricing
## Supported providers
Pricing syncs automatically on proxy startup from OpenRouter and LiteLLM, so no manual updates are needed.
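Once per-token prices are synced, per-call cost is plain arithmetic. A sketch, where the model name and rates are made-up examples rather than real synced prices:

```python
# USD per million tokens; illustrative numbers only, not real rates.
PRICES = {"example-model": {"input": 3.00, "output": 15.00}}

def call_cost(model: str, input_tokens: int, output_tokens: int,
              prices: dict = PRICES) -> float:
    """Cost of one call: tokens in each direction times the per-token rate."""
    p = prices[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# 1,000 input tokens and 500 output tokens at the example rates:
print(call_cost("example-model", 1_000, 500))  # 0.003 + 0.0075 = 0.0105
```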