Python · MIT · v0.1.22

Know exactly what your
agents are spending.

Drop agentcents between your agent and any LLM provider. It tracks every call, enforces budgets, caches responses, and tells you exactly where your money is going.

$ pip install agentcents

No API keys · No accounts · No code changes required

How it works

A proxy that sits
in the middle.

Your Agent → agentcents (localhost:8082) → LLM Provider
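
The routing idea fits in a few lines: the proxy reads the upstream provider from a request header, forwards the call, and records spend per tag. A minimal sketch (not the actual agentcents internals; the header names come from the quick start below, the function itself is hypothetical):

```python
from urllib.parse import urljoin

def route(headers: dict, path: str) -> tuple[str, str]:
    """Resolve where to forward a proxied request and how to tag its spend.

    Hypothetical sketch: the upstream provider comes from X-Agentcents-Target,
    the optional spend tag from X-Agentcents-Tag.
    """
    target = headers["X-Agentcents-Target"]            # e.g. https://api.anthropic.com
    tag = headers.get("X-Agentcents-Tag", "untagged")  # falls back to a default bucket
    upstream_url = urljoin(target + "/", path.lstrip("/"))
    return upstream_url, tag

url, tag = route(
    {"X-Agentcents-Target": "https://api.anthropic.com",
     "X-Agentcents-Tag": "my-agent"},
    "/v1/messages",
)
```

Because the target lives in a header rather than in code, the same proxy instance can front several providers at once.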
01
pip install agentcents

Install once. Works with Python ≥ 3.9.

02
agentcents start

Start the local proxy. Once per session.

03
point client at proxy

One header change. No other code changes.

04
agentcents usage

See exactly where every cent went.

Quick start

One base_url change.
That's it.

# Before
import anthropic
client = anthropic.Anthropic()

# After — point at the proxy
import anthropic
client = anthropic.Anthropic(
    base_url="http://localhost:8082",
    default_headers={
        "X-Agentcents-Target": "https://api.anthropic.com",
        "X-Agentcents-Tag":    "my-agent",  # optional
    },
)

# Everything else stays the same
response = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Hello"}],
)
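
The proxy also caches responses, so an exact repeat of a request never gets billed twice. The exact-match idea can be sketched as keying on a hash of the canonicalized request body (an illustration under that assumption, not the agentcents implementation):

```python
import hashlib
import json

_cache: dict[str, dict] = {}

def cache_key(payload: dict) -> str:
    # Canonicalize the request body so identical calls hash identically,
    # regardless of dict key order.
    body = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(body.encode()).hexdigest()

def cached_call(payload: dict, send) -> dict:
    """Return a cached response for an exact repeat, else forward upstream."""
    key = cache_key(payload)
    if key not in _cache:
        _cache[key] = send(payload)  # only the first occurrence costs money
    return _cache[key]
```

A second `messages.create` call with a byte-identical payload would then be answered locally for free.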

CLI reporting

$ agentcents usage
─────────────────────────────────
Total spend (last 48h) $3.42
my-agent $2.10 61%
research-agent $1.32 39%
─────────────────────────────────
Daily budget $5.00
Remaining $1.58

$ agentcents recent
claude-opus-4-6 512 tok $0.008 my-agent
claude-sonnet-4-6 1.2k tok $0.003 research-agent
gpt-4o 800 tok $0.006 my-agent
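
The reports above reduce to simple aggregation over per-call records: sum cost per tag, subtract the total from the daily budget. A sketch with made-up records mirroring the `agentcents recent` output (the record shape is hypothetical, not the tool's storage format):

```python
# Hypothetical per-call records like those shown by `agentcents recent`.
records = [
    {"tag": "my-agent",       "cost": 0.008},
    {"tag": "research-agent", "cost": 0.003},
    {"tag": "my-agent",       "cost": 0.006},
]

def spend_by_tag(records: list[dict]) -> dict[str, float]:
    """Total spend per tag, as in the `agentcents usage` breakdown."""
    totals: dict[str, float] = {}
    for r in records:
        totals[r["tag"]] = totals.get(r["tag"], 0.0) + r["cost"]
    return totals

def remaining_budget(records: list[dict], daily_budget: float) -> float:
    """What is left of the daily budget after all recorded calls."""
    return daily_budget - sum(r["cost"] for r in records)
```

Budget enforcement is then just a comparison: once `remaining_budget` hits zero, the proxy can alert or hard-block further calls.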

Pricing

Free forever.
Pro for serious agents.

Free
$0
Open source, no account needed. Everything you need to get started tracking costs.
  • Proxy + cost logging
  • Exact-match cache
  • Budget alerts + hard block
  • Full CLI reporting
  • Web dashboard
  • Local Ollama tracking
pip install agentcents
Pro
Pay once
One-time license key. Activate on any machine with agentcents activate <key>.
  • Everything in Free
  • Semantic similarity cache
  • Multi-agent TUI dashboard
  • Live call watch
  • Model swap advisor
  • Auto-routing (swap mode)
  • XGBoost cost predictor
Get Pro →

Supported providers

Any provider that speaks
OpenAI format.

Anthropic
OpenAI
Google Gemini
OpenRouter
Groq
Ollama (local)

Model pricing data syncs automatically at proxy startup from OpenRouter and LiteLLM. No manual updates needed.
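
Because routing is header-driven, the same two headers work against any of these providers. At the HTTP level a proxied chat request looks roughly like this (stdlib sketch; the request is constructed but not sent, and the payload is illustrative):

```python
import json
import urllib.request

# Build a chat request aimed at the local proxy. The proxy forwards it to
# whichever provider X-Agentcents-Target names (OpenAI here, illustrative).
req = urllib.request.Request(
    "http://localhost:8082/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode(),
    headers={
        "Content-Type": "application/json",
        "X-Agentcents-Target": "https://api.openai.com",
        "X-Agentcents-Tag": "my-agent",
    },
    method="POST",
)
```

Swapping providers means changing only the `X-Agentcents-Target` value; the rest of the request is untouched.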