Estimated Top 0.01–0.1% of Claude Users by Token Consumption

9,302,790,171

Total tokens processed by Bryan Leonard — 32x the Max plan

Total Tokens
Total Cost
Sessions
Days Active
Token usage across four simultaneous independent projects: EGC (consciousness research), LOLM (custom language model training), Codey (AI coding platform), and NFET (traffic optimization). Token counts are read from local Claude Code session logs. Cost figures are estimates based on published API pricing — not pulled from invoices. No institutional backing, no team, from Phoenix, Arizona. Data exported via ccusage, updated manually.

Daily Usage

Token consumption and cost by day

What $208/day of AI compute actually looks like

Model Breakdown

Usage by model

Opus for deep research, Sonnet for building, Haiku for speed

Cost Heatmap

Daily spend intensity

Less
More

Burn Rate Projection

At current pace, where is this heading?

At this rate, the projected annual spend exceeds $41,000

Project Breakdown

Where the tokens actually went

Four simultaneous projects, zero institutional backing

Monthly Totals

Aggregated by month

Token Flow

How tokens move through the system

Session Explorer

All sessions ranked by cost

Every individual Claude session logged

Project · Tokens · Cost · Models · Last Active

How This Compares

Bryan's usage vs typical Claude Code users — sourced from public data

Where This Sits

Among the most thoroughly documented heavy Claude users on public record.

User | Tokens | Period | Cost | Source
Bryan Leonard | 11.7B | 45 days | $8,323 (est. API equivalent) | ccusage data
ksred (ksred.com) | 10B | 8 months | ~$800 subscription ($15K+ API equiv.) | ksred.com
Average developer | ~180M | 30 days | ~$180 | Anthropic docs
90th percentile dev | ~360M | 30 days | ~$360 | Anthropic docs
Multi-agent workflows | variable | per day | $30–50/day | r/ClaudeCode
Key distinction: All dollar amounts here are estimated API-equivalent costs, calculated by applying Anthropic's published per-token pricing to locally logged token counts. Bryan uses the API directly (not a Max subscription); the ksred comparison, the closest publicly documented case, ran on a $200/month Max subscription. Bryan's estimated $8,323 is what this token volume would cost at published API rates, and his token consumption rate is approximately 8x ksred's.

What This Paid For

EGC Research
44+ research subjects · Live empirical study · Pearson r=0.311 · Peer-reviewable dataset on consciousness and expression
LOLM
Custom language model · 10B-100B parameter target · Google TPU Research Cloud grant · Training infrastructure built from scratch
Codey
AI coding intelligence platform · Live on Render · Stripe billing integrated · 9 specification documents
NFET
Traffic optimization system · Kuramoto oscillator modeling · Live AZ-511 data feed · Phoenix metro area
How this data is collected: Token counts are exported via ccusage, an open-source CLI tool that reads Claude Code's local JSONL session logs from ~/.claude/. The counts themselves are taken verbatim from those logs. Dollar amounts are estimates produced by applying Anthropic's published per-token API pricing to those counts; they are not pulled from actual invoices. Over 95% of the tokens are cache reads ($0.50/M for Opus), so the cost estimates are approximate but directionally accurate.
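The estimate described above reduces to a small amount of code: walk the JSONL session logs, sum token counts by category, and multiply by per-million-token prices. The sketch below illustrates that method only; the log field names (`usage`, `input_tokens`, etc.) and the pricing table are illustrative assumptions, not ccusage's actual implementation, and prices should be checked against Anthropic's current pricing page.

```python
# Minimal sketch of an API-equivalent cost estimate from local JSONL session
# logs. Field names and prices are assumptions for illustration.
import json
from pathlib import Path

# Assumed per-million-token prices in USD (Opus-style tiers; verify against
# Anthropic's published pricing before relying on the output).
PRICING = {
    "input": 15.00,       # uncached input tokens
    "output": 75.00,      # output tokens
    "cache_read": 0.50,   # cache-read tokens, the bulk of this dataset
}

def estimate_cost(log_dir: str) -> float:
    """Sum token counts across *.jsonl logs and convert to estimated USD."""
    totals = {key: 0 for key in PRICING}
    for log_file in Path(log_dir).glob("**/*.jsonl"):
        for line in log_file.read_text().splitlines():
            if not line.strip():
                continue
            usage = json.loads(line).get("usage", {})  # assumed log schema
            totals["input"] += usage.get("input_tokens", 0)
            totals["output"] += usage.get("output_tokens", 0)
            totals["cache_read"] += usage.get("cache_read_input_tokens", 0)
    # Convert each category from raw tokens to millions, then price it.
    return sum(totals[k] / 1_000_000 * PRICING[k] for k in PRICING)
```

Because cache reads dominate the token mix, the estimate is most sensitive to the cache-read price; a small error there moves the total far more than an error in the input or output rates.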
Sources: ksred.com · Anthropic docs · UsageBox · r/ClaudeCode