Stop burning data bundles on AI tools.
Sandgrouse optimizes LLM API traffic so developers on metered connections get full AI power at a fraction of the data cost.
npx sandgrouse
Works with Claude Code, Cursor, ChatGPT, and any OpenAI-compatible tool.
A 1.2GB data bundle shouldn't disappear after two Claude Code prompts.
LLM coding tools send your entire conversation history, every file they've read, and an identical system prompt on every API request — and they send each request twice. A single prompt can trigger 50+ requests in a chain, each one larger than the last as context accumulates. Requests are 99% of your bandwidth, none of it is compressed, and the APIs won't accept it compressed either.
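The snowball effect above can be sketched with illustrative numbers (the system-prompt size and per-turn context growth below are assumptions for the sake of arithmetic, not measurements of any real tool):

```typescript
// Illustrative sketch, not Sandgrouse code: why upload size snowballs.
const SYSTEM_PROMPT_KB = 8;   // resent verbatim on every request (assumed size)
const CONTEXT_GROWTH_KB = 2;  // new history/file content added per turn (assumed)

function requestSizeKb(turn: number): number {
  // Each request carries the system prompt plus ALL accumulated context so far.
  return SYSTEM_PROMPT_KB + turn * CONTEXT_GROWTH_KB;
}

// Total upload for a 50-request chain, with every request sent twice:
let totalKb = 0;
for (let turn = 1; turn <= 50; turn++) {
  totalKb += 2 * requestSizeKb(turn);
}
console.log(totalKb); // 5900 KB with these assumed sizes
```

With these assumed sizes, one 50-request chain uploads roughly 5.9 MB, and because totals grow quadratically with chain length, dropping the duplicate copy alone cuts that in half.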
One proxy. Half the data. Full visibility.
Your AI tool (Claude Code, Cursor, etc.)
|
| Sends every request twice (~150KB each, growing)
v
Sandgrouse proxy (localhost:8080)
|
| Coalesces duplicates, compresses responses
| Shows you exactly where every byte goes
v
Cloud API (Anthropic, OpenAI)

Everything runs locally. No data is sent anywhere except the original API destination.
Get started in 30 seconds.
Step 1: Install
npm install -g sandgrouse
Step 2: Start the proxy
sg start
Step 3: Point your AI tools at it
export ANTHROPIC_BASE_URL=http://localhost:8080
Also available via Homebrew (brew install sandgrouse), direct download, or npx sandgrouse.

Real-time bandwidth dashboard at localhost:8585
Request coalescing
Claude Code sends every request twice. Sandgrouse catches the duplicate and drops it. ~50% bandwidth reduction, instantly.
Bandwidth dashboard
See exactly how much data each session, each request, each tool consumes. Real-time at localhost:8585.
Response compression
Applies gzip/brotli to API responses. Context deduplication and delta encoding are coming in v0.2.
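Why compression pays off here (an illustrative sketch, not Sandgrouse internals): LLM API responses are repetitive JSON, which standard gzip shrinks substantially.

```typescript
import { gzipSync } from "node:zlib";

// Build a mock API response: many near-identical JSON content blocks,
// similar in shape to what an LLM API streams back.
const response = JSON.stringify({
  content: Array.from({ length: 200 }, (_, i) => ({
    type: "text",
    text: `chunk ${i} of a fairly repetitive streaming payload`,
  })),
});

// Repetitive JSON compresses severalfold under plain gzip.
const compressed = gzipSync(response);
console.log(response.length, compressed.length);
```

On payloads like this, gzip routinely achieves severalfold reduction, and brotli usually does a little better on text.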
Why “sandgrouse”?
The sandgrouse is an African desert bird whose belly feathers absorb water like a sponge. Every morning, it flies up to 30 km to a waterhole, soaks its feathers, and flies back to its chicks: one of the most efficient water-transport systems in nature. Efficient data transport across bandwidth-scarce environments. That's what this does.