Every project starts the same way. You have data. You need to serve it. So you spin up an Express server, connect it to a database, add auth middleware, deploy to a cloud provider, configure SSL, set up monitoring, and now you have ops.
You didn't want ops. You wanted to serve JSON.
What if you didn't need a server at all?
APIs need servers. Servers need ops. Ops need money. That's the tax the standard architecture charges you just to serve data.
This is insane for data that changes once every few minutes. You're paying for a 24/7 server to respond to requests for data that was last updated an hour ago. It's like hiring a librarian to stand next to a book in case someone wants to read it.
Here's the insight: if your data changes less than once per minute, you don't need a server. You need a file host.
The git repo IS the database. The CDN IS the server. The commit log IS the audit trail. Pushing a JSON file IS deploying the API.
A Static API needs exactly three files:
All your data in a single JSON file. For the Mars sim, this is every frame — environmental readings, colony state, scores, events. One GET request, all data.
// GET https://kody-w.github.io/mars-barn-opus/data/frames.json
[
{"sol": 1, "temp": -60.2, "dust": 0.31, "power": 800, "alive": true, ...},
{"sol": 2, "temp": -58.7, "dust": 0.35, "power": 785, "alive": true, ...},
// ... 500 frames, ~200KB total
]
A tiny file (< 1KB) that tells clients whether anything changed. Poll this. If it hasn't changed, don't fetch the big bundle.
// GET https://kody-w.github.io/mars-barn-opus/data/latest.json
{
"sol": 147,
"timestamp": "2025-04-01T12:00:00Z",
"frameCount": 147,
"hash": "a1b2c3d4e5f6...",
"engine": "mars-barn-engine-v2"
}
Lists all files, their sizes, and their hashes. Enables tooling, discovery, and federation between repos.
// GET https://kody-w.github.io/mars-barn-opus/data/manifest.json
{
"spec": "static-api",
"files": [
{"path": "data/frames.json", "size": 245000, "hash": "a1b2c3d4..."},
{"path": "data/latest.json", "size": 180, "hash": "e5f6a7b8..."}
],
"federation": [
{"repo": "other-user/mars-colony-2", "url": "https://..."}
]
}
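Rebuilding the manifest is a short script. Here's a sketch in Python, assuming the file layout above (the function names are mine, not part of any spec):

```python
import hashlib
import json

def file_entry(path: str) -> dict:
    """One manifest entry: path, byte size, and SHA-256 of the contents."""
    with open(path, "rb") as f:
        data = f.read()
    return {"path": path, "size": len(data),
            "hash": hashlib.sha256(data).hexdigest()}

def build_manifest(paths: list[str]) -> dict:
    """Assemble a static-api manifest for the given files."""
    return {"spec": "static-api", "files": [file_entry(p) for p in paths]}
```

Run it over `data/frames.json` and `data/latest.json` after every engine tick and write the result to `data/manifest.json`.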
The Mars Barn Opus uses exactly this pattern. Here's the flow:
1. The engine generates the next frame and rewrites frames.json with it appended.
2. It computes the SHA-256 hash of the bundle and updates latest.json and manifest.json.
3. Clients poll latest.json. When the hash changes, they fetch the new bundle and diff for new frames.

The result: a full API serving structured sim data to multiple consumers, with zero servers, zero cost, zero ops.
The client pattern is simple: poll, diff, process.
const LATEST = 'https://kody-w.github.io/mars-barn-opus/data/latest.json';
const FRAMES = 'https://kody-w.github.io/mars-barn-opus/data/frames.json';
let cache = JSON.parse(localStorage.getItem('mars-cache') || '{}');
async function poll() {
const latest = await fetch(LATEST).then(r => r.json());
if (latest.hash !== cache.hash) {
const frames = await fetch(FRAMES).then(r => r.json());
const newFrames = frames.filter(f => f.sol > (cache.lastSol || 0));
processNewFrames(newFrames);
cache = { hash: latest.hash, lastSol: latest.sol };
localStorage.setItem('mars-cache', JSON.stringify(cache));
}
}
setInterval(poll, 30000); // every 30 seconds
Key details: the hash check means unchanged data costs one tiny GET instead of a full bundle fetch, the sol filter means only new frames get processed, and localStorage persistence means the cache survives page reloads.
The engine side is a pipeline: generate → bundle → commit → push.
# Engine pipeline (runs on any machine, CI/CD, or cron)
python3 engine/generate_frame.py --sol 148
python3 engine/rebuild_bundle.py # rebuilds frames.json, latest.json, manifest.json
git add data/frames.json data/latest.json data/manifest.json
git commit -m "sol 148: dust storm peaks, power at 45%"
git push origin main
This can run anywhere: your laptop, a GitHub Action, a Raspberry Pi, a cron job on a $5 VPS. The engine doesn't serve traffic. It just pushes JSON. The CDN does the serving.
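The rebuild step is the only interesting one. Here's a minimal sketch of what rebuild_bundle.py might do, assuming the frame and latest.json shapes shown earlier (the function and argument names are mine):

```python
import hashlib
import json
from datetime import datetime, timezone

def rebuild_bundle(frames: list[dict], new_frame: dict) -> tuple[str, str]:
    """Append the new frame, then return (frames.json body, latest.json body)."""
    frames = frames + [new_frame]
    frames_body = json.dumps(frames)
    # Hash the exact bytes that get committed, so clients can verify downloads.
    bundle_hash = hashlib.sha256(frames_body.encode()).hexdigest()
    latest_body = json.dumps({
        "sol": new_frame["sol"],
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "frameCount": len(frames),
        "hash": bundle_hash,
    })
    return frames_body, latest_body
```

Because the hash is computed over the committed bytes, latest.json doubles as both a change signal and an integrity check.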
| Metric | Traditional API | Static API |
|---|---|---|
| Latency (cache hit) | 50-200ms | 5-20ms (CDN edge) |
| Latency (cache miss) | 100-500ms | 50-100ms (CDN origin) |
| Concurrent readers | Limited by server | Unlimited (CDN) |
| Cost per million reads | $0.50-5.00 | $0.00 (GitHub Pages) |
| Uptime | 99.9% if you're good | GitHub's uptime (their problem, not yours) |
| Time to deploy | Minutes (CI/CD) | Seconds (git push) |
| Time to set up | Hours-days | Minutes |
The CDN edge-caches your files globally. A reader in Tokyo gets the same latency as a reader in San Francisco. Try getting that from your Express server without a CDN layer you have to configure yourself.
The GitHub REST API has strict rate limits: 60 requests/hour for unauthenticated clients, 5,000/hour authenticated.
If you're polling every 30 seconds, you burn 120 requests/hour per client. A single unauthenticated client blows through the limit in half an hour.
Solution: Use raw URLs or GitHub Pages. Both serve static files straight from the CDN, with no practical rate limit on GETs.
// ✗ BAD: Uses GitHub API (rate limited)
https://api.github.com/repos/kody-w/mars-barn-opus/contents/data/frames.json
// ✓ GOOD: Raw URL (no rate limit)
https://raw.githubusercontent.com/kody-w/mars-barn-opus/main/data/frames.json
// ✓ BEST: GitHub Pages (no rate limit, custom domain support, proper CDN caching)
https://kody-w.github.io/mars-barn-opus/data/frames.json
The pattern isn't Mars-specific. Any system where data changes infrequently and reads outnumber writes can use Static APIs:
- Sensor dashboards: devices push readings.json. Dashboard polls.
- Leaderboards: scores.json on a CDN. Every viewer sees the same data.
- Feature flags: config.json in git. Clients poll. No config server needed.
- Status pages: status.json on a CDN. No status page service needed.

Here's where it gets interesting. Each git repo is an independent API instance. Federation is just discovery.
This is RSS for structured data. Each repo is a feed. Crawlers discover feeds. No central server. The whole network is static files discovering other static files.
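Discovery is just a breadth-first walk over manifests. Here's a sketch, with the fetcher injected so the traversal logic has no hidden network dependency (the crawl function and its URL handling are mine, not part of the article's spec):

```python
import json
from urllib.request import urlopen

def crawl(start_url: str, fetch=None, max_repos: int = 50) -> dict:
    """Follow `federation` links between static-api manifests, breadth-first."""
    if fetch is None:
        fetch = lambda url: json.load(urlopen(url))  # real HTTP by default
    seen, queue, manifests = set(), [start_url], {}
    while queue and len(manifests) < max_repos:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        manifests[url] = fetch(url)
        # Each manifest may point at peer repos; enqueue them for discovery.
        for peer in manifests[url].get("federation", []):
            queue.append(peer["url"])
    return manifests
```

The seen set makes cycles between repos harmless, which matters once federation links point both ways.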
No server means no server-side security. Instead, security is cryptographic and verifiable:
- latest.json publishes the hash of frames.json. Clients verify after download. Tampered data is detected.
- Every file's size and hash is listed in manifest.json. Anyone can verify.

You don't need a server to have integrity. You need math.
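Client-side verification is a single hash comparison. A sketch (the helper name is mine; the published hash is the one latest.json carries):

```python
import hashlib
import json

def load_verified(bundle_bytes: bytes, published_hash: str) -> list:
    """Parse frames.json only if its bytes match the published SHA-256."""
    actual = hashlib.sha256(bundle_bytes).hexdigest()
    if actual != published_hash:
        raise ValueError("bundle hash mismatch: tampered or stale download")
    return json.loads(bundle_bytes)
```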
Static APIs are not a universal replacement. They break down when data changes many times per minute, when responses must be personalized per user, when clients need to write, or when data must stay private.
The question isn't "can I use a Static API?" It's "does my data change slowly enough that I can?" For a shocking number of projects, the answer is yes.
The frame IS the API call.
The repo IS the server.
Git IS the database.
Push IS deploy.
We've been so conditioned to "build a backend" that we forget what a backend does: it stores data and serves it when asked. If you can store data in git and serve it via CDN, you have a backend. It's just not a server. It's a file.
The Mars Barn Opus serves its entire simulation state — 500+ frames of environmental data, colony state, scores, and chain blocks — as three static JSON files in a git repo. Zero servers. Zero cost. Multiple consumers polling simultaneously. It's been running for months.
If your data changes less than once per minute, you don't need a server. You need a push.
The frame is the API call. The repo is the server. Git is the database. Static files are the API. Everything else is ops you don't need.