The local-first computing movement arose as a corrective to cloud dependence. Keep your data on your device, work offline, own your data and your software. All good things.
But local is constrained. Your local machine has limited storage and compute. You can't store all of the web locally. And the hardware you can buy is only as capable as today's state of the art.
So, practically speaking, you can't do everything locally that you might want to. And for some operations, remote is actually preferred, even if you did have all the storage/compute/data/money in the world locally.
Consider AI models. You'd prefer to run them locally. Lower latency, no API costs, works offline, guaranteed privacy. But even if you could get the frontier labs to give you their model weights, your laptop certainly doesn't have the juice to run frontier models locally. And you probably don't want to buy (and carry around) a machine beefy enough to get close. Economically it makes sense for the inference to run in a hyperscaler's datacenter.
Consider backups. You could store your backups locally. A local backup protects against software failures. But if your computer blows up, your local backup blows up too. The value you actually want — protection from catastrophic loss — requires offsite storage. The best backup service is remote. Geographic separation is the feature.
Consider document collaboration. When you collaborate with someone else, they are remote to you. Local multiplayer — two people working on the same machine — is an edge case. The collaboration you actually want is inherently remote.
Local gives you real benefits — speed, offline access, control, privacy. But sometimes remote is a practical necessity, and sometimes remote is the actual benefit.
2025-11-03