Almost every “cloud vs. local backup” article ends where it should begin. The writer picks a side, lists the obvious advantages, hand-waves the disadvantages, and publishes. You close the tab no closer to a decision.
The real question is not which tier is better. It is which failure modes you can tolerate for which data, and what the combination costs you in money, time, and attention. Cloud and local backup protect against different threats. They fail in different ways. A working strategy treats them as complementary, not competing.
This guide is a decision aid. It will not argue for any specific product. It will argue that most serious setups need both tiers, that the mix depends on your working data size and your upload bandwidth, and that “which one” is the wrong starting question. The complete guide to Mac backup in 2026 is the canonical overview this guide refines; the cost calculator runs the numbers for your library.
The four dimensions of the trade-off
Every backup decision lives on four axes, and thinking about them in isolation is where most arguments go wrong.
Speed. There are two speeds that matter, and they are not the same. Initial seed speed is how long the first full copy takes. Ongoing speed is how quickly each new snapshot completes. A 4 TB Lightroom library over a 40 Mbps residential uplink takes roughly ten days to seed to cloud storage; to a USB-C SSD it takes under two hours. Once seeded, incremental snapshots for a working photographer might be 20 to 80 GB per day, which is one to four and a half hours over that same uplink at the full line rate, often longer in practice, and a few minutes locally. Speed is not just a comfort concern. If your ongoing deltas exceed what your uplink can push in a day, cloud-only is not viable without changing how you work.
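The arithmetic behind those figures is simple enough to check yourself. A sketch in Python, assuming nominal line rates; real-world throughput is usually lower once protocol overhead and provider throttling bite:

```python
def transfer_hours(size_gb: float, rate_mbps: float) -> float:
    """Hours to move size_gb gigabytes at rate_mbps megabits per second."""
    bits = size_gb * 1e9 * 8               # decimal gigabytes -> bits
    return bits / (rate_mbps * 1e6) / 3600

seed_days = transfer_hours(4000, 40) / 24      # 4 TB over a 40 Mbps uplink
daily_delta_hours = transfer_hours(80, 40)     # 80 GB daily delta, same uplink
local_minutes = transfer_hours(80, 3200) * 60  # assumed ~400 MB/s USB-C SSD

print(f"{seed_days:.1f} days")            # 9.3 days at full line rate
print(f"{daily_delta_hours:.1f} hours")   # 4.4 hours
print(f"{local_minutes:.0f} minutes")     # 3 minutes
```

Plug in your own library size, daily delta, and uplink speed; if the daily delta takes longer than a day to push, the cloud tier can never catch up.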
Cost. Local storage has a one-time capital cost and an amortised hardware-replacement cost. Cloud storage is an operating cost forever, and cloud egress — the cost of downloading your own data back — is the line item that catches people out. We will walk through a concrete example later in this guide.
Recovery failure modes. Local drives fail. Houses burn. Laptops get stolen. Ransomware encrypts everything it can write to. Any one tier can be wiped out by the right kind of bad day. Tiers cover different bad days, and the overlap is the point.
Operational overhead. Every tier needs attention. External drives need to be plugged in, or the backup stops. Cloud backups need credentials rotated, retention reviewed, and restore paths tested. The right mix is the one you will actually maintain. A perfect system you abandon after six weeks is worse than an imperfect one you keep running.
When cloud wins
Cloud backup is the correct primary choice in a set of well-defined cases, and it is usually because the failure modes it covers are the ones local drives cannot.
Off-site protection is the whole point
The original argument for cloud backup has not changed: a fire, a flood, or a break-in takes everything in one room. If your laptop and your external drive are on the same desk, they share a single failure mode. Cloud backup puts a copy of your data in a building you will never visit, protected by infrastructure you will never see. That is the job. For anyone working from a single location — a home studio, a small office, a rented workspace — this alone is often enough to justify a cloud tier.
Ransomware resistance via immutable storage
Modern ransomware is not a smash-and-grab; it is a patient intruder that looks for connected backup drives and encrypts them before triggering. An external drive plugged into your Mac is, from the malware’s point of view, just another volume. Immutable cloud storage changes the game. When the backup engine writes a snapshot to an Object Lock–protected bucket in compliance mode, nothing — not the user, not the application, not the storage provider — can delete or overwrite it before its retention period elapses. An attacker who takes over your machine cannot reach back through the wire and shred your history. That property is difficult to reproduce locally without considerable discipline around power-cycling drives. On our managed cloud, immutability is not an opt-in. It is the default for every bucket, because a backup you can silently delete is not a backup.
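To make the immutability property concrete, here is roughly what a compliance-mode write looks like through an S3-compatible SDK such as boto3. The bucket name, key, and 90-day retention window are illustrative placeholders, not our product's actual values:

```python
from datetime import datetime, timedelta, timezone

# Parameters for an S3 Object Lock write in compliance mode, as accepted
# by S3-compatible SDKs (e.g. boto3's put_object). Bucket, key, and
# retention period below are hypothetical.
retain_until = datetime.now(timezone.utc) + timedelta(days=90)

put_args = {
    "Bucket": "backup-snapshots",               # hypothetical bucket name
    "Key": "snapshots/2026-01-15/manifest",
    "Body": b"snapshot payload",
    "ObjectLockMode": "COMPLIANCE",             # no one can shorten or remove it
    "ObjectLockRetainUntilDate": retain_until,  # immutable until this date
}

# With real credentials you would call:
#   boto3.client("s3").put_object(**put_args)
# Until retain_until passes, DeleteObject on that object version is refused,
# even for the account owner.
```

The key detail is the mode: governance-mode locks can be lifted by privileged users, while compliance-mode locks cannot, which is why the latter is what matters against an attacker holding your credentials.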
Zero-knowledge by design
A well-built cloud backup is zero-knowledge: your data is encrypted on your Mac with keys you control, and the storage operator never sees plaintext. This is worth stating plainly because a lot of consumer cloud backup is not zero-knowledge. If the provider can read your files, so can a subpoena, so can a breach. End-to-end encryption with a passphrase only you hold is the line that separates a backup you trust from storage you rent.
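The zero-knowledge property hinges on one step: the encryption key is derived on your Mac from a passphrase only you hold, and only the (public) salt and the ciphertext ever leave the machine. A minimal standard-library sketch of that derivation step, with illustrative parameters, not any provider's actual scheme:

```python
import hashlib
import os

# Client-side key derivation: the passphrase never leaves the machine.
# The salt is not secret and is stored alongside the backup; the derived
# key exists only in memory on the client. Iteration count is illustrative.
passphrase = b"correct horse battery staple"   # held by the user only
salt = os.urandom(16)                          # public, stored with the backup

key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, iterations=600_000)

# 'key' (256 bits) now encrypts the snapshot data locally. The storage
# operator receives ciphertext plus the salt, and can recover neither
# the passphrase nor the key from them.
```

Lose the passphrase and nobody, including the provider, can decrypt the backup; that is the cost of the property, and the reason passphrase escrow deserves as much thought as the backup itself.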
Multi-device aggregation and roaming recovery
If you work across a studio iMac and a travelling MacBook, cloud is the tier that lets both devices restore from anywhere. Lose the laptop in a taxi; replace the hardware; sign in; the data is already waiting. A local drive sitting on your desk at home cannot do that. For teams, the argument sharpens: the cloud tier is the shared source of truth that survives any one machine’s death.
When local wins
Local backup is not the nostalgic tier. It is the tier that makes the other tiers practical, and for certain workloads it is the one that actually gets used.
Fast first seed and fast large-file restore
Seeding a 12 TB video archive to cloud storage over a domestic connection is a multi-week project. Seeding the same archive to a Thunderbolt 4 RAID takes an afternoon. The same asymmetry applies on the way back. A full bare-metal restore from cloud is bottlenecked by your download speed; from a USB-C SSD, a 500 GB project restores in under fifteen minutes at drive-bus speeds. For anyone working with video masters, RAW photo shoots, or large DAW sessions, the local tier is not optional — it is the only tier that meets a realistic RTO.
No upload-bandwidth dependency
Not every working environment has symmetric gigabit fibre. Rural Starlink links, cable uplinks capped at 20 Mbps, hotel Wi-Fi, a cellular hotspot on a shoot day — any of these collapses a cloud-only strategy. Local backup depends on a cable, not a carrier. A 10 Gbps Thunderbolt link does not care that the ISP is throttling. For a videographer returning from a week on location with 2 TB of new footage, the first protective copy has to happen before the internet comes into it.
Bus-powered portability for travelling setups
For field work, a bus-powered NVMe drive the size of a stick of gum holds 4 TB. It fits in a laptop sleeve. It backs up today’s shoot over dinner, and then lives in a different bag from the laptop until you get home. That is a complete, valid, off-site-enough strategy for a one-person travel setup where cloud bandwidth is unreliable. The external drive chapter covers the mechanics. What matters here is the frame: local is not just “the tier at your desk.” Local is “the tier that fits in your pocket and is cheaper than lunch per terabyte.”
The “local, movable” pattern
A pair of rotated SSDs — one on-site, one at a friend’s house or a safe-deposit box — gives you most of what cloud gives you, for a flat one-time cost. It requires discipline. You have to actually rotate them, and most people do not. But if you will, and if your data volume makes cloud impractical, it is a legitimate pattern that predates cloud backup and still works.
The hybrid case — when you need both
For most creators and professionals, the honest answer is not cloud, and not local. It is both, with each tier doing what it is good at.
Local gives you speed. Cloud gives you survival. A good strategy is not a choice between them — it is a shape that uses each for what it does best.
The hybrid pattern works like this. Your working data lives on your Mac. A continuous local snapshot runs every hour to an external SSD sitting beside it, so that any “I just deleted the wrong folder” moment is a two-click restore with a sub-second RPO. A concurrent cloud snapshot streams the same changes to immutable off-site storage, so that any “the building caught fire” moment is recoverable — slower, but intact. The local tier handles accidental deletion, a failed drive, and rapid restoration of a corrupted project file. The cloud tier handles theft, fire, flood, ransomware, and the scenario where your local tier itself is what failed.
This is what the 3-2-1 rule, properly read, actually prescribes: three copies of your data, on two different media, with at least one copy off-site. Cloud plus local is the cleanest modern realisation of that rule, because it maps directly onto two independent technologies — spinning disk or NVMe in your hand, versus S3-compatible object storage on the other end of the wire — that a single disaster cannot take out together.
The hybrid model also decouples restore speed from bandwidth. A 400 GB project restored from the local tier is available in minutes. The cloud tier is there for the bad days, not the everyday. You pay a modest cloud subscription not to restore from it weekly, but to know that the one time you genuinely need it, it is there. Insurance makes sense even when you do not claim.
Cost math (with a real example)
Consider a working photographer with a 2 TB Lightroom library, five years of snapshot history retained, and a modest delta of roughly 30 GB per week of new RAWs, edits, and exports.
The local side
A 4 TB external USB-C SSD with the reliability profile you want for a primary backup is around £240 in 2026, replaced roughly every four years on a conservative rotation. Amortised, that is £60 per year, or £5 per month. A second rotation drive — stored off-site at a friend’s or a family home — doubles that to £10 per month. Local protection, fully paid for, amortised, with no recurring service bill.
The cloud side, managed tier
At our managed cloud pricing, the same 2 TB library with five-year retention costs in the region of £14 per month. That fee includes Object Lock immutability, zero-knowledge encryption, and no egress charge on restore. The last point matters: a provider who charges you to download your own data in a crisis turns a bad day into an expensive bad day. Budget this as a flat line item.
The cloud side, bring-your-own-storage
At BYOS S3 rates, the storage itself is cheaper per terabyte — roughly £8 per month for the same 2 TB, depending on provider. BYOS is the right answer when you already have a storage account you trust, when your volume climbs into the 10+ TB range, or when your organisation mandates a specific region or vendor for compliance. At 2 TB the difference between managed and BYOS is a few pounds a month; the decision is usually not cost, it is where you want the operational boundary.
The combined picture
Local at £5 to £10 per month. Cloud at £8 to £14 per month. Total for a fully redundant, hybrid 2 TB setup: roughly £13 to £24 per month, plus a four-yearly drive replacement. For context, that is less than a single month of most RAW-to-JPEG subscription-software bundles, and it covers every realistic scenario short of simultaneously losing both your home and your cloud provider in the same week.
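For readers who want to adapt the arithmetic to their own setup, the worked example reduces to a few lines. The figures are this guide's 2026 example prices, not quotes:

```python
# Local tier: amortise the drive cost over its replacement cycle.
drive_price_gbp = 240            # 4 TB USB-C SSD
drive_life_years = 4             # conservative replacement cycle
local_monthly = drive_price_gbp / drive_life_years / 12   # one drive
local_rotated = local_monthly * 2                         # plus off-site rotation drive

# Cloud tier: flat monthly line items for 2 TB with five-year retention.
managed_cloud = 14               # managed tier, per month
byos_cloud = 8                   # BYOS S3 rates, per month

low = local_monthly + byos_cloud      # single drive + BYOS
high = local_rotated + managed_cloud  # rotated pair + managed tier

print(f"£{low:.0f} to £{high:.0f} per month")   # prints "£13 to £24 per month"
```

Swap in your own drive price, replacement cycle, and storage quotes; the shape of the calculation stays the same at any library size.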
Scale this up. At 8 TB the managed tier climbs; at 20 TB BYOS pulls ahead decisively. The cost calculator lets you run your own numbers rather than take this example on faith.
Failure modes each tier covers, and doesn’t
Here is the direct table, because abstraction hides the point.
- Accidental deletion. Local covers it; cloud covers it. Both tiers keep snapshot history. The local tier restores faster.
- Single drive failure. Local covers it, unless the drive that failed is the backup drive itself. Cloud covers it completely. The drive failure guide goes into the mechanics.
- Mac hardware failure. Local covers it if the external drive is intact. Cloud covers it unconditionally, and lets you restore from a replacement machine anywhere.
- Theft of your Mac and drives together. Local does not cover it. Cloud covers it.
- Fire, flood, building-wide loss. Local with rotated off-site drives covers it, if you rotate. Cloud covers it by default.
- Ransomware that encrypts connected volumes. Local does not cover it unless the drive was physically disconnected when the attack ran. Immutable cloud covers it.
- Provider compromise or account takeover. Cloud does not fully cover this; local does. This is precisely why both tiers exist.
- Catastrophic human error — wiping the wrong machine. Snapshot history in both tiers covers it, because snapshots are immutable at the storage layer.
- Not testing your restores. Neither tier covers this. The restore you never ran is a restore that does not work. Schedule one.
The pattern is clear. No single tier is sufficient. Each tier is the other’s insurance.
A decision tree you can actually use
Written as conditionals, in roughly the order you should think through them.
- If your working data is under 1 TB and you have fibre or cable with more than 40 Mbps upload, start cloud-only with a managed tier. Add a local tier later if your restore patterns demand faster recovery. This is the simplest possible setup, and for a writer, a software developer, or a non-video creator, it is often enough on its own for a year or more.
- If your working data is between 1 and 4 TB, run both tiers from day one. Local for speed; cloud for resilience. The combined cost is low. The protection is effectively complete. This is the sweet spot for most photographers, designers, and music producers.
- If your working data is over 4 TB, or if you routinely generate 100+ GB of deltas per day, start local-first. A Thunderbolt RAID or a NAS handles your daily snapshots; cloud handles a curated subset — final deliverables, masters, and irreplaceable originals — rather than every scratch file. BYOS S3 at TB volumes usually beats managed-tier cloud on price.
- If your upload bandwidth is below 20 Mbps or unreliable, local is your primary tier. Cloud is a slow secondary tier for a curated subset of data. Seed the cloud tier once over a weekend or via a physical seed drive, then let incremental changes catch up.
- If you work from multiple locations, or multiple Macs, cloud is non-negotiable. Local alone cannot give you roaming restore. Run local at your main location for speed; rely on cloud everywhere else.
- If you are part of a team or a small business, cloud is the shared tier of record, with per-device local backup for each workstation. The cloud tier doubles as your operational continuity plan: a stolen laptop is a new laptop and a re-sign-in, not a lost project.
- If you have not tested a restore in the last ninety days, stop reading and go test one now. The best strategy on paper is worse than a mediocre one with a verified restore path.
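The conditionals above can be collapsed into a rough helper. The thresholds come straight from the list; the check order and the return strings are shorthand of our own, not product recommendations:

```python
def backup_shape(data_tb: float, upload_mbps: float, daily_delta_gb: float = 0,
                 multi_location: bool = False, team: bool = False) -> str:
    """Map the decision tree above to a recommended backup shape.

    Team and multi-location checks run first because they override the
    size- and bandwidth-based defaults.
    """
    if team:
        return "cloud as shared tier of record + per-device local"
    if multi_location:
        return "cloud (non-negotiable) + local at main location"
    if upload_mbps < 20:
        return "local primary + slow cloud secondary for a curated subset"
    if data_tb > 4 or daily_delta_gb >= 100:
        return "local-first + cloud for curated masters (BYOS at volume)"
    if data_tb < 1 and upload_mbps > 40:
        return "cloud-only managed tier; add local later if needed"
    return "both tiers from day one"

print(backup_shape(2, 60))     # 2 TB photographer on decent fibre
print(backup_shape(0.5, 100))  # writer with a small library on fast upload
```

None of this replaces the restore test in the last bullet; the function tells you what to run, not whether it works.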
Closing
Cloud versus local is a false choice. The right frame is cloud and local, in a mix that matches your data and your bandwidth. For most people that is both tiers, running quietly, with the local tier doing the fast work and the cloud tier doing the durable work. If this guide has moved you from “which one” to “what shape,” it has done its job.
Start with the complete guide to Mac backup in 2026 for end-to-end setup, or jump to the cost calculator to run the numbers for your library. Either way, the best backup is the one you start today.