
The Software Is Why Your Hardware Feels Old

Source: lobsters

Alex Chan’s recent piece on dreaming of a ten-year computer puts words to something a lot of people have felt without articulating it clearly: the hardware in a five-year-old laptop is usually fine, and the reasons we feel compelled to replace it are almost entirely software-driven.

This is worth sitting with for a moment, because the framing matters. When we talk about computers “wearing out,” we tend to conflate a few distinct problems that have different causes and different solutions.

What Actually Ends a Computer’s Useful Life

Physical hardware failure is a real thing, but it accounts for far fewer premature retirements than software churn does. Batteries degrade, SSDs have finite write cycles, thermal paste dries out. These are engineering constraints. Most of them are also fixable, given access to parts and the willingness to spend an afternoon.

The Framework Laptop was built on exactly this premise. It launched in 2021 with a modular mainboard design specifically so users could swap CPUs, upgrade RAM, and replace any component without buying a new machine. Framework has since shipped multiple mainboard generations and maintained backward compatibility with the original chassis. That’s a deliberate engineering and business decision, not a natural state of affairs.

For most laptops, though, the physical repairability story is grim. Apple’s M-series MacBook Pros integrate RAM into the SoC package and solder storage to the logic board, a design that delivers the memory bandwidth that makes Apple Silicon fast. You cannot upgrade either after purchase. iFixit consistently gives recent MacBook Pros repairability scores of 1 or 2 out of 10. This is a deliberate trade-off: higher performance and thinner profiles in exchange for a finite hardware lifespan.

But even that trade-off would be acceptable if the software side held up its end. It doesn’t.

The OS Support Cliff

Apple typically supports Macs for about seven years from original release. After that, macOS updates stop arriving. Security patches may continue slightly longer, but the machine is effectively on borrowed time. The 2017 MacBook Pro was dropped by macOS Sonoma in 2023. The hardware still works perfectly. The CPU is a quad-core Intel Core i7, capable of compiling code, running Docker containers, editing video. The reason to replace it isn’t the machine; it’s that the ecosystem has moved on and taken its security patches with it.

Microsoft’s situation is different but comparably frustrating. Windows 11 introduced a hard TPM 2.0 requirement, alongside a CPU cutoff at roughly Intel’s 8th generation, that locked out a significant percentage of otherwise capable hardware. Microsoft’s stated reason was security improvement through hardware attestation. The practical result was that a 2017 Dell with a perfectly functional i7-7700 processor could not officially run Windows 11. Windows 10 reaches end of life in October 2025, creating a forced migration cliff for millions of machines.
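Whether a given machine clears the TPM 2.0 bar is easy to check without Microsoft’s tooling. On Linux, recent kernels expose the TPM major version under sysfs; a minimal sketch (the sysfs path is the standard location on modern kernels, but the helper name is mine):

```python
from pathlib import Path

# Standard sysfs location on recent Linux kernels. Older kernels, or
# machines with no TPM at all, simply won't have this file.
TPM_VERSION_PATH = "/sys/class/tpm/tpm0/tpm_version_major"

def has_tpm2(version_path: str = TPM_VERSION_PATH) -> bool:
    """Return True if the machine exposes a TPM 2.0 device via sysfs."""
    p = Path(version_path)
    if not p.exists():
        return False  # no TPM, or a kernel too old to report the version
    return p.read_text().strip() == "2"

if __name__ == "__main__":
    print("TPM 2.0 present:", has_tpm2())
```

A TPM 1.2 chip, which some pre-2016 machines carry, reports `1` here and still fails the Windows 11 check.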

These decisions are not arbitrary. OS vendors argue that dropping older hardware lets them use modern CPU features, improve security foundations, and stop testing against an ever-expanding matrix of hardware configurations. These are real engineering considerations. They are also convenient business decisions for companies whose revenue depends on hardware sales.

Software Bloat Is Its Own Obsolescence Vector

Even within its supported window, a machine can be rendered functionally obsolete by software that grows faster than hardware ages. Electron applications deserve specific mention here. Slack’s desktop client is an Electron wrapper around a web application. It uses somewhere between 300MB and 1GB of RAM in typical use, depending on workspace size and how many channels you have open. On a mid-range 2016 laptop with 4GB of RAM, the high end of that range would have been a quarter of total memory. Electron’s architecture means each application bundles its own copy of Chromium, adding roughly 200MB of disk overhead and a full browser rendering engine to what is functionally a chat client.

This pattern appears across the developer tooling ecosystem. VS Code is Electron. Discord is Electron. Figma’s desktop app is Electron. Each one is individually defensible. Collectively, they create a baseline memory pressure that makes 8GB of RAM feel inadequate on a machine where 8GB was plenty four years ago.
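One way to see this baseline memory pressure is to sum the resident set size of each Electron app’s process tree. A minimal sketch that parses `ps -eo comm,rss` output (process names and the exact Slack binary name vary by platform; the helper name is mine):

```python
import subprocess

def total_rss_mb(ps_output: str, name: str) -> float:
    """Sum RSS (in MB) of all processes whose command name contains `name`.

    Expects `ps -eo comm,rss` output: command name, then RSS in kilobytes.
    """
    total_kb = 0
    for line in ps_output.splitlines():
        parts = line.split()
        if len(parts) >= 2 and name.lower() in parts[0].lower():
            try:
                total_kb += int(parts[-1])
            except ValueError:
                continue  # skip the header line
    return total_kb / 1024

if __name__ == "__main__":
    out = subprocess.run(["ps", "-eo", "comm,rss"],
                         capture_output=True, text=True).stdout
    for app in ("slack", "code", "discord"):
        print(f"{app}: {total_rss_mb(out, app):.0f} MB")
```

Running something like this with a few Electron apps open makes the aggregate cost concrete in a way no single app’s number does.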

Browsers compound this. Chrome’s memory usage per tab has grown substantially over the last decade as web pages have grown more complex and JavaScript heavier. A browser session with twenty tabs open can easily consume 3-4GB. The hardware hasn’t changed; the software expectations have.

Linux as the Escape Hatch

The clearest path to a ten-year computer today runs through Linux. Debian’s long-term support policy provides security updates for five years after a stable release, with extended LTS coverage pushing that to ten years for some packages. A machine running Debian stable receives security patches long after Apple or Microsoft would have abandoned it.

More importantly, Linux distributions like Debian, Void, and Alpine are genuinely lightweight. A fresh Debian install with XFCE runs comfortably in 512MB of RAM. A 2012 ThinkPad X230, which costs around $50-100 used, is a productive development machine running Debian. It supports 16GB of RAM via standard SO-DIMMs. Its battery is user-replaceable. Its keyboard is, by most accounts, better than anything shipping on current ThinkPad consumer lines. It will continue receiving security patches for years.

This isn’t nostalgia for old hardware. It’s a recognition that the software stack determines lifespan more than the transistors do.

The catch is that Linux requires either luck (hardware that works out of the box) or effort (driver troubleshooting, configuration). For mainstream consumers, it remains a niche path. The compatibility situation has improved substantially, particularly for common ThinkPad and Dell Latitude hardware, but the gap from a managed macOS or Windows experience to a self-maintained Linux system is real.

What Ten Years Actually Requires

A genuine ten-year computer is achievable today in hardware terms. A Framework 13 purchased now should remain physically capable a decade from now. The mainboard swap path exists if you want more CPU performance. The storage and RAM are user-upgradeable.

The software side requires commitments that the industry has not been willing to make at scale:

OS vendors would need to decouple security updates from feature releases and extend support windows beyond five to seven years. The EU’s Right to Repair directive, which came into force in 2024, covers physical repair access but does not mandate software support longevity. That remains unregulated.

Application developers would need to treat memory and CPU usage as costs rather than infinite resources. Electron is a productivity choice that externalizes its cost onto users’ hardware. Native applications using platform APIs are harder to write and harder to ship cross-platform, but they consume dramatically fewer resources. A native macOS application written in Swift using AppKit will use a fraction of the memory that its Electron equivalent would.

Hardware manufacturers would need to prioritize repairability and component availability. This conflicts with thinness, with integration (Apple’s Unified Memory Architecture trades upgradability for bandwidth), and with the replacement revenue that spare-parts sales generate.

None of these are impossible. The Open Source Hardware Association certifies designs that meet repairability and documentation standards. Framework has demonstrated a commercial market for repairable laptops. Debian has maintained decade-long software support as a policy for years. These are existence proofs, not dreams.

The Environmental Argument That Should Win

The most underweighted factor in this conversation is environmental cost. Manufacturing a laptop generates a substantial carbon footprint before it ever ships. The Dell Latitude 7400’s lifecycle analysis puts embodied carbon at roughly 300kg CO2e, the majority of it in manufacturing. Extending a laptop’s useful life from four years to ten reduces that manufacturing burden by more than half on a per-year basis.
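The amortization is simple enough to check. Taking the ~300kg CO2e embodied figure from the Latitude 7400 lifecycle analysis (the helper below is purely illustrative):

```python
def embodied_carbon_per_year(embodied_kg: float, lifespan_years: float) -> float:
    """Amortize a laptop's manufacturing footprint over its useful life."""
    return embodied_kg / lifespan_years

four_yr = embodied_carbon_per_year(300, 4)   # 75.0 kg CO2e per year
ten_yr = embodied_carbon_per_year(300, 10)   # 30.0 kg CO2e per year
reduction = 1 - ten_yr / four_yr             # 0.6, i.e. 60% less per year

print(f"{four_yr:.0f} kg/yr over 4 years, {ten_yr:.0f} kg/yr over 10 years "
      f"({reduction:.0%} less per year)")
```

That 60% per-year drop is where “more than half” comes from, and it holds for any embodied-carbon figure, not just this model’s.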

This argument is strong enough that it should drive policy. Software support timelines that force hardware replacement are, at scale, an environmental externality. The electricity saved by a more efficient new processor does not offset the manufacturing footprint of replacing working hardware prematurely.

The ten-year computer is a reasonable ask. The hardware is ready. The open-source software ecosystem can deliver it. What’s missing is the mainstream commercial will to treat longevity as a feature rather than a bug.
