The future of 32-bit Linux support

An interesting article was posted on LWN.net about Linux support for 32-bit systems: https://lwn.net/SubscriberLink/1035727/4837b0d3dccf1cbb/

"Arnd Bergmann started his Open Source Summit Europe 2025 talk with a clear statement of position: 32-bit systems are obsolete when it comes to use in any sort of new products."

This statement doesn't make much sense for embedded developers who plan to use no more than 64-128MB of RAM: systems that use only a tiny fraction of the Linux kernel and its drivers.

"Currently the kernel is able to support 32-bit systems with up to 16GB of installed memory. Such systems are exceedingly rare, though, and support for them will be going away soon. There are a few 4GB systems out there, including some Chromebooks. Systems with 2GB are a bit more common. Even these systems, he said, are "a bit silly" since the memory costs more than the CPU does. There are some use cases for such systems, though. Most 32-bit systems now have less than 1GB of installed memory. The kernel, soon, will not try to support systems with more than 4GB." 

This is another odd statement, and it doesn't seem to matter much. Memory once cost far more than it does today (before the 2000s), so if it now costs more than the CPU, that's not a big deal when the premium buys something else, such as legacy support or lower power consumption.

Edited on 12/27/2025:

From: https://infosec.exchange/@JessTheUnstill/115786136251963231

"We'll just fork it" is privileged mindset. It means you think you can gather enough clout and like minded people to put up a bunch of unplanned work and time and passion to make a whole new project despite the old one still working "well enough".

The only problem with this otherwise understandable criticism is that 64-bit can itself be considered a fork in some ways, and the people staying on 32-bit aren't exactly the ones doing the forking.

When the increase in energy consumption isn't significant, it makes sense to use 64-bit addressing and architecture. But on a 1GB RAM machine, 64-bit might use 100MB more memory for the same workload (while possibly executing it more efficiently). I have an Atom N450 netbook from 2011 on which I've run both 32-bit and 64-bit Windows 7 and 10, and idle RAM usage was lower on 32-bit.

The systems I'd like to develop for may have just 16-64MB. What happens when one needs to design a microcontroller or application processor around 32MB, and a 64-bit ISA now requires 40MB? If you're Apple and have $100 billion in liquid cash, you can afford a 2nm chip, and that L2 cache doesn't cost much more power. But if you're a small startup, 32MB of 22nm RAM might use barely 40mW, while 40MB of 22nm RAM might use 50mW. That's going to mean fewer hours of battery life. It may not look dramatic from afar, but up close it makes a real difference on IoT devices (and on consumer electronics like laptops).

TL/DR: We still need 32-bit to save ~10mW on some new chip designs' TDPs for at least the next three decades (unless you're Apple or Intel, with free or low-cost access to a leading-edge PDK).

The same could be said of 8-bit and 16-bit microcontrollers, but at some point developers realized that a 16MB address space was too limiting and that 32-bit was more future-proof. There will likely still be new 32-bit microcontrollers in the year 2100, because it makes no sense to use a chip with an 8GB address space for a simple sensor or appliance. You might be thinking that any startup will be able to afford a 2nm chip by 2100, and that might be true, if other architectures such as quantum or photonic computing haven't been adopted instead. But consider that carburetors are over 150 years old and still have uses at a small scale.

That might not be the best example, but the fact remains that wider pointers use more space, and in hard real-time (RT) environments, 8-bit and 16-bit microcontrollers with smaller instruction sets can complete their operations sooner. That can benefit "time to complete" windows when scheduling tasks, since there is less latency from loading new instruction cache lines.

 

Sources:

https://qr.ae/pC9Yse

https://www.stata.com/support/faqs/windows/32-bit-versus-64-bit-performance/

 "The question is “Is a 64-bit operating system twice as fast as a 32-bit OS?

No. It’s almost always slower if either OS can satisfy the same requirements on the same hardware - largely because storing memory references (pointers) will require twice as much memory. (Not all the work the CPU does involves storing and retrieving such memory addresses but a lot does.)

But a 64-bit operating system (on hardware with more than 4GB of RAM) solves so many more problems than a 32-bit OS can - and badly written active web pages running too much javascript in a browser is a big and perennial problem."

Edited on 1/3/2026:

Linux developers who say everyone should upgrade to 64-bit (for every new build) are like Ford or GM saying they will stop manufacturing cars, SUVs, vans, and pickup trucks because only an 18-wheeler can fit everyone and everything, except wind turbines.