I think there is value in seeing how much you can do with limited hardware resources.
Hear me out.
Yes, we can just keep adding more money and more resources: a bigger, faster CPU, more RAM, and so on.
(To be honest, that's typically what I do.)
But there gets to be a tipping point where, given the same amount of resources, I can do more.
Why spend more money if I don't have to? Admittedly, 32-bit CPUs are getting long in the tooth.
But cars, for example, have smaller engines on the whole than they did in the 70s, yet they are more
powerful now; some 6-cylinder cars make more power than 8-cylinder engines of the past. They get better gas
mileage and still make more power. Some do this at the expense of longevity: supercharged and turbocharged
engines usually have a shorter lifespan than naturally aspirated ones.
Now, 32-bit support may not be around much longer; I don't know. Perhaps there will always be a niche market.
But what about embedded devices, smart phones, smart TVs, and smart refrigerators? Do they need huge
resources? Smart phones in particular always have battery life problems.
But do I need a 32-core Ryzen or i9 with 128GB of RAM? What if the efficiency of the kernel and code had
improved so much over the years that I could do the same thing on an i5 CPU with 16GB of RAM?
This has happened to some extent. Code has gotten more efficient. But I remember when early versions of MacOS,
Windows, and Slackware Linux shipped on a handful of floppy disks. Now none of these fit on a CD-ROM anymore; some
won't even fit on a Blu-ray disc. Do these OSes really do that much more? Or are they just bloated code from lazy programmers?
Having said all that, I see that IBM has a development version of a 128-bit CPU. They are working with CERN and NASA
to develop a 3-D map of the Milky Way galaxy with spatial data: hundreds of billions of objects, with billions of billions
of vector points between the objects. This will involve thousands of zettabytes of data. So we always need to move forward,
but do we need to do it at the cost of efficiency?
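The scale here can be sanity-checked with a quick back-of-envelope calculation. This is just a sketch with my own assumed numbers (10^11 objects, a full pairwise vector graph, 24 bytes per vector), not figures from any actual IBM, CERN, or NASA project:

```python
# Back-of-envelope scale of a full pairwise map of the galaxy.
# All numbers below are illustrative assumptions, not project figures.
N = 100_000_000_000              # assume "hundreds of billions" means ~10^11 objects
pairs = N * (N - 1) // 2         # one vector per object pair: roughly 5 x 10^21,
                                 # i.e. literally "billions of billions" of vectors
bytes_per_vector = 24            # assume three 64-bit floats (x, y, z) per vector
total_bytes = pairs * bytes_per_vector
zettabytes = total_bytes / 1e21  # 1 ZB = 10^21 bytes

print(f"{pairs:.2e} vectors, roughly {zettabytes:.0f} ZB of raw coordinates")
```

Even under these minimal assumptions the raw pairwise data lands in the hundreds-of-zettabytes range; add any per-vector metadata and the thousands-of-zettabytes figure is plausible, which is exactly why efficiency can't be an afterthought at that scale.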