Linux: A solo developer is attempting to clean up 30 years of mess

Condobloke

Well-Known Member

Eliminating clutter from the Linux code could make it faster



A senior Linux developer believes the platform can be a lot faster and more efficient - if its source code were lighter.

To make this happen, Ingo Molnar has announced the "Fast Kernel Headers" project, an attempt to clean up and rework the Linux kernel's header hierarchy and header dependencies.

Linux apparently contains around 10,000 main .h header files within the include/ and arch/*/include hierarchies. Molnar says that over the years, these have "grown into a complicated & painful set of cross-dependencies we are affectionately calling 'Dependency Hell'."
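
To get a rough feel for what that fan-out looks like (a generic illustration only; this file and command are not from Molnar's patch series), you can ask GCC to print every header it opens via its -H option. Even a trivial userspace program drags in dozens of transitive headers, and a kernel translation unit can pull in thousands:

/* includes.c - one innocent-looking #include, many headers.
 *
 * GCC's -H option prints every header it opens to stderr, indented
 * by inclusion depth, so counting those lines shows the fan-out:
 *
 *     gcc -fsyntax-only -H includes.c 2>&1 | grep -c '^\.'
 */
#include <stdio.h>   /* a single include... */

int main(void)
{
    puts("one include, many headers");   /* ...yet dozens of headers were opened */
    return 0;
}

On a typical glibc system that count lands somewhere in the tens for this one file; Molnar's project attacks the same kind of explosion inside include/ and arch/*/include.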
 


 
I think browsers have become slower due to all of the security measures that are necessary nowadays.

I can't say if Linux has gotten slower as I haven't used Linux long enough.

I can say certain Linux distros come with a lot of preinstalled software that most users will probably never need or use, and I understand why.
 
It seems very odd that this should be a one-person operation. You would think it should be an ongoing team effort.

Does anyone know how many hundreds of developers currently work on the Linux kernel?
Then think of all the thousands of packages that depend on the kernel and its libraries.
It would take a hundred years for a single person to rewrite all of that.

A senior Linux developer believes the platform can be a lot faster and more efficient - if its source code were lighter.

Probably true, but I have heard this dozens of times. Every time a "new and improved OS" comes out.
This was the promise behind NextOS, and even Android originally. But then you have to integrate with so much stuff.
You have to add security stuff. By the time you get done with all that, it's just as big as the other OSes.
So I admit I'm skeptical.
 
@arochester said: It seems very odd that this should be a one-person operation. You would think it should be an ongoing team effort.

I agree

What, then, is this one person's motive?
 
I think it has gotten a bit slower in general. I don't think it's too bad, but I'm all for more speed! :D
More power to him.

As far as it being a solo project: all code presented for inclusion in the OS is supposed to be vetted by other members of the team. I do have to agree that it should be an ongoing process.

Just my 2¢
 
Probably true, but I have heard this dozens of times. Every time a "new and improved OS" comes out.
This was the promise behind NextOS, and even Android originally. But then you have to integrate with so much stuff.
You have to add security stuff. By the time you get done with all that, it's just as big as the other OSes.
So I admit I'm skeptical.

Now that I know that, that's a good counter-argument.
 
I can say certain Linux distros come with a lot of installed software that most user's probably will never need or ever use and I understand why.

Correct me if I'm wrong, but I believe that up until the late '80s/early '90s, every computer program came on 8-inch floppies rather than pre-installed on the computer itself, and you had to insert the disk into the drive in order to use it.
 
I can say certain Linux distros come with a lot of preinstalled software that most users will probably never need or use, and I understand why.
Correct me if I'm wrong, but I believe that up until the late '80s/early '90s, every computer program came on 8-inch floppies rather than pre-installed on the computer itself, and you had to insert the disk into the drive in order to use it.
I remember the 8-inch and 5¼-inch floppies used for storage, software, etc.

My point was that there's so much additional software installed by default that I and many other users will never use, need, look at, or open.

That being the case, it should not be installed by default; what users want and use should be installed by the user, when and if needed.
 
Correct me if I'm wrong, but I believe that up until the late '80s/early '90s, every computer program came on 8-inch floppies rather than pre-installed on the computer itself, and you had to insert the disk into the drive in order to use it.

Actually, some came in a book and were painstakingly keyed in by my dad, typing with two fingers. And it wasn't just those big silly actually-floppy disks (we used those in school). My Spectrum took little (hard) floppy discs and also tapes. I mostly got games on tapes. They sounded like a fax machine (or modem) when played back.
 
What can be faster than half a second of startup for applications and about 3 seconds to boot from zero to a fully usable desktop environment? Or another half a second to close all apps and shut down? :D
 
What can be faster than half a second of startup for applications and about 3 seconds to boot from zero to a fully usable desktop environment? Or another half a second to close all apps and shut down?

Back in the old days we had Commodore 64s; they had the OS in ROM. The whole system had only 64K of RAM and an 8-bit CPU. But there was no wait: the OS was loaded the moment you turned the computer on. :D

The downside was... you couldn't update the OS. But hackers couldn't either. :D
 
8" floppies., those were the days! LOL I was programing NC (punch tape) mills and lathes in the early '80s. The computer had two 8" single side, single density floppy drives. They held a whopping 160Kb.

The mill would handle parts up to 20'x16' the lathe cut up to 22"max dia, 16"(over the cross slide) and 16' long. We also had a CNC Bridgeport mill for the smaller stuff.

The only way I could check the cutter path before running the job was on an 8 1/2 by 11" HP pen plotter!

The scary part was, I had to run the job once it was programmed!
Sorry for the off topic dribble.
 
A lot of projects/initiatives start with just a single person. The goal is, or at least the hope is, that more people will share your vision AND be willing to lend a hand.

The kernel itself was once a one-person project.
 
What can be faster than half a second of startup for applications and about 3 seconds to boot from zero to a fully usable desktop environment? Or another half a second to close all apps and shut down? :D

I suspect that those speeds are the result of advances in hardware (SSDs, M.2 drives, etc.), not so much improved Linux kernel headers and the like.
 
I suspect that those speeds are the result of advances in hardware (SSDs, M.2 drives, etc.), not so much improved Linux kernel headers and the like.
If I were still using Mint, you would be right. Mint loads considerably faster from an SSD (15 seconds from an HDD, 6 seconds from an SSD). But I have been on Arch for 2.5 years. At one point I had it installed on an HDD (for a short period), and the speeds were the same as when it is on an SSD.

While startup faster than 3 seconds does sound appealing, which is what this developer is after, I doubt it's possible unless the user has a supercomputer.
 
