A while back there was some debate about the Linux kernel dropping support for some very old GPUs. (I can’t remember the exact models, but they were roughly from the late ’90s.)
It spurred a lot of discussion on how many years of hardware support is reasonable to expect.
I would like to hear y’all’s views on this. What do you think is reasonable?
The fact that some people were mad that their 25 year old GPU wouldn’t be officially supported by the latest Linux kernel seemed pretty silly to me. At that point, the machine is a vintage piece of tech history. Valuable in its own right, and very cool to keep alive, but I don’t think it’s unreasonable for the devs to drop it after two and a half decades.
I think for me, a 10-year minimum seems reasonable.
And obviously, much of this work is for little to no pay, so love and gratitude to all the devs that help keep this incredible community and ecosystem alive!
And don’t forget to pay for your free software!!!
Devil’s Advocate: just because something can be preserved doesn’t mean it’s worth preserving. For all the golden games of the 80s and 90s, there were even more turds, and the same goes for other software.
Really though, the issue comes down to kernel bloat and volunteer support. Imagine a kernel bigger than Ubuntu itself, simply because it supports decades of hardware that only a tiny, shrinking minority uses, maintained by an equally shrinking number of people willing to keep patching it so it stays up to date. It’s untenable.
I think you might have a different understanding of support than most. Nobody’s saying the code to run this 30-year-old hardware should be enabled by default, nor that distros should ship it by default.
That’s very different from whether the code stays in the kernel source tree in case someone wants to compile a custom kernel that does support it. Source code that’s disabled at build time doesn’t add bloat to running systems.
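To make that concrete: kernel drivers are gated behind Kconfig options, which the build turns into CONFIG_* preprocessor symbols, and code behind an unset option is never compiled into the image. Here’s a minimal standalone sketch of that mechanism (CONFIG_OLD_GPU_EXAMPLE is a made-up name for illustration, not a real kernel option):

```c
/* Standalone sketch of the conditional-compilation mechanism the kernel
 * uses (Kconfig options become CONFIG_* preprocessor symbols).
 * CONFIG_OLD_GPU_EXAMPLE is a made-up name, not a real kernel option. */
#include <stdio.h>

#ifdef CONFIG_OLD_GPU_EXAMPLE
/* Only compiled when the option is defined; with it off, none of this
 * code exists in the resulting binary at all. */
static void old_gpu_init(void)
{
	printf("initialising legacy GPU support\n");
}
#endif

int main(void)
{
#ifdef CONFIG_OLD_GPU_EXAMPLE
	old_gpu_init();
#else
	printf("legacy GPU support not built in\n");
#endif
	return 0;
}
```

Build it with plain `gcc demo.c` and the legacy path simply isn’t there; add `-DCONFIG_OLD_GPU_EXAMPLE` and it is. In the real kernel the same effect comes from enabling or leaving out the option in your .config, so a distro kernel that doesn’t enable an ancient driver pays nothing for the fact that the source still exists in the tree.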