A while back there was some debate about the Linux kernel dropping support for some very old GPUs. (I can’t remember the exact models, but they were roughly from the late ’90s.)
It spurred a lot of discussion on how many years of hardware support is reasonable to expect.
I would like to hear y’all’s views on this. What do you think is reasonable?
The fact that some people were mad that their 25-year-old GPU wouldn’t be officially supported by the latest Linux kernel seemed pretty silly to me. At that point, the machine is a vintage piece of tech history. Valuable in its own right, and very cool to keep alive, but I don’t think it’s unreasonable for the devs to drop it after two and a half decades.
I think for me, a 10 year minimum seems reasonable.
And obviously, much of this work is for little to no pay, so love and gratitude to all the devs that help keep this incredible community and ecosystem alive!
And don’t forget to pay for your free software!!!
As long as someone is willing and able to maintain it.
It’s open source. All the work is done either by volunteers or by corporate sponsors. If it’s worth it to you to keep a GPU from the ’90s running on modern kernels and you can submit patches to keep up with API changes, then there’s no reason to remove it. The problem isn’t that the hardware is old, it’s that people don’t have the time to do the maintenance.
However, when it comes to any proprietary hardware/software, the solution is simple. All companies should be required by law to open source all software and drivers, regardless of age, when they discontinue support; including server-side code if the product depends on a server (massive for gaming).
Don’t disagree with you, but yeah, good luck with that.
It’s not that wild of a concept; it’s basically just an extension of how copyright and patents expire. You should have to prove that your IP is actually in use for it to remain valid, otherwise you forfeit it. Honestly, it’s more so to prevent patent/copyright trolling than for right-to-repair reasons.
> What do you think is reasonable?
As long as possible, unless nobody uses it for cases that need any security (daily driver, server, enterprise, etc.). If you drop support, you are lazy and contributing to e-waste. In some cases it can be too difficult to support, but “too difficult” has a lot of meanings, most of which are wrong.
> I think for me, a 10 year minimum seems reasonable.
That’s really not enough. The GTX 1080 is an almost 10-year-old card, but it’s still very competitive. Most of my friends even use 750s or similar-age hardware. And for software, any major update just makes it more enshittified now lol.
In principle, I don’t disagree.
Problem is, supporting everything requires work and effort that isn’t funded by a corporation or anything.
Perhaps we should start building things with long-term support in mind, and not just churn out the cheapest shit we can manage.
Like, just look at modern laptops: most of them are absolute dogshit in terms of repairability, and then you have the Framework, which you can straight up buy as a kit to assemble yourself.
Making things easy to maintain is clearly doable, not even that hard.
I am literally talking about software support for legacy hardware, not the hardware itself.
@Swedneck @breadsmasher It’s wild how most modern laptops are a nightmare for repairs. Framework feels like a breath of fresh air. Being able to buy it as a kit and put it together yourself is just so cool!! It shows that making things easy to maintain is not only possible, but it’s not even that difficult…
Hardware support is usually funded well enough, or has enough human resources, for it not to be a big problem imo. It’s ok to drop 30-year-old stuff that nobody uses, but dropping something just because rich people have hardware that’s a few years newer is bad.
Yeah, entirely missing my point.
I think it should be supported for a decade and then open-sourced so that it can be archived and maintained by those who care.
I feel like with libre/open source software, this is a lot less of a problem. So long as it is still possible to add it back by messing around under the hood, we are pretty much fine with the “main” branch of some software dropping legacy support?
It’d be unreasonable to expect the devs of anything to keep supporting things that are over 20 years old.
And like, if you’re using 25-year-old kit at this point, you’re either a hobbyist collector of vintage stuff, OR an enterprise with mission-critical assets on old legacy hardware/software. In either of those scenarios, “figure out how to go under the hood and fix stuff” (or in the enterprise’s case, “hire someone who does that for you”) is not an unreasonable expectation to have.
The smelly part is of course proprietary software and hardware, where “dropped official support” might as well be the signing of a death warrant. We desperately need “right to repair and maintenance” regulation in every country in the world.
I’ll add that at this point, if you’re a hobbyist collector of vintage computer hardware, and you find satisfaction in making that old Compy 386 run like it’s modern hardware, you should know how to compile your own kernel.
Like, it just seems prudent, given that it’s unreasonable to expect a “universal” kernel to simply grow and never prune anything (avoiding a giant kernel was part of the rationale, iirc), and there’s plenty of documentation out there on how to do it. If you aren’t going to run the same hardware as 95% of your peers, it’s your responsibility to make sure your hardware works.
Yeah I mean
Hobbyist collectors of typewriters (I know because my father is one) and cars (one of my friends is one) all have to learn how to maintain and service their own stuff because businesses that did that for them have all but disappeared. It’s considered part and parcel of the hobby.
It’d be nuts to expect it to be any different for computer collectors. Compile your own kernels, diagnose your own problems, fix your own shit. That’s what you do for a hobby. :P
If you’re running something that old, then it is by choice anyway; hardware gets more expensive after a certain age, and you definitely won’t be getting a (functional) ’90s computer for cheap.
My current laptop is 9 years old; I recently replaced the thermal paste and added new RAM. It should definitely be more than 10 years, as my laptop is totally usable for everyday tasks like
- playing music
- playing movies
- browsing the web
- Org-mode
My current laptop is 7 years old, and I love it!
I still even play games with it. Not the newest stuff, but I have such a huge backlog of indies and not-so-new games that I could play for 15 years…
If someone told me it would be garbage in 3 years… I would hit them with the laptop. It’s a T470p; their skull is the part that would break.
As long as possible, as long as someone is using it, as long as someone can keep maintaining it.
If the main developer team can no longer maintain it, then open-source it, put it in the public domain, and set it free. Ditto for firmware and hardware documentation.
Companies oughta be forced to release all information they have on hardware they no longer maintain and disable any vendor-lock crap once warranty ends.
Yes, hardware gets old, and in the computer realm that usually means it’s rendered obsolete, but that doesn’t mean it doesn’t have its uses.
I’d say more than 10 years now. Computers evolved a lot more between the ’90s and the ’00s than between the ’00s and now. My old laptop is 10 years old and it still runs Linux perfectly, and I hope it will keep running for years.
The problem is more hardware obsolescence: it’s an Acer, so every part of it is slowly falling apart (keyboard, screen, battery), and OEM parts are impossible to find after all those years. I guess this problem is less important for desktops.
@Courantdair @Lettuceeatlettuce
Running an Acer laptop that is 9-10 years old. Everything works fine, but last summer my hard drive crashed. 8 GB of RAM and 2 SSDs of 2 TB each, and it runs smoothly. I also have no plans to get a new computer.
I think it is the kids who play hardware-hungry games who drive the evolution of computers.
Important to note how the design of the hardware enables this: if you don’t have replaceable parts, then the entire thing dies when one component does.
It’s really the one major complaint I have about my Pixel 3: the lack of an SD card slot immediately puts a bit of a lifespan on the entire device.
@Courantdair @Lettuceeatlettuce
Yeah, that’s one reason I’ve never cared for laptops.
I snipe used USFF (ultra small form factor) machines off eBay… It’s nice being able to eventually scale into upgrades, swapping out the original i3s for an i5 or i7 when I see good deals.
Although the mechanical keyboard on my main PC started to have issues… parts just arrived today so I can repair it \0/ Glad I’m not roped into some laptop to try to maintain again.
I do not think that can be determined in the tech space with ‘age’ alone. Popularity, usability and performance are much more important factors.
As was already brought up in another comment, the GTX 10 series is a nice example. The GTX 1080 is, after 8 years, still a valid GPU for gaming, and the 1050 is a nice little efficient, cheap video encode engine that supports almost all modern widespread codecs and settings (except AV1).
I agree with this point: age isn’t the measure of usefulness; popularity is.
Something might be 10 years old and used by many people… and something 10 months old might no longer be used at all.
Also, just a thought: if it’s “old” it’s probably a standard too, so it probably doesn’t actually need much (relative term) effort to maintain…
I use 10-year-old hardware and it’s pretty capable on Linux.
We’ve reached a point of diminishing returns in the advance of this technology.
10 years is clearly not enough. I’d say 20 years, but I honestly don’t know how much work is involved.
I also think that preserving the history of technology isn’t given enough importance, with games disappearing, OSes becoming unusable, and stuff like that.
But Linux is clearly the model student here.
Linux is absolutely the gold standard when it comes to supporting legacy stuff.
With Windows trailing behind. At least Microsoft tries to support stuff from older versions of Windows, whereas Apple just says “**** you” every few years.
Devil’s Advocate: just because something can be preserved doesn’t mean it’s worth preserving. For all the golden games of the 80s and 90s, there were even more turds, and the same goes for other software.
Really though, the issue comes down to kernel bloat and volunteer support. Imagine having a kernel that’s bigger than Ubuntu, simply because it supports decades of hardware that only a shrinking, tiny minority uses, maintained by an equally shrinking number of people who care to patch it so it stays up to date. It’s untenable.
I think you might have a different understanding of support than most. Nobody’s saying that the code to run this 30-year-old hardware should be enabled by default, nor that distros should include it by default.
That’s very different from whether the code is in the kernel in case someone wants to compile a custom kernel that does support it. Source code that’s disabled doesn’t add bloat to running systems.
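To make that concrete, here’s a minimal sketch of how a driver gated behind a Kconfig symbol compiles away when it isn’t selected. CONFIG_DRM_ANCIENT_GPU is a made-up symbol for illustration, but IS_ENABLED() is the real kernel mechanism:

```c
/* Sketch only: CONFIG_DRM_ANCIENT_GPU is a hypothetical Kconfig symbol.
 * IS_ENABLED() expands to a compile-time constant, so when the option is
 * not set in the build config, the branch below is dead code and the built
 * kernel carries nothing for this hardware. */
#include <linux/kconfig.h>
#include <linux/errno.h>

static int ancient_gpu_init(void)
{
        if (!IS_ENABLED(CONFIG_DRM_ANCIENT_GPU))
                return -ENODEV; /* compiled out entirely when the option is off */

        /* ...legacy GPU setup would go here... */
        return 0;
}
```

In practice it’s usually even simpler: whole driver files are gated in the kbuild Makefiles (obj-$(CONFIG_FOO) += foo.o), so code for deselected hardware isn’t even compiled, let alone shipped.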
I would say for as long as the hardware remains useful. A high end laptop may still be perfectly usable in 15 years if the hardware doesn’t fail by then.
Still using a 5 year old laptop with no degradation in performance and expecting at least another 5. All I had to do was uninstall some malware that was eating up all the system resources and popping up a bunch of ads. It was called Windows. :-D
Hardware and software free from capitalism’s planned obsolescence will live as long as the community has interest.
Reminder that the Voyager 1 probe is still functional.
The thing is, Linux always gets touted as the way to save old hardware. Win 11 not supporting a bunch of perfectly good older computers is leading to a massive e-waste wave. I understand that kernel devs mostly do it for free, and resources are limited for maintaining support for hardware few use anymore, but I think having a way to viably daily drive old hardware is really important for reducing e-waste and also just saving people’s money. I don’t like buying new tech unless it’s to replace something beyond repair, i.e. not just an upgrade for the sake of upgrading.
Obviously the problem is more socially systemic than just the decisions of Linux devs. I think the release cycle of new hardware is way too quick; if it were slower, that would obviously reduce the workload for kernel devs, so hardware could be supported for longer (as they’d have less new hardware to work on supporting). And more generally, we should have a mode of production not centred around profit, so that people don’t get punished (as in, they’re not getting paid but could be compensated for their time if they worked on something else) for spending time developing kernel support for old hardware.
There’s a good argument for more modular kernels (microkernels and such). That way the driver could be kept going for decades, only updating the IPC protocol as the microkernel changes over time.
Isn’t the Linux kernel modular already? It does have modules… which drivers can be, although they tend to be in-kernel.
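For anyone curious what “drivers as modules” looks like at the source level, here’s a minimal, hypothetical out-of-tree module skeleton (the names are made up for illustration). Built as a module, it ships as a separate .ko file outside the base kernel image and costs nothing unless someone actually loads it:

```c
/* Minimal sketch of a loadable kernel module; names are hypothetical.
 * Built with obj-m, this becomes a standalone .ko that only occupies
 * memory after an explicit modprobe/insmod. */
#include <linux/init.h>
#include <linux/module.h>
#include <linux/printk.h>

static int __init legacy_demo_init(void)
{
        pr_info("legacy_demo: loaded\n");
        return 0; /* 0 = success; the module stays resident until removed */
}

static void __exit legacy_demo_exit(void)
{
        pr_info("legacy_demo: unloaded\n");
}

module_init(legacy_demo_init);
module_exit(legacy_demo_exit);

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Hypothetical demo module for the modularity discussion");
```

In-tree drivers marked “m” in the config work the same way: they get built as .ko files under /lib/modules and are only loaded on machines that actually have the hardware.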
Support should drop when we hit the sweet spot where the energy savings from running modern RISC devices over old CISC outweigh the energy cost of manufacturing replacements.
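Roughly speaking (my framing, no real figures attached): the crossover is where (P_old − P_new) × expected remaining runtime exceeds the embodied energy of manufacturing the replacement, with P being average power draw. Until then, keeping the old box running is the lower-energy option.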
Usually, my computers dropped in performance after around 10 years. They might contain parts that are a few years older by that time. So, to be able to use them further, I would suggest a minimum of 15 years.
Good point. If I know it’ll meet my needs, I’m sometimes inclined to buy tech that’s a few years old, especially if the newer version just adds cloud, AI, or something else I don’t want/need. In many cases it’s still marketed the same, so I think end-of-support dates should be clearly marked on the product itself so the consumer can make an informed choice. Intentionally bricking a device should be treated as littering, and the company should be responsible for disposal fees.
Linux is a different story because of the volunteer presence. If anything Linux should get subsidies for keeping e-waste out of landfills after the manufacturer has long abandoned the product.
My laptop is about 5 years old now and still runs as fast as the day I bought it, if not faster. I replaced the battery twice, but this thing could go another 5-10 years if I don’t drop it or spill something on it.
> If anything Linux should get subsidies for keeping e-waste out of landfills
Great idea.
I think KDE might be working on something like that, but idk.