This would be funny if it happened to Nvidia.
Hope Intel recovers from this. Imagine if Nvidia was the only consumer hardware manufacturer…
No one wants that.
Lol there was a reason Xbox 360s had a whopping 54% failure rate and every OEM was getting sued in the late 2000s for chip defects.
Isn’t the 360’s failure rate due to MS rushing to release it before the PS3?
No, it was entirely Nvidia's fault.
https://www.neogaf.com/threads/console-wars-bumpgate-the-7th-generation-the-truth-behind-rrod-and-ylod.1666164/
AFAIK the cooling was faulty or insufficient, which burned the chips out.
Tagging on here: Both the first-model PS3 and Xbox 360 were hot boxes with insufficient cooling. Both got too hot too fast for their cooling solutions to keep up, and the resulting thermal stress weakened the chips' solder joints until they eventually cracked.
Owner of an original 60GB PS3 here.
It got very hot and eventually stopped working. It was under warranty and I got an 80GB replacement for $200 cheaper, but lost out on backwards compatibility, which really sucked because I sold my PS2 to get a PS3.
Why would you want backwards compatibility? To play games you already own and like instead of buying new ones? Now now, don’t be ridiculous.
Sarcasm aside, I do wonder how technically challenging it is to keep your system backwards-compatible. I understand console games are written for specific hardware specs, but I’d assume newer hardware still understands the old instructions. It could be an OS question, but again, I’d assume they would develop the newer version on top of their old, so I don’t know why it wouldn’t support the old features anymore.
I don’t want to cynically claim that it’s only done for profit reasons, and I’m certainly out of my depth on the topic of developing an entire console system, so I want to assume there’s something I just don’t know about, but I’m curious what that might be.
It’s my understanding that backwards-compatible PS3s actually had PS2 hardware in them.
We can play PS2 and PS1 games if they are downloaded from the store, so emulation isn’t an issue. I think Sony looked at the data and saw they would make more money removing backwards compatibility, so that’s what they did.
Thankfully the PS3 was my last console before standards got even lower and they started charging an additional fee to use my internet.
Intercooler + wet towel got me about 30 minutes on Verruckt
I think the 360 failed for the same reason lots of early/mid 2000s PCs failed. They had issues with chips lifting due to the move away from leaded solder. Over time the formulas improved and we don’t see that as much anymore. At least that’s the way I recall it.
It kinda has, with Fermi, lol. The GTX 480 was… something.
Same reason too. They pushed the voltage too hard, to the point of stupidity.
Nvidia does not compete in this market though, as much as they’d like to. They do not make x86 CPUs, and frankly Intel is hard to displace since they have their own fab capacity. AMD can’t take the market themselves because there simply isn’t enough TSMC/Samsung to go around.
There’s also Intel holding the x86 patent and AMD holding the x64 patent. Those two aren’t going anywhere yet.
Actually, looks like the base patents have expired. All the extensions (SSE, AVX) are still in effect, though.
Me too, so it keeps AMD on their toes.