2.4GHz WiFi is not suitable for two big reasons: interference and low bandwidth. In any kind of suburban or city environment, and sometimes even rural, 2.4GHz WiFi will be congested with other networks, microwaves, and other appliances, causing massive speed degradation or fluctuations. The range of 2.4GHz is just too large for all the equipment that uses it in today’s world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz WiFi networks, even though only 3 at most can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.
2.4GHz, at least here in the US, only has three channels that will not interfere with each other: 1, 6, and 11. If anyone puts their network between those three channels, it knocks out both the one below and the one above; channel 3, for example, would interfere with both channels 1 and 6. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels allowing much higher throughput. 2.4GHz does allow 40MHz-wide channels, which in isolation would offer ~400Mbps, but you will never see that in the real world.
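If it helps to see why that is, here’s a quick Python sketch of the overlap math (channel centers are the standard 5MHz-spaced 2.4GHz plan; the 22MHz occupied width is the classic 802.11b figure, roughly 20MHz for OFDM):

```python
# Rough sketch: which 2.4 GHz channels actually overlap?
# Channel centers are only 5 MHz apart, but each transmission occupies
# roughly 20-22 MHz, so neighbours within ~4 channels stomp on each other.

CENTER_MHZ = {ch: 2407 + 5 * ch for ch in range(1, 14)}  # channels 1-13
WIDTH_MHZ = 22  # classic 802.11b occupied bandwidth; ~20 MHz for OFDM

def overlaps(a: int, b: int) -> bool:
    """True if the occupied bands of channels a and b intersect."""
    return abs(CENTER_MHZ[a] - CENTER_MHZ[b]) < WIDTH_MHZ

# A network parked on channel 3 collides with both 1 and 6:
print([ch for ch in (1, 6, 11) if overlaps(3, ch)])   # -> [1, 6]
# ...while 1, 6 and 11 leave each other alone:
print(overlaps(1, 6), overlaps(6, 11))                # -> False False
```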
Personally, I think OEMs should just stop including it, or at least ship it disabled by default and only expose it in an “advanced settings” area.
Edit: I am actually really surprised at how unpopular this opinion appears to be.
Yes, but that’s expensive, only part of newer WiFi standards, and almost nothing implements it. Most devices barely even support basic MIMO.
The point remains that I won’t go replace lightbulbs just so they run 5GHz WiFi. It’s dumb and pointless and generates a ton of completely unnecessary, avoidable e-waste, all to avoid using a band nobody cares about anymore.
In an ideal world, yes, everything would already be 11ax on 6GHz spectrum. But this is the real world, a world where 10-20 year old WiFi devices still connect to 2.4GHz networks, are still useful and, most importantly, still work perfectly fine. 11n chips are dirt cheap, so why should we have to add an extra 5-10 bucks to a lightbulb just so it’s on a modern WiFi standard, when all it needs to receive is an RGBA value to know what color and how bright it should be? At that point it’s an economics problem, not a tech problem. Those devices couldn’t max out 11n even if they wanted to anyway; they barely handle a web server.
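To put some numbers on how little the bulb actually needs, here’s a toy sketch; the host, port, and 4-byte framing are made up purely for illustration, since every vendor speaks its own (usually equally tiny) protocol:

```python
# Illustrative only: the entire "payload" a smart bulb needs is a handful of
# bytes. The address, port, and framing below are invented for this sketch.
import socket
import struct

def set_bulb_color(host: str, r: int, g: int, b: int, brightness: int,
                   port: int = 12345) -> None:
    """Send one 4-byte color + brightness command over UDP (hypothetical protocol)."""
    payload = struct.pack("BBBB", r, g, b, brightness)  # 4 bytes total
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

# Four bytes per command; even 2.4 GHz 11n is overkill by several orders of magnitude.
set_bulb_color("192.168.1.50", 255, 120, 0, 200)
```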