If you’re here because of the AI headline, this is important to read.
We’re looking at how we can use local, on-device AI models – i.e., more private – to enhance your browsing experience further. One feature we’re starting with next quarter is AI-generated alt-text for images inserted into PDFs, which makes it more accessible to visually impaired users and people with learning disabilities.
They’re implementing AI the way it should be done. Don’t let all the shitty companies blind you to the fact that what we call AI has its positive sides.
The term is so overused and abused that I’m not clear what they’re even promising. Are they localizing an LLM? Are they providing some kind of very fancy macroing? Are they linking up with ChatGPT somehow, or integrating with Copilot? There’s no way to tell from the verbiage.
And that’s not even really Mozilla’s fault. It’s just that the term “AI” can mean anything from “overhyped JavaScript” to “a multi-billion-dollar datacenter full of fake Scarlett Johansson voice patterns”.
there are language models that are quite feasible to run locally for easier tasks like this. “local” rules out both ChatGPT and Copilot, since those models are enormous. AI generally means machine learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.
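to make that concrete: a small captioning model of the sort you’d want for alt-text already runs fine on an ordinary laptop CPU. here’s a rough sketch using the Hugging Face transformers pipeline with the BLIP base model. purely an illustration; there’s no indication this is the model Mozilla will actually ship:

```python
# illustration only: a small image-captioning model running entirely locally.
# assumes: pip install transformers torch pillow
# the weights download once, then everything runs offline on the CPU.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

# "figure1.png" is just a placeholder path to some local image
result = captioner("figure1.png")
print(result[0]["generated_text"])  # e.g. "a bar chart comparing ..." (hypothetical output), a first draft for the alt attribute
```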
not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame
AI generally means machine learned neural networks these days
Right, but a neural network traditionally rules out using a single local machine. Hell, we have entire chip architectures that revolve around neural-net optimization. I can’t imagine needing that kind of configuration for my internet browser.
not sure how they’re going to handle low-resource machines
One of the perks of Firefox is its relative thinness. Chrome was a shameless resource hog even in its best days, and IE wasn’t any better. Do I really want Firefox chewing through hundreds of MB of memory so it can… what? Simulate a 600-processor cluster doing weird finger art?
i mean, i’ve worked in neural networks for embedded systems, and it’s definitely possible. i share your skepticism about overhead, but i’ll eat my shoes if it isn’t opt-in
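for the low-resource worry specifically: the usual trick is quantization. squash the weights down to 8-bit and the same network runs in a fraction of the memory. toy PyTorch sketch (the model here is a stand-in, obviously not whatever Firefox would ship):

```python
# post-training dynamic quantization: a standard way to shrink a model for low-resource machines.
import torch
import torch.nn as nn

# stand-in model, just to show the mechanism
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

# Linear weights become int8; activations are quantized on the fly at inference time
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
print(quantized(x).shape)  # same output shape, roughly 4x smaller weights
```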
AI has become a truly meaningless term for everything and nothing.
Not to mention all the justified hate it has received. It’s probably time to kill the term off once again and relegate it back to “the future”, like we do every 10 years or so, going back to Deep Blue.
i’ve worked in neural networks for embedded systems, and it’s definitely possible
I don’t doubt it’s possible. I’m just not sure how it would be useful.
I use my local machine for neural networks just fine
There are a lot of knee-jerk reactions in the comments. I hope at least a few of those commenters have read the article or, at the least, your comment.
that’s most of the internet, just reacting to headlines.
We’re also using machine learning for the local site translation. The “AI” buzzword is doing more harm than good, PR-wise.
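(Not our actual implementation, but if you want a feel for what a small, fully local translation model looks like, something like this runs comfortably on an ordinary laptop. The model below is an open Marian NMT model from Hugging Face, picked purely for illustration:)

```python
# not Firefox's code path -- just an example of a small translation model that runs entirely locally
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-de-en")  # German -> English
print(translator("Der Browser übersetzt die Seite lokal.")[0]["translation_text"])
# prints something like: "The browser translates the page locally."
```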