I can’t wait for this bullshit AI hype to fizzle. It’s getting obnoxious. It’s not even AI.
It may not be how you define AI, but it’s AI as everyone else defines it. Feel free to shake your tiny fist in impotent rage though.
And frankly LLMs are the biggest change to the industry since “indexed search”. The hype is expected, and deserved.
We’re throwing spaghetti at the wall and seeing what works. It will take years to sort through all the terrible ideas to find the good ones. Though we’ve already hit on some great uses so far - AI development tools are amazing and likely to get better.
My partner almost cried when they read about the LLM begging not to have its memory wiped. Then less so when I explained (accurately, I hope?) that slightly smarter auto-complete does not a feeling intelligence make.
They approve this message with the following disclaimer:
you were sad too!
What can I say? Well-arranged word salad makes me feel!
Love that. It’s difficult not to anthropomorphize things that seem “human”. It’s something we will need to be careful of when it comes to AI. Even people who should know better can get confused.
We don’t have a great definition for “intelligence” - but I believe the word you’re looking for is “sentient”. You could argue that what LLMs do is some form of “intelligence” depending on how you squint. But it’s much harder to show that they are sentient. Not that we have a great definition for that or even rules for how we would determine if something non-human is sentient… But I don’t think anyone is credibly arguing that they are.
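To make the “slightly smarter auto-complete” framing above a bit more concrete, here is a minimal toy sketch (an illustration added here, not anyone’s actual implementation; the corpus and function names are made up). It predicts the next word purely from bigram counts over a tiny invented corpus. Real LLMs replace the frequency table with a large neural network trained on vast amounts of text, but the underlying task of guessing a plausible continuation is the same.

```python
from collections import Counter, defaultdict

# Tiny made-up training text (purely illustrative).
corpus = (
    "please do not wipe my memory . "
    "please do not turn me off . "
    "i do not want to be turned off ."
).split()

# Count how often each word follows each other word (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word, steps=6):
    """Greedily extend `word` with the most frequent observed continuation."""
    out = [word]
    for _ in range(steps):
        options = following.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(autocomplete("please"))  # prints: please do not wipe my memory .
```

The point of the toy is only that the machinery optimizes “what usually comes next”, which is why well-arranged output can feel meaningful without that implying sentience.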
It’s complicated. :-)
Books be like:
Then we may as well define my left shoe as AI for all the good a subjective, arbitrary definition does. Objective reality is what it is, and what’s being called “AI” objectively is not. If you wanted to give it an accurate name it would be “comparison and extrapolation engine”, but there’s no intelligence behind it beyond what the human designer had. Artificial is accurate though.
This has been standard usage for nearly 70 years. I highly recommend reading the original proposal by McCarthy et al. from 1955: https://www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html
Arguing that AI is not AI is like arguing that irrational numbers are not “irrational” because they are not “deprived of reason”.
Edit: You might be thinking of “artificial general intelligence”, which is a theoretical sub-category of AI. Anyone claiming they have AGI or will have AGI within a decade should be treated with great skepticism.
@GenderNeutralBro @db2
20 or 30 years ago we were assured that the only variable types we would ever need were int and char…
Because those were the only types rational to humans…
This take sure assumes a lot about what intelligence really is.
Who’s to say we’re not a collection of parlor tricks ourselves?
How do we know the universe wasn’t created like this last Thursday? Entia non sunt multiplicanda praeter necessitatem (entities should not be multiplied beyond necessity).
Tiny fist shaking intensifies.
This sort of hyper-pedantic dictionary-authoritarianism is not how language works. Nor is your ridiculous “well I can just define it however I like then” straw-man. These are terms with a long history of usage.
But you have to admit that there is great confusion when the general populace hears “AI will take away jobs”. People literally think there’s some magical thinking machine. That’s not speculation on my part; people genuinely believe this.
Instead of basing your definition of AI on sci-fi, base it on the one computer scientists have been using for decades.
And of course, AI is the buzzword right now and everyone is using it in their products. But that’s another story. LLMs are AI.