It is advertised (maybe not directly, but very insistently) as a technology that will soon take many jobs.
And counting things right in front of you is a basic skill required everywhere. Counting letters in a word is just a stand-in for counting boxes of goods, or money, or kids in a group, or rows in a list on some document; it comes up in all kinds of situations. Of course people insist that AI must get this right. The word bag perhaps can't do it itself, but it can call a better tool, in this case literally one line of Python. And that is actually the topic the article touches on.
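As an aside, that one-liner is as simple as it gets, assuming the model can hand the word off to a Python interpreter (the word and letter here are just illustrative):

```python
# Count occurrences of a letter in a word with the built-in str.count.
count = "strawberry".count("r")
print(count)  # 3
```

A tool-calling setup only needs to recognize that the question is a counting task and delegate to something like this instead of predicting the answer token by token.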
People always insist that any tool must do things right. They likewise insist that people do things right.
Tools are not perfect, people are not perfect.
Thinking that LLMs must get right the things people find simple is a common mistake, and it is common because we easily treat the machine as a person, while it is only acting like one.
> Thinking that LLMs must do things right, that people find simple, is a common mistake
Show me any publicly visible figure who tries to rectify this. Everyone peddles hype; there's no Babbage anymore, as in the "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" anecdote.
People and tools that don't do things right aren't useful. They get replaced. Making do with a shitty tool might make sense economically, but not in any other way.
If you follow that reasoning, no person is useful and no tool is useful.
The little box I'm typing in right now is, compared to a lot of other interfaces, a shitty interface. That doesn't mean it isn't useful. It will probably get replaced eventually, but only with a slightly better interface.
The karma system is quite simplistic and far from perfect. I'm sure there are ways to game it. The moderators make mistakes.
That doesn't mean the karma and moderation are not useful. I hope you get my point but it's fine if we disagree as well.
It is advertised as being able to "analyze data" and "answer complex questions" [0], so I'd hope it could at least reliably determine when to use its data-analysis capabilities to answer a question.
Also, no matter what hype or marketing says: GPT is a statistical word bag with a mostly invisible middleman to give it a bias.
A car is a private transportation vehicle but companies still try to sell it as a lifestyle choice. It's still a car.