The way I see it, it's less about the technicalities of accuracy and more about the long-term human and societal problems it presents when widely adopted.
On one hand, every new technology that arrives unregulated creates a set of ethical and, in this particular case, existential issues:
- What will happen to our jobs?
- Who is held accountable when a car navigation system designed by an LLM goes haywire and causes an accident?
- What will happen to education if we kill all entry-level jobs and make technical skills redundant?
In a sense these aren't new concerns in science: we research things to make life easier, but as technology advances, critical thinking takes a hit.
So yeah, I would say people are still right to be wary and sceptical of LLMs. It's the normal reaction to disruptive technology, and one that will help us create adequate regulations to safeguard the future.