Ultimately, this is about mastery of a tool. The problem is that you can’t teach mastery.
I can’t tell someone how to drive on ice in a way they’ll truly understand. I can’t explain how specific news sources are biased, or how to think critically about them. I can’t explain how to cut wood on a table saw so the cut comes out perfectly straight. The only way to learn is through repeated use and practice.
You can tell users that an LLM can make mistakes, and many tools do, but what does making mistakes really mean? Will it give me a recipe for a cake when I ask for a cupcake? Will it tell me 14 when I ask it to add 3 and 4? Will it agree with me even when I suggest something totally wrong? And what does hallucinate mean? Does it mean it will give me a fantasy story if I ask how to change my oil filter?
We don’t mind rideshare at all.