Yeah, that's actually what I was thinking about. I have a PhD in physics, so I easily notice when ChatGPT just keeps agreeing with me even though we're on very shaky ground. But I worry about the times it does this when we're talking about stuff I'm not as knowledgeable about.

And you can see the influx of people on r/physics and the like who are convinced they've solved dark matter/quantum gravity/... because ChatGPT kept agreeing with them when they presented their ideas to it. Just recently there was a post by a guy who had essentially "rediscovered" 17th-century physics with the help of ChatGPT, but was convinced his formula would explain dark matter because ChatGPT told him so.
