
Seems like a lot of guessing. I'm not convinced AIs aren't "thinking" in the same manner we are. Eventually we'll have models trained on speech only, or on modes of expression we can't fathom. Humans have no moat.


I wonder if AIs would even decide to do things in the same way we do. Most of what humans do comes from generations of having to operate within constraints that change over time. AI gets to leapfrog those constraints in exchange for a whole different set.

Why would we assume what comes from them will even aspire to "being like humans"?

The number of reasons AIs might not add the same things to an audio version of a book (the context we're talking about in this thread) is essentially infinite. It seems vastly more likely that they won't add what the author adds than that they will.

Humanity may not have a moat, but each individual human does, especially when it comes to art, where I'd include writing.


If humans have unique capabilities individually, we would have them collectively as well. I have yet to see a single argument that any biological process can't be replicated or synthesized. Until there is such an argument, this is special pleading.

I can't say anything about an AI's aspirations, but the fact that we're imbuing them with all of our collective data means they will be skewed to perceive the world similarly to us, at least initially.


+1000



