
Is it good for one person (the writer) to save time, only for lots of other people (the readers) to have to do extra work to understand if the work is correct or hallucinated?



Is it good for one person (the writer) to ask a loaded question just to save some time on making their reasoning explicit, only for lots of other people (the readers) to have to do extra work to understand what the argument is?


> Is it good for one person (the writer) to save time, only for lots of other people (the readers) to have to do extra work to understand if the work is correct or hallucinated?

This holds true whether an LLM/AI is used or not — see substantial portions of Fox News editorial content as an example (often kernels of truth with wildly speculative or creatively interpretive baggage).

In your example, a responsible writer who uses AI will check all of the content produced to ensure it meets their standards.

Will there be irresponsible writers? Sure. There already are. AI makes it easier for them to be irresponsible, but that doesn’t really change the equation from the reader’s perspective.

I use AI daily in my work. I describe it as “AI augmentation”, but sometimes the AI is doing a lot of the time-consuming stuff. The time saved on relatively routine scut work is insane, and the quality of the end product (AI with my inputs and edits) is really good and consistent.



