Hacker News

I’ve noticed an increasing number of sites that provide an insane amount of text to answer a simple question.

They almost always follow some sort of structure where the actual answer to your question is all the way at the bottom of the page.

Most of the content appears relevant on the surface, but when you actually read it, it’s completely generic junk. Stuff that a high schooler would use to fluff an essay to hit a word limit.




Blame Google. A few years back, they decided that “topical authority” was important, and that a page targeting as many keywords as possible was “better”. A bunch of SEOs published studies showing that pages with 2,000+ words ranked higher, and then the floodgates opened, with every company fluffing up their pages with 2,000 words of BS just to appeal to Google.


"They almost always follow some sort of structure where the actual answer to your question is all the way at the bottom of the page."

You have to scroll past the adverts - that's why these pages exist. The sites are generated from templates.


Many don't even have an answer. After spending pointless paragraphs on filler, they simply conclude that they don't know. "Well, there you have it" is a common expression I see.


Lately a lot of these seem to be actually generated with LLMs. You can usually tell from the high school essay structure, often ending with a paragraph along the lines of "in conclusion there are many benefits but also many drawbacks".


Those sites predate LLMs. I often wonder how much better something like ChatGPT would perform if its training data did not include SEO spam.



