Search engines tend to produce neutral garbage, not harmful garbage (i.e. small tidbits of data buried in an ocean of SEO fluff, rather than completely incorrect facts). LLMs tend to be inaccurate because, absent knowledge supplied by the user, they will sometimes make knowledge up. It's plausible to imagine that they will cover each other's weaknesses: the search engine produces an ocean of mostly-useless data, and the LLM can find the small amount of useful data and interpret it into an answer to your question.
The problem I see with this "cover for each other" theory is that, as it stands, having a good search engine is a prerequisite for good outputs from RAG. If your search engine doesn't turn up something useful in the top 10 (which most search engines currently don't for many types of queries), then your LLM will just be summarizing the garbage that was turned up.
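To make the failure mode concrete, here's a toy sketch of the dependency: the "generator" step can only condition on whatever the retriever surfaces, so garbage in the top-k means garbage out. All names and the keyword-overlap scoring here are illustrative stand-ins, not any real search or RAG implementation.

```python
def retrieve(query: str, corpus: list[str], k: int = 3) -> list[str]:
    """Stand-in for a search engine: rank documents by naive keyword overlap."""
    terms = set(query.lower().split())
    ranked = sorted(corpus, key=lambda doc: -len(terms & set(doc.lower().split())))
    return ranked[:k]

def answer(query: str, corpus: list[str], k: int = 3) -> str:
    """Stand-in for the LLM step: it can only 'summarize' the retrieved context."""
    context = retrieve(query, corpus, k)
    return " / ".join(context)

corpus = [
    "SEO fluff: top ten best pizzas you must try this year",
    "SEO fluff: why pizza is trending on social media",
    "Dough hydration around 65 percent works well for home pizza ovens",
]

# When retrieval surfaces the useful document, the summary can be useful;
# if only the SEO fluff ranked in the top k, the summary would be fluff too.
print(answer("pizza dough hydration", corpus, k=1))
```

The point isn't the scoring function (a real engine is vastly more sophisticated); it's that nothing in the second step can recover information the first step failed to retrieve.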
Currently I do find that Perplexity works substantially better than Google for finding what I need, but it remains to be seen whether it can stay useful as a larger and larger portion of online content is just AI-generated garbage.
> Search engines tend to produce neutral garbage, not harmful garbage (i.e. small tidbits of data between an ocean of SEO fluff, rather than completely incorrect facts)
Wasn't Google's AI surfacing results about making pizza with glue and eating rocks? How is that not harmful garbage?
Then you are blissfully unaware of how much data is already being interpreted for you by computer algorithms, and how much you probably actually like it.
This comes off as condescending. As things have gotten more algorithmic over the last two decades, I've noticed a matching decrease in the accuracy and relevance of the information I seek from the systems I interact with that employ these algorithms.
Yes, you're right that there are processing algorithms behind the scenes interpreting the data for us. But you're wrong: I fucking hate it, it's made things worse, and layering more on top will not make things any better.
I don't think anyone can disagree. If you ask someone to give you an interpretation of the works of, say, Allen Ginsberg, or of the theory of relativity, and they come back with a pile of documents ordered in some fashion, you won't be satisfied because that's not what you asked for.
99.99% of all data is complete garbage and impossible for a human to sift through. Most spam email doesn't even end up in your spam folder. It gets stopped long before that.