Idly chatting with the search box doesn't strike me as the most productive use of my time.
Instant answers, or whatever they're called, already produce direct answers, and they cite sources and provide links, which is what everyone seems to think is the solution to the "LLMs make stuff up" problem.
Not to mention they're faster and cheaper to run.
The only truly practical use case I can think of is summarizing articles or drafting them, which makes more sense as a word processor or browser add-on.
People who want to get rich will tell you it's the next big thing that will revolutionize the industry.
Personally, I've been annoyed at how confidently wrong ChatGPT can be. Even when you point out the error and ask it to correct the mistake, it comes back with an answer that's even more wrong, and it frames that answer as completely, 100% correct and accurate. Because it's essentially really deep auto-complete, it's designed to generate text that sounds plausible. That isn't useful in a search context, where you want to find sources and truth.
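To illustrate what I mean by "deep auto-complete" (a toy sketch only, with a made-up hand-written table standing in for a learned model): generation is just repeatedly picking a plausible next word given what came before, and no step ever checks the result against reality.

```python
import random

# Purely illustrative: real LLMs use learned neural networks over huge
# vocabularies, not a hypothetical hand-written table like this one.
# The point is that each step picks a *plausible* next word given the
# previous one; nothing checks whether the finished sentence is true.
PLAUSIBLE_NEXT = {
    "the":       [("capital", 0.5), ("answer", 0.5)],
    "capital":   [("of", 1.0)],
    "of":        [("australia", 1.0)],
    "australia": [("is", 1.0)],
    "is":        [("sydney", 0.6), ("canberra", 0.4)],  # plausible != correct
}

def generate(prompt, max_tokens=10):
    tokens = prompt.split()
    for _ in range(max_tokens):
        choices = PLAUSIBLE_NEXT.get(tokens[-1])
        if not choices:
            break
        words, weights = zip(*choices)
        tokens.append(random.choices(words, weights=weights)[0])
    return " ".join(tokens)

print(generate("the capital"))
# Often prints "the capital of australia is sydney" -- fluent, confident, wrong.
```

Scale that idea up a few billion parameters and you get fluent, confident text with no built-in notion of whether it's correct.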
I think there are useful applications for this technology, but we should leave those to the people who understand LLMs best and keep the charlatans out of it. LLMs are really interesting and have improved by leaps and bounds... but I don't see how replacing entire institutions and processes with something that only a handful of people understand well is a great idea. It's like watering plants with Gatorade.