Be VERY careful using Kagi this way. I ended up turning off Kagi's AI features after it gave me some comically false information by misreading the search results it based its answer on. It was almost funny: when I looked at its citations, they said the opposite of what Kagi claimed, when they were relevant at all.
May I ask how you know those 999 answers were correct, and how you would have been sure to catch a mistake, misinterpretation, or hallucination in any of them?
It's not only Kagi AI; Kagi Search itself has been failing me a lot lately. I don't know what they are trying to do, but the number of queries that return zero results is impressive. I've submitted many search improvement reports on their feedback site.
Usually running `g $query` right after gives me at least some useful results (even when using double quotes, which aren't always guaranteed to work).
This happens about 200 times a day (0.04% of queries). Very painful for the user, we know. We're still trying to find the root cause (we have limited debugging capabilities since we don't store much information). It is top of mind for us.
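For scale, here's a quick back-of-envelope sketch of what those two figures imply together, assuming the 0.04% is a share of total daily queries (the comment doesn't spell that out):

```python
# Back-of-envelope check of the figures above.
# Assumption: the 0.04% failure rate is measured against total
# daily queries (not stated explicitly in the comment).
zero_result_per_day = 200
failure_rate = 0.04 / 100  # 0.04% as a fraction

implied_daily_queries = zero_result_per_day / failure_rate
print(f"Implied daily query volume: {implied_daily_queries:,.0f}")
# Implied daily query volume: 500,000
```

If that assumption holds, the numbers suggest roughly half a million queries a day, which is why a 0.04% failure rate still means hundreds of frustrated searches daily.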
Yeah, that's totally fair. I just think about all the people to whom I've had to explain LLM hallucinations, and the surprise on their faces, and this feature gives me the heebie-jeebies.
It's a very "not ready for primetime" feature.