I'm really not sure how you can argue that "AI is great at explaining code snippets" while also acknowledging that it will just give you flat-out wrong answers sometimes.
Either it's good at explaining and is right, or is bad at explaining and is wrong.
Applying "right most of the time" logic seems like a really bad fit for a tool on a reference documentation website.
Sorry if that wasn't clear, but I think MDN is not only reference documentation. I agree that for the reference part the AI shouldn't do more than try to point you to the right parts of the text. But for learning things, nice explanations, even if sometimes slightly off, can be a lot easier to digest than reference documentation.
I'm not saying it's great yet, but there is potential for having something that can hand-wave away some details like a human would when explaining something to a beginner.