>We do not yet have the learnable stream of consciousness of a co-worker in AI, but we do have this kind of oracular amount of knowledge that can be brought forward.
Unless there is some plan to give AGI agents their own mundane, human-like lives, with issues unrelated to the business problem at hand that they are supposed to solve, where will they get their serendipitous inputs from?