A): None (catastrophic). People die out, or are wiped out, as advanced machines outcompete them for all resources.
B): The boundary (hopeful). AI capable of creating new ideas is either impossible or simply too difficult to invent (it's hard to prove which), so people keep pushing that boundary themselves.
C): None (utopic). Machines do anything people would have done for society, including the creation of new things to have and/or do. However, machines don't reach the level of autonomy required for them to actively eliminate people, or decide against it because there are plenty of resources for everyone, so people have 100% leisure time (which may happen to resemble what used to be work, if the people in question enjoy the process, but is no longer necessary to society).
D): The boundary (dystopic). Machines end up being more complex than people, to the degree that people are valued less than sufficiently advanced machines and are put to work themselves rather than having robots manufactured to do the jobs.
A note on D: It's generally relatively soft sci-fi that does this, because the stories usually cast humanity's role as hard labor, which doesn't make sense. However, I could see a story in "The Thinking Machine of the Future has become so Incredibly Advanced that the Absolute Pinnacle of Human Thought is to them what Plowing Fields is to Us." Humanity as the intellectual equivalent of the plow ox (or the tractor), doing the jobs that the machines (with their much higher potential for complex thought) find beneath them and refuse to subject each other to. Possibly with the assistance of basic non-intelligent machines, the way we wouldn't try to make an ox plow a field without first affixing a plow to it.
Unless you believe in souls or some other form of dualism, machines will clearly be able, eventually, to do anything we can do.
But we're far from that point now. Anything machines can currently do is, pretty much by definition, drudgery. I'd be happy to reevaluate that statement if and when this changes.
I have no idea what the ultimate answer to that question would be. Lots of SF authors have tried to address it, coming up with answers ranging from humans always having something they can do better, to humans existing to have fun, to humans having no point at all and therefore being wiped out by the machines.
It's not clear there's anything a person can do that a machine can never do, in principle.
So then what's the point of having people?