I think we can agree that Americans have more money, or at least higher-paying jobs. Well, except for those who don't (but that's their own fault, right?)
But other than money, is the US really better at, well, anything? If you are in the US and you don't have money, what do you have?
Now sure, I'm making a broad statement, but I'd argue that outside of "we have more money" there's not a whole lot about America that's appealing.
I'll head off the obligatory "if America is so bad, why does everyone want to come here?" question by pointing out two things:
First, not everyone does. The US just has some poor neighbors, so it can seem that way.
Second, there is always some group of people who see money as the prime motivator, and hence the US as the prime destination.
I say this not to pick on the US or to say it's a terrible place, but to encourage some objective introspection. I suspect that the perception that "America is the best country on earth" is the primary stumbling block preventing America from being anywhere close to the best country on earth.