Yeah, then it's likely gonna have some latency problems. But latency will decrease with time. I'd imagine in maybe 2 to 5 years most US consumers might have the latency profile to use a system like that comfortably. Probably not now though. Especially if you're playing, say, Fortnite, against 95 other people who have their game loaded locally on a fast network. You're just asking for a frustrating day.
For these streaming services to work, you really gotta have the data center within 1000 miles of you. If they only have a service in, say, California, and you're on the east coast, well...even a round trip at the speed of light is about 30-35 ms. Add in the fact that signals in fiber only travel at about two-thirds the speed of light, processing time for routers, and delays for video compression/decompression, and you're looking at at least 60 ms of input delay that probably can't be reduced.
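Here's a back-of-the-envelope sketch of that math. The fiber slowdown factor, the route-detour multiplier, and the router/codec overheads below are all my own guesses, not measured numbers:

```python
# Rough latency budget for game streaming over long distances.
# Assumptions: fiber carries light at ~2/3 of c, and the actual
# network path is ~1.5x longer than the straight-line distance.

C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67       # typical slowdown from fiber's refractive index
ROUTE_FACTOR = 1.5        # guess: routes aren't straight lines

def round_trip_ms(straight_line_km: float) -> float:
    """Round-trip propagation delay in milliseconds."""
    path_km = straight_line_km * ROUTE_FACTOR
    one_way_s = path_km / (C_VACUUM_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# Coast-to-coast is very roughly 2,500 miles (~4,000 km):
prop = round_trip_ms(4000)
# Add guessed router processing and video encode/decode overheads:
total = prop + 10 + 20  # both numbers are assumptions, in ms
print(f"propagation: {prop:.0f} ms, total: {total:.0f} ms")
```

With those guesses, propagation alone eats roughly 60 ms round trip before you even pay for encoding the video.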
I think even 1000 miles is a bit generous, since you likely don't get networking in a straight line between your house and the data center.
In Utah I get traffic routed through Denver and a few other states east of me before hitting certain servers hosted in California.
Unless Google has figured out a way to increase the speed of light, I don't think this new service is really going to change my mind about buying dedicated hardware to play games.
I wonder if a multiplayer shooter would be playable if the game was both hosted and rendered for all players within the same data center. You already have latency to the server when playing multiplayer, so if you eliminate that by hosting the game right next to the machines rendering it, it might work well?