I don't think latency will be a huge issue; NVIDIA's GRID network already has latency lower than a console-to-TV setup... and I think that was measured on a TV.
The problem will be more what was raised above: with it all being integrated, does this mean it will need to be a monopoly on distribution? Or a completely open, freeware design...
To give a better angle on that, this is what they actually said:
Cloud gaming today is typically done on a 5 Mbps connection at 720p30Hz, and the latency can be as good as 150 milliseconds on a GeForce GRID server. This is a very similar experience to what you get with today’s game consoles over HDMI on a TV.
I'd like to stress the 'CAN BE'.
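As a side note on those numbers: 5 Mbps at 720p30 is a very thin pipe per frame, which is exactly why everything has to go through H.264. A quick sanity check (the quoted figures are NVIDIA's; the arithmetic is mine):

```python
# Sanity check on the quoted stream parameters:
# 5 Mbps at 1280x720, 30 frames per second.
BITRATE_BPS = 5_000_000
FPS = 30
WIDTH, HEIGHT = 1280, 720

bits_per_frame = BITRATE_BPS / FPS                   # ~166,667 bits
kbytes_per_frame = bits_per_frame / 8 / 1024         # ~20 KB per frame
bits_per_pixel = bits_per_frame / (WIDTH * HEIGHT)   # ~0.18 bits per pixel

# A raw 24-bit 720p frame is ~2.6 MB, so the encoder is squeezing
# each frame down by a factor of ~130 on average.
print(f"{kbytes_per_frame:.1f} KB/frame, {bits_per_pixel:.2f} bits/pixel")
```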
And you are mistaking something... they are not even reducing the latency from the server to us (which is the biggest problem for Australians, given our connections). That one is IMPOSSIBLE, as in physically impossible without some new way of encoding the information or a new transmission medium that is somehow faster than the existing ones, because the signal already travels at a large fraction of the speed of light through the fiber, and the distance sets a hard floor no clever software can get under.
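Just to put numbers on that, here's a back-of-the-envelope sketch of the physical floor. The route distances and the fiber refractive index are my own rough assumptions, not anything NVIDIA published:

```python
# Hard lower bound on round-trip latency set by light propagating
# through optical fiber. The distances are rough great-circle
# figures I picked for illustration; real fiber routes are longer.

C_VACUUM_KM_S = 299_792            # speed of light in vacuum, km/s
FIBER_INDEX = 1.47                 # typical refractive index of fiber glass
C_FIBER_KM_S = C_VACUUM_KM_S / FIBER_INDEX   # ~204,000 km/s in the glass

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time over fiber, ignoring routing
    detours, switching, queuing, and all processing."""
    return 2 * distance_km / C_FIBER_KM_S * 1000

routes = {
    "Sydney -> Melbourne (~715 km)": 715,
    "Sydney -> Singapore (~6,300 km)": 6300,
    "Sydney -> US West Coast (~12,000 km)": 12000,
}

for name, km in routes.items():
    print(f"{name}: >= {min_round_trip_ms(km):5.1f} ms round trip")
```

So a server on the US West Coast already eats roughly 118 ms of that 150 ms budget on raw physics, before a single frame has been rendered or encoded.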
What they are doing is reducing the latency that comes from the time it takes the server to actually process the frame (render, capture, encode) and send it back to the consumer as fast as possible.
Combined, these technologies allow for the efficient, low-latency rendering of games in a remote data center to be streamed in standard H.264 format to any display device with a decoder
So either way, you still need:
A. a local server equipped with this GRID near your location (as close as possible, naturally).
B. a network infrastructure capable of handling the stream.
And whatever latency comes from the distance between you and the data center (which is the real problem for us) we'll still have to absorb face first; the rough budget sketched below shows just how much.
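To make that concrete, here's a rough end-to-end budget on top of the same physical floor as above. Every stage duration except propagation is an illustrative guess on my part; NVIDIA hasn't published this kind of breakdown:

```python
# Illustrative end-to-end latency budget for a cloud-gamed frame.
# Every stage duration except propagation is my own rough guess;
# NVIDIA has not published this kind of breakdown.

def total_latency_ms(propagation_rtt_ms: float) -> float:
    input_and_uplink = 5     # controller input reaching the server (guess)
    render_and_encode = 30   # the server-side part GRID optimizes (guess)
    decode_and_display = 20  # client-side H.264 decode + display (guess)
    return input_and_uplink + render_and_encode + propagation_rtt_ms + decode_and_display

# Propagation round trips taken from the fiber sketch above.
print(f"nearby data center: ~{total_latency_ms(7.0):.0f} ms")    # ~62 ms, console-like
print(f"US-hosted server:   ~{total_latency_ms(117.7):.0f} ms")  # ~173 ms, past the quoted 150 ms
```

Shaving milliseconds off the render-and-encode stage is worth doing, but the distance term dwarfs it for anyone far from the data center, which is exactly why point A is non-negotiable for Australia.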