Bugalugs McScruffin wrote:
TRB wrote:It wouldn't even work in the USA or Europe just due to latency and the sheer size of new games.
good luck streaming a 20gb game to 100,000 people at the same time 24hrs a day.
It already works in the US and Europe. The size of the game doesn't matter. Player input is sent to a server running the game, the image is sent back to the user's device, in this case a TV.
Distance between the device and the server is the issue. The further away the server the longer it takes for the information to get there.
No, it doesn't work.
Just because they ran some demo with a server practically next door and a dedicated connection doesn't mean it somehow 'works'.
The latency is already over 100ms before the data even leaves the building.
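If anyone wants to see how quickly the milliseconds stack up, here's a rough glass-to-glass budget. This is a back-of-envelope sketch, not measurements from any real service; every stage figure and the 1,500km example distance are illustrative assumptions.

```python
# Rough input-to-display latency budget for a streamed game frame.
# All stage figures below are illustrative assumptions, not measured values.

stages_ms = {
    "controller input capture": 8,      # polling / USB or Bluetooth
    "network uplink to server": 20,     # one-way, depends heavily on distance
    "server game simulation": 16,       # one frame at 60fps
    "server render + video encode": 15,
    "network downlink to client": 20,   # one-way, video stream
    "client decode + display": 25,      # decode, buffering, TV processing
}

total = sum(stages_ms.values())
print(f"Total input-to-display latency: ~{total}ms")
# ~104ms before any congestion, wifi retries or a genuinely distant server.

# Distance alone is a hard floor: light in fibre covers roughly 200km per ms,
# so a server 1,500km away adds ~15ms of round trip before any routing overhead.
round_trip_ms = 2 * 1500 / 200
print(f"Extra round trip for a server 1,500km away: ~{round_trip_ms:.0f}ms")
```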
And this article isn't talking about keeping all the game assets client-side and just using a cloud CPU/GPU.
It's talking about having all the assets stored and rendered server-side, with the resulting images streamed back, which means everything depends on the resolution and bitrate being transmitted [i.e. there's no point having a high-fidelity game if you're getting a ****-looking picture on the other end].
To even come close to the fidelity I get in my games right now, you'd be streaming well over a gig per hour.
Let's not forget this is supposed to come after the next generation of consoles, so I'd expect it to need upwards of 5GB per hour for that kind of image quality.
That's roughly a 10-15Mb/s connection running flat out, all the time, for the hundreds of thousands or millions of users playing at any given moment.
Tell me which network, either here or in the US, can handle that on top of current usage.
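For anyone who wants to check the maths, here's the same back-of-envelope in code. The 5GB-per-hour figure and the user counts are the assumptions from above, nothing more.

```python
# Back-of-envelope bandwidth cost of video-streamed gaming.
# Per-user stream size and concurrent-user counts are assumptions from the post.

gb_per_hour = 5                                   # assumed stream size for next-gen image quality
mbps_per_user = gb_per_hour * 8 * 1000 / 3600     # GB/hr -> Mb/s (decimal units)
print(f"Per user: ~{mbps_per_user:.1f}Mb/s sustained")   # ~11.1Mb/s

for concurrent_users in (100_000, 1_000_000):
    aggregate_gbps = mbps_per_user * concurrent_users / 1000
    print(f"{concurrent_users:,} concurrent users: ~{aggregate_gbps:,.0f}Gb/s of sustained traffic")
# 100,000 users -> ~1,111Gb/s; 1,000,000 users -> ~11,111Gb/s (~11Tb/s),
# all on top of whatever the network is already carrying.
```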
So no, it doesn't 'work', despite some smoke-and-mirrors tech demo.