I've been wondering whether something like this was feasible at all. The prospect of being able to play the best PC games without actually requiring the hardware to do so is quite a tantalizing one. On top of that, unlike the Phantom, it appears to be an actual product (it was test-driven by gaming reporters at GDC). There are a few things that might turn into problems, though.
One of the major selling points of cloud computing is the ability to scale your applications as demand increases. This is all well and good when your app is a web application that's designed to be scalable from the ground up, but I imagine turning an app designed to run on a single machine into something that can run in the cloud must be pretty difficult. I'm wondering if they got the content providers to actually go back and make their apps scale across multiple machines (seems rather unlikely), or if they just have the apps running on virts with a graphics API implementation that turns around and passes rendering/processing tasks off to a giant server farm (also seems unlikely).
On the other hand, they could just be spinning up a virtual machine every time someone starts a session, in effect running multiple gaming sessions on a single box. This actually sounds like it might be easier to implement, although scaling horizontally would require purchasing more hardware as traffic ramps up. On top of that, I don't know how far support for dedicated graphics on virtual machines has progressed.
I'd love to know how they're tackling this problem (seeing as I work with a company that has to deal with massive scale all the time)… I wonder if they've published any technical documents about their solution.
A major advantage PC gaming has over its console counterpart is the ability to tweak settings to improve performance. I can see a service like OnLive letting you fiddle with the settings exposed within the game executable itself, but what about things like custom scripts and binds for multiplayer FPSes, installing mods, and otherwise tweaking your setup? It doesn't sound like any of that will be possible with a service like this.
OnLive will apparently require at least a 1.5 Mbps connection for standard-def (480p) resolutions and a 5 Mbps connection for hi-def (I'm assuming 720p) resolutions. Assuming they're actually rendering the source game at 1280×720 in 32-bit colour, each frame works out to about (1280 × 720 × 32)/10^6 ≈ 29.5 megabits. To get a playable framerate you need at least 30 frames per second, which translates to about 885 megabits of data transmitted every second. This vastly exceeds the throughput of a 5 Mbps connection, so I'm led to believe they must either be employing compression on the framebuffer output, rendering at a lower resolution and upscaling, or a combination of the two. Either way, it sounds like, at least graphics-wise, OnLive won't be a substitute for the real thing at all. On top of that, ISPs these days are incredibly finicky about bandwidth usage, and a service like this sounds like a bandwidth hog that'd be liable to get throttling slapped on your account.
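The arithmetic above can be sketched out in a few lines; the resolution, colour depth, and framerate are the assumptions stated in the paragraph, and the final figure shows the compression ratio a video stream would need just to squeeze into the advertised 5 Mbps link:

```python
# Back-of-the-envelope bandwidth math for an uncompressed 720p stream.
WIDTH, HEIGHT = 1280, 720   # assumed 720p render resolution
BITS_PER_PIXEL = 32         # 32-bit colour
FPS = 30                    # minimum playable framerate
LINK_MBPS = 5               # OnLive's stated hi-def requirement

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
megabits_per_frame = bits_per_frame / 10**6         # ~29.5 Mb per frame
raw_megabits_per_second = megabits_per_frame * FPS  # ~885 Mbps raw

# Compression ratio needed to fit the raw stream into the link:
required_ratio = raw_megabits_per_second / LINK_MBPS

print(f"{megabits_per_frame:.1f} Mb per frame")
print(f"{raw_megabits_per_second:.0f} Mbps uncompressed")
print(f"{required_ratio:.0f}:1 compression required")
```

A ratio in the neighbourhood of 177:1 is well within what lossy video codecs achieve, which is why I suspect heavy compression (with the image-quality cost that implies) rather than any trick with the raw framebuffer.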
And of course there's the issue of input lag – your keystrokes and mouse inputs need to make a full round trip to the OnLive servers before they'll be registered. I don't have any data on typical mouse and keyboard response times, but I suspect they're small compared to typical network round-trip times, which would mean the network itself contributes most of the added lag.
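To make the round-trip concern concrete, here's a rough latency budget. Every figure below is a placeholder assumption for the sake of the arithmetic, not a measurement of OnLive's actual service:

```python
# Purely illustrative latency budget for one streamed input->frame cycle.
# All millisecond figures are assumptions, not measured values.
NETWORK_RTT_MS = 40     # round trip to a nearby datacenter (assumed)
ENCODE_MS = 5           # server-side video encode (assumed)
DECODE_MS = 5           # client-side decode and display (assumed)
FRAME_MS = 1000 / 30    # one frame at 30 fps, about 33.3 ms

total_added_ms = NETWORK_RTT_MS + ENCODE_MS + DECODE_MS
frames_of_lag = total_added_ms / FRAME_MS

print(f"Added lag: {total_added_ms} ms "
      f"(~{frames_of_lag:.1f} frames at 30 fps)")
```

Even with these fairly generous numbers, the service adds on the order of a frame and a half of lag on top of whatever the game itself introduces, which is why the input-lag question seems like the hardest one for them to solve.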
In any case, there's an interview with the CEO of OnLive at GDC here, so go have a watch if you're interested:
Most interestingly, he mentions a solution to the input lag problem I described above, one that took seven years to develop. I have no idea what this solution is – I suppose I'll need to actually try it out to be convinced one way or the other.