Friday, May 18, 2012

Consoles in the Cloud

I meant to write about cloud gaming a long time ago, all the way back when OnLive first debuted. Back then, most people didn't think it was possible. Gamers who made a living managing networks would argue tirelessly on forums about how it wasn't physically possible: the latency would be too great, the image quality would suck ass, and it wouldn't be cost-effective. To their credit, they were partially right, except for the whole impossible thing.

So OnLive debuted, and it was shockingly new technology, enough so to make jaded technology buffs gape in awe. No, the latency wouldn't make twitch gaming very enjoyable, and the image quality only came in slightly above current-gen consoles, with compression artifacts that non-cloud gamers never have to deal with. But it was working. You could in fact play games with all the electronic muscle hiding away behind miles of wiring. It was an extremely novel concept, and while it didn't take off with any great zeal, it was a step towards what some might say is the future.
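To put rough numbers on that latency complaint, here's a quick back-of-the-envelope sketch in Python. Every figure in it is an assumption for illustration's sake, not a measurement of OnLive or anyone else; the point is just how the extra hops stack up.

```python
# Back-of-the-envelope latency budget for one cloud-gaming frame.
# Every number below is an illustrative assumption, not a measurement.

BUDGET_MS = {
    "input upload (client -> server)": 20,    # assumed one-way network trip
    "game simulation + render": 16,           # roughly one frame at 60 fps
    "video capture + encode": 15,             # assumed H.264 encode time
    "video download (server -> client)": 20,  # assumed one-way network trip
    "decode + display": 15,                   # assumed client decode plus display lag
}

def total_latency(budget_ms):
    """Sum every stage from button press to photons on the screen."""
    return sum(budget_ms.values())

if __name__ == "__main__":
    for stage, ms in BUDGET_MS.items():
        print(f"{stage:38} {ms:3} ms")
    print(f"{'total (press to photon)':38} {total_latency(BUDGET_MS):3} ms")
```

Even with those fairly generous numbers, the two network hops plus encode and decode add roughly 70 ms that a local console never pays, which is exactly the margin twitch games can't afford.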

Why is that though? Why does cloud gaming necessarily have to be the future?

Thing is, gaming is an expensive hobby. For most, it's all justified because in the end, the memories you take away from those experiences, especially in modern times when 90% of your gaming can involve your friends on some level, are priceless. At their debut, consoles can cost at least $400. Sometimes they require additional accessories, ranging from something as simple as cables to more expensive stuff like memory cards, extra controllers, network accessories, headsets, etc. The games themselves sit just past the halfway point towards $100, around $60, with very little deviation. It used to be that the disc medium was a real savings compared to cartridges, but now it seems we've come full circle again.
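To make the math concrete, here's a minimal sketch of what one console generation might run you. The console and game prices come from the paragraph above; the accessory spend, purchase rate, and generation length are assumptions pulled out of thin air, so adjust to taste.

```python
# Rough lifetime cost of one console generation. Console and game prices
# are the figures mentioned above; everything else is an assumed placeholder.

console_at_launch = 400   # typical debut price
accessories = 150         # assumed: extra controller, memory card, headset, cables
price_per_game = 60       # "just past the halfway point towards $100"
games_per_year = 4        # assumed purchase rate
generation_years = 6      # assumed length of a console generation

total = console_at_launch + accessories + price_per_game * games_per_year * generation_years
print(f"Estimated spend over {generation_years} years: ${total}")  # -> $1990
```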

Up to now, the cost was even more justified because each generation meant a huge-ass leap. You could take screenshots of the best-looking games for every milestone console and watch major epochs of realism get traversed. Just transitioning from 2D to 3D meant a paradigm shift not just in how games looked, but in how they played. For those who didn't even value graphics, the benefits were still obvious.

But today, we're sitting in an era of refined 3D. Not only do characters have articulated fingers, but you can count the stubble on their chins, and feel the goosebumps form on your skin as you take a very convincing dive into rippling, sparkling water. Shit isn't just easily discernible, it's a reasonable facsimile. It's sort of like the few generations leading up to the Super Nintendo and Sega Genesis. Eventually we got to where there wasn't a whole lot to add to the 2D experience, and right now we're at the same point, just in a different time period.

But short of a 3D-to-4D transition to bring on a new era, we're stuck with bucket-loads of investment and weak returns on continued 3D advancement. This isn't news to anybody, though. Most gamers with a decent awareness of the goings-on of the industry know there isn't much to look forward to in graphics. Diminishing returns have been felt in the PC realm for a while now, and it's console gamers' turn to feel them too.

We got a preview of that very recently with the leaked shots of an Unreal Engine 4 demo. Put those images up against console games like The Witcher 2, Crysis 2, and Uncharted 3, and you'll have a hard time spotting the major advantages. If you're reading this, you're probably fairly knowledgeable about this stuff and have an eye for the details. If you can't see many differences, how do you think Average Joe will fare?

That presents a major problem in trying to sell new consoles. Maybe the differences are more apparent in motion than in screenshots, but whether or not that's the case, a lot of the marketing for games and new hardware still stems from still images printed in magazines, posters, flyers, and the game packaging itself. The fact that no generation before now has had ANY trouble showing off its merits that way suggests this next gen will have a huge uphill battle with consumers. I'll go so far as to say that if you can't sell a new console based on screenshots, you can't sell the console, period. Certainly not for $400 or more.

That article I linked to also indicates that Epic Games is actively lobbying to increase the power of the new consoles, just so they can run that level of fidelity at all. Which means that if Epic is unsuccessful, the differences will be even less apparent. So what happens to gaming if consumers are underwhelmed and the next generation of consoles can't meet its sales projections?

Well, that's where cloud gaming comes in (duh). Developers will just find new avenues to reach customers, and while that may not be good news for Microsoft or Sony, for gamers it seems like a pretty tantalizing notion. Take away the upfront cost of a new console, take away the performance limitations a console would impose, and provide a way to game that is largely indistinguishable from how gaming was before, except with all the benefits of the next generation. OnLive was flawed in a lot of ways, and might have turned a lot of people off cloud gaming, but slowly the technology is being improved and the issues are being ironed out.

NVIDIA just took on a direct role in pushing the idea. They've created the GeForce Grid, which takes the latest and greatest GPUs and optimizes them in ways other companies couldn't manage with off-the-shelf parts. They promise improved latency and image quality, closing the gap between cloud and local console-based gaming, plus smarter use of resources. To be clear, they're working with Gaikai, an already established OnLive competitor, to make this happen. Together they hope to deliver on promises that OnLive failed to keep, such as cloud gaming support built directly into TVs. NVIDIA has ties to display manufacturers that allowed them to get decent support for 3D Vision, so this might actually happen.
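To see why putting the encoder on the GPU matters, here's a toy extension of the earlier latency budget. The numbers are again pure assumptions, not NVIDIA or Gaikai figures; the takeaway is only that skipping the frame readback and speeding up the encode is where a Grid-style setup could claw back milliseconds.

```python
# Toy comparison of a first-generation cloud pipeline versus one with
# on-GPU capture and encode. All figures are assumptions, not vendor numbers.

first_gen = {
    "network (round trip)": 40,
    "render": 16,
    "frame readback + software-assisted encode": 30,  # frame copied off the GPU
    "decode + display": 15,
}

grid_style = {
    "network (round trip)": 40,
    "render": 16,
    "on-GPU capture + hardware encode": 10,  # assumed: no readback, faster encoder
    "decode + display": 15,
}

for name, budget in (("first-gen cloud", first_gen), ("Grid-style cloud", grid_style)):
    print(f"{name:18} {sum(budget.values()):3} ms press to photon")
```

Shave 20 ms there, put the servers in closer data centers, and the gap to a local console starts looking a lot less scary.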

NVIDIA is probably pulling for this technology for multiple reasons. Obviously they see the writing on the wall just as plainly as anybody, and hope to sidestep the issue of plateauing graphics by simply removing performance limitations from developers' plates. And whether or not the next generation of consoles is really doomed (I don't actually believe that, just so you know), NVIDIA isn't involved in any of them. Cloud gaming, should it ever take off, has the potential to be even more beneficial for them than PC gaming, as very few PC gamers actually buy top-of-the-line hardware, and even fewer games make use of it. NVIDIA can keep up their decent sales of midrange graphics cards, and reserve the really gigantic, expensive GPUs for the HPC market, which cloud gaming is now a part of.

For consumers, this has loads of benefits too, if they can work out the kinks. I probably don't have to list the more obvious ones, but one big added benefit is an all-digital game library. Yeah, we have digital distribution now, and will probably have even more of it in the future, but this is even better. Every game is instant-on -- loaded in seconds, not minutes -- with no downloading. That goes for every device you own that supports it. It's like Steam, but on steroids secreted from the anuses of winged unicorns. And it all runs on a constantly updated, state-of-the-art hardware platform. We already know from the world of television and movies that instant-on streaming is a highly successful concept, and the same will almost certainly hold true for gaming. It may be less exciting for collectors and hardware enthusiasts, but it has immeasurable potential for gaming.