Monday, September 21, 2009

NVIDIA PhysX

PhysX has always been met with skepticism from all sides of the gaming industry. From its roots as a $250 add-in card to its current incarnation as the forefront of general purpose GPU processing for use in games, the value proposition has been one that few have been able to grasp despite all the promises and possibilities.

Now, admittedly, I thought PhysX really had a new lease on life when NVIDIA took hold of it a few years ago. I mean, it went from an overly expensive piece of useless hardware to a free feature upgrade for existing NVIDIA card owners, something virtually unheard of in the industry. The wonderful thing about GPGPU technology is that as long as you can program it in the given language, you can give people anything you want with it and have it run on any piece of hardware that supports it. It's one of the advantages touted by Larrabee, and at least that card has the potential of no longer being bound by hardware-based DirectX limitations: in theory, the software renderer can adapt to whatever new DX generation comes out, with the only limit on its future-proofing prospects being raw performance (which is a big deal, of course).

But despite the dramatically increased install base that moving PhysX to CUDA afforded it, developers were still sheepish about supporting it. After all, it's a PC-only feature in an industry that these days develops its games centered around consoles, and if a port demands anything more than larger textures and an option to change the resolution, it hardly seems worth it to them. But NVIDIA has built many of their successes by throwing money around, and their The Way It's Meant to Be Played (TWIMTBP) program is a great way of getting PhysX into the hands of gamers. Yet even then, the technology seemed reserved for only the most frivolous of enhancements, be it extra sparks or shrapnel coming off objects when you shoot them, or pieces of cloth or paper shifting around pointlessly. It's all well and good as a free addition for those with the compatible equipment, but as a key selling point, it was less than spectacular.

NVIDIA seemed to be in denial for a while, touting PhysX as one of the key advantages of their hardware across many GPU releases after its introduction, and the public, the press, and generally those with good sense simply weren't buying a word of it. Add to that the fact that in some cases it killed performance, such that any significant amount of PhysX effects acted more as a detriment to the gaming experience than anything else. I keenly remember the special PhysX levels in UT3, and the free, crappy multiplayer shooter Warmonger. A couple of stuttering levels and shitty games weren't going to garner it much attention.

So then NVIDIA got more aggressive. They started pursuing not just the crappy games as they had before, but anticipated PC ports of popular console games. It was in these ports that NVIDIA saw a golden opportunity: developers and publishers needed help shouldering the burden of supporting the higher PC standards without cutting into the easy cash-in that porting to a new platform brings, and in exchange NVIDIA could add their own optimizations and, more importantly, insert their golden-boy marketing feature in the process. For PhysX to be the performance-topping, table-turning ace in the hole they needed it to be, they had to spread it out as fast and as wide as possible to every game people may or may not give a damn about. It had to be the one thing the competition did not have that they could tout, even when all other aspects of their technology fell flat.

It started with Mirror's Edge, a game that was to be NVIDIA's second chance after Unreal Tournament 3 proved to be a no-go for getting PhysX off the ground. While the added effects enhanced the immersiveness in ways PhysX hadn't really done yet in past tacked-on efforts, the game itself wasn't such a commercial and critical success as to carry the technology with it to the great heights NVIDIA aspired towards. Undeterred, NVIDIA pushed forward to the next game, one that was quickly gaining recognition and excitement in the gaming community in a way no other game of its kind had yet.

Batman: Arkham Asylum was to be the first modern Batman game that truly wasn't crappy. With fan-favoring decisions like the casting of popular voice actors and the choice of villains, a combat system that seemed effortless and seamless, refined stealth elements, and the highly appealing art direction and high-fidelity graphics afforded by Unreal Engine 3, the game was poised to deliver on all the promises gamers could ever hope for in a comic book hero game. The game was set for simultaneous release on all the gaming platforms worth a damn in this current generation, but in an incredibly risky move, the PC version was delayed thanks to NVIDIA convincing the publishers it would be worth it to go all-out on the inclusion of PhysX effects. NVIDIA knew they needed a true-blue killer app for PhysX unlike anything they'd had yet, one that both critics and consumers could agree on. So was it worth it?

Just recently my attention was brought to this comparison video showing the differences with and without PhysX. Now, I've seen similar videos before, but nothing that went into the game with as much depth as this one, and because of that there are quite a few spoilers for the locations and enemies you'll be facing. The results are actually pretty spectacular. Huge clouds of debris swirl around in the air, bouncing off giant enemies and crumbling off walls and structures. Fog fills rooms and environments, covering the floor and enhancing the spooky factor of some of the areas. Paper: not just a few sheets kicked around on the floor, but rooms engulfed in it, blowing around, encircling your character in all directions like nothing seen before. Little things like cobwebs, sparks, broken tile, and trash that would have once been considered the best PhysX had to offer now play second fiddle to the gigantic graphical spectacles I mentioned earlier. PhysX has finally found its killer app, just as NVIDIA has always wanted. This is the game to finally show what it can do, how it can really add to the immersiveness of a game in a way not done before. A purely graphical effect it may be, but like HDR and anti-aliasing before it, it creates a level of realism and a visually captivating experience that sets the bar beyond what's been done thus far. This is the next level of graphics, folks. The next frontier. If you can watch that video and say you'd enjoy the game just as much without those effects, then the world of graphics technology and innovation means precisely squat to you.

As much as the PhysX additions impress in Batman: AA, I can't help but wonder if it really advances the importance of PhysX and what it can do for the future of gaming, or if it just shows the potential for GPU-based physics in general. PhysX, as it is right now, is still a proprietary standard supported on just one set of hardware made by one company. Once DX11 takes hold, a similar approach could be taken with GPU compute programs supported by that API, and thus supported by all DX11 cards no matter their maker. NVIDIA has every reason to keep PhysX tied to its own cards, and Intel has just as much to gain by keeping Havok to itself, so it seems to me that the days of both Havok and PhysX are numbered, and it's only a matter of time before a third option, more flexible and more widely supported, rises from obscurity to truly herald a new age in physics processing. One thing's for sure though: GPU-based physics is here to stay, one way or another.
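To make concrete why all those debris and paper effects map so naturally onto a GPU, here's a minimal sketch, in plain Python rather than actual PhysX, CUDA, or DX11 compute code, of the kind of per-particle update such an engine runs. All names here are hypothetical; the point is that each particle's update depends on no other particle, so a GPU can assign one thread per particle with no synchronization.

```python
# Illustrative sketch only: not PhysX or DirectCompute code.
# Explicit Euler integration of free-falling particles with a
# crude ground-plane bounce, the embarrassingly parallel core of
# debris/paper effects like those in Batman: AA.

GRAVITY = -9.81   # m/s^2, applied to the y-axis
DT = 1.0 / 60.0   # one 60 Hz frame

def step_particle(pos, vel):
    """Advance a single particle by one frame."""
    x, y, z = pos
    vx, vy, vz = vel
    vy += GRAVITY * DT                      # accelerate downward
    x, y, z = x + vx * DT, y + vy * DT, z + vz * DT
    if y < 0.0:                             # hit the ground plane:
        y = 0.0                             # clamp to the floor and
        vy = -vy * 0.5                      # bounce with energy loss
    return (x, y, z), (vx, vy, vz)

def step_all(particles):
    # On a GPU this loop disappears: every particle becomes its own
    # thread, which is why thousands of debris shards cost so little.
    return [step_particle(p, v) for p, v in particles]
```

For example, a particle starting at rest one meter up falls slightly and picks up downward velocity after one frame; run `step_all` once per frame over the whole particle list and the swirling-debris effect emerges from nothing but this update.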