Tuesday, December 15, 2009

The Plight of High-End Graphics

This rambling is kind of a response to a blog post by Antony Leather of bit-tech. The question posed is that of the effect of Crysis on the PC gaming industry, more specifically on the consumers whose burden it is to cope with the demands of the game on their home computers. Is (or was) it a trivial matter, perhaps even a positive influence on PC gamers and the market, or was it such an unreasonable expectation of the target demographic that it effected a movement of consumers away from the PC and into the user-friendly arms of the current console generation?

Interestingly, a recent article on Fudzilla uses similar language in its last sentence: "No wonder gamers are turning to consoles in droves." Both entries were published the same day, and it seems unlikely that one influenced the other.

This idea that people are actually leaving the PC in favor of a more simplified gaming experience strikes me as a misguided notion. It would seem to me that those who have stayed with PC gaming up to this point, and for any reasonable length of time, have come to terms with the fact that PC gaming involves certain obstacles and considerations that go beyond those of mainstream gaming. With gaming on a platform of superior visual fidelity, freedom, and control options, the price is that you have to know a little more about the technical underpinnings of the hardware and software you're dealing with, and yes, it might require an upgrade now and then. But Crysis didn't introduce this idea anew to the market of PC gaming. When was the last time a game pushed the envelope to such levels?

Hmmm, the last time I remember something like this happening was... I dunno, Elder Scrolls IV: Oblivion? But hell, that wasn't really a stretch to run, was it? It might have put the sweat on the brow of your average flagship graphics card at the time, but there were plenty of other reasons to buy one of those anyway. Before that there was perhaps FEAR. But the really standout year, where all hell broke loose on upgrading, had to have been 2004. How many people beefed up their systems in preparation for Doom 3 or Half-Life 2? Hell, even Far Cry pushed the limit of what computers could do back then. It was one game after another that really required additional expenditure on the hardware we were running. I don't remember a lot of people complaining about that, though; in fact, most people were happy to upgrade if it meant ushering in a new era in graphics and gameplay.

But then you might say, "yes, but the argument is that Crysis couldn't be satisfied. There was nothing that could max it out at the time, or even a couple of years after its release!" Well, that's true. I can think of a couple of other games like that, whose graphics settings were created with the future in mind, beyond what current systems were capable of, namely the EverQuest series. But it seems rather funny to me that PC gamers would be high-tailing it in droves over one little game that makes unreasonable demands on current-day technology, even though that was its express intention, and when compromising with slightly lesser settings was hardly a compromise in the grand scheme of things, given that "medium" looked better than anything else out there. Furthermore, in 2007, and especially the following year, the price of really good graphics performance took a nosedive, to a level hardly seen since the Radeon 9500 Pro and dare-I-say the Voodoo days of old, with the 8800 GT and 3850 cards released that generation. And if you did pony up for the most expensive model, that model (the 8800 GTX) kept its value for longer than virtually any card before it, unflinching from its $600 price tag for at least a year after it launched. Spending money on an upgrade at the time, even if it was for just one game, hardly seemed to be much of a sacrifice, especially when taken in the context of previous years.

And what else was pushing the envelope like Crysis did? Nothing, that's what. You might get a European game here and there (Stalker) that used some of the newer bells and whistles, but the vast majority of the PC gaming library since the launch of the current generation of consoles has been games that could run on the most modestly priced of graphics cards. It has never been cheaper and easier to be a PC gamer, so why stop now? Are PC gamers such ninnies as to be so easily disheartened that all the other advantages and merits of the platform are quickly forgotten over one instance of frustration?

If anything, we should be thanking Crysis for doing us a favor. PCs used to be at the forefront of innovation and cutting-edge technology, but no one seems to be interested in that anymore. Here comes a game that shows what PCs are really made of, taking us back to a time when no other machine could do what a home computer was able to accomplish. How many people dropped their jaws when they first saw Doom? A handful of us remember those days, and retain a passion and love for the technology behind the game as much as the game itself. If it weren't for Crysis, what would those people have to look forward to?