Wednesday, December 30, 2009

I don't like L4D

Left 4 Dead does a lot of things right. In fact, most would say it nailed the zombie survival idea in gaming form perfectly. I haven't really watched any zombie movies, but I understand their appeal, and I "get it" when I see all the film-like details added to the game series, such as the film grain, the movie posters on the load screen, and the end credits. There might be specific film references in the levels themselves, and those I wouldn't be able to catch for obvious reasons. I recognize that what the L4D franchise sets out to do, it accomplishes brilliantly, in a way no other game has been able to, almost creating a sub-genre all on its own.

Tuesday, December 15, 2009

The Plight of Highend Graphics

This rambling is kind of a response to a blog post by Antony Leather of bit-tech. The question posed is that of the effect of Crysis on the PC gaming industry, more specifically on the consumers whose burden it is to cope with the game's demands on their home computers. Is (or was) it a trivial matter, perhaps even a positive influence on PC gamers and the market, or was it such an unreasonable expectation of the target demographic as to effect a movement of consumers away from the PC and into the user-friendly arms of the current console generation?

Interestingly, a recent article on Fudzilla uses similar language in its last sentence: "No wonder gamers are turning to consoles in droves." Both entries were published the same day, and it seems unlikely that one influenced the other.

This idea that people are actually leaving the PC in favor of a more simplified gaming experience strikes me as a misguided notion. It would seem to me that those who have stayed with PC gaming up to this point, and for any reasonable length of time, have come to terms with the fact that PC gaming involves certain obstacles and considerations beyond those of mainstream gaming. With gaming on a platform of superior visual fidelity, freedom, and control options, the price is knowing a little more about the technical underpinnings of the hardware and software you're dealing with, and yes, it might require some upgrading sometimes. But Crysis didn't introduce this idea anew to the PC gaming market. When was the last time a game pushed the envelope to such levels?

Wednesday, November 18, 2009

Graphics is my favorite subject

So the HD 5970 is out. I like the name. No suffixes whatsoever. Simple, clean, elegant, gets the point across. There's a prefix, but that's just to denote the broader range of GPUs it's a part of. Better than NVIDIA's prefixes, which are really suffixes just moved to the front.

I read a few reviews, and obviously the thing pulverizes the competition, but the competition is a dead horse anyway. Something frustrated me about most of the reviews, though: the game selection. It can't be helped, I suppose. Almost all of them are console ports (some with minor enhancements) that never had any problem running in the first place. What's the point of benching those games if absolutely no one would base their purchasing decision on them? Nobody's thinking "oh man, I need to start doing research for a card that can play Borderlands". Fucking anything can play Borderlands. $100 cards used to be shit, but now that'll buy you a 9800GT or equivalent. That's like nothing for a video card budget, and we're talking a card just a smidge under the flagship performance of 2006 (which would normally make it pretty old, but not anymore). So yeah, anything north of toilet paper will run Borderlands, or any of the COD games... or Far Cry 2, or L4D2, or Resident Evil 5, or Batman: Arkham Asylum, or whatever the hell else.

Friday, October 23, 2009

ARM's Ascension

NVIDIA hopes to grow their Tegra business to eventually make up 50% of their revenue. Having scored a win with the Zune HD, and possibly ending up in the future Nintendo handheld, Apple products, and countless other media, phone, and computing devices, it's no wonder their expectations are high. SoCs have always been very popular in the ultra-portable scene, and Tegra is among the many leading the way for the future of this technology sector. With hardware-accelerated Flash, video, graphics, and audio support, the capabilities of such SoCs have grown to the point of outgrowing the smartphone form factor, encompassing a vast array of devices extending all the way up to notebook-like devices, dubbed "smartbooks".

It's for this reason that ARM is becoming better positioned to take the computing world by storm in the near future. With their recent partnership with the newly formed manufacturing company GlobalFoundries, it's clear they intend to increase the capabilities of their chips beyond the scope of what they're best known for today.

Saturday, October 3, 2009

NVIDIA's Fermi

Fermi is the name of the architecture for NVIDIA's next-gen (DX11) cards. Fermi was announced ahead of actual card announcements, or even any information about gaming features. All that was talked about, in fact, was Tesla-related shit, but despite that I've read all kinds of bullcrap from people jumping to ridiculous conclusions about it.

Once again, this was an announcement for Tesla. Companies looking to make large investments in new servers and HPC systems need a lot of lead time to make decisions, and NVIDIA was trying to appeal to them, as well as to investors and stockholders, proving that Fermi is real and that there are some really cool things to look forward to. AMD released their shit, so now NVIDIA wants to make some sort of response, even if it isn't actual hardware. This was an announcement to gain mindshare, nothing more.

Monday, September 21, 2009

NVIDIA PhysX

PhysX has always been met with skepticism from all sides of the gaming industry. From its roots as a $250 add-in card to its current incarnation as the forefront of general purpose GPU processing for use in games, the value proposition has been one that few have been able to grasp despite all the promises and possibilities.

Monday, July 27, 2009

On the precipice of battle, DX11 approaches

Sheesh, last time I complained it had been ten days since the previous rambling. Now here I am damn near a month later before I'm updating again.

It's almost August. Windows 7 has gone gold. In under three months, on October 22 of this year, it'll release to the hungry masses, bringing with it the next DirectX generation, DirectX 11. To correspond with the new API update, new graphics card generations will be released, as they always have been. Since Intel is still at least a year off from a hard launch of Larrabee, that leaves two graphics card companies to think about: AMD and NVIDIA. It has been a long time since the last generation first launched. The GPU designers, the public, and the press alike are absolutely aching for the next battle.

Monday, June 29, 2009

I love Amazon Prime

You know, there's a lot more confessing my love of things and a lot less bitching in this blog than I expected, but oh well; appreciating things is good for the soul sometimes. It's been ten days since my last post, and I have to write about something.

Wednesday, June 17, 2009

The internet is a curse and a blessing

I've been tempted lately to write something not related to technology, but instead more personal, which is something I promised myself I'd never do on a blog-like site. Since I've never really explained my reasoning here, I'll briefly say that it stems from the public nature of blogs, the private nature of things that happen in your personal life, and the inappropriate mixing of the two for what I can only imagine are questionable motives. Journals and diaries were always meant to be private, and if you have to do any sort of suppressing of your thoughts and feelings because you're mindful of prying eyes, you're not really dealing with those issues. But I digress, as this isn't the problem with the internet I wish to address today.

The problem I wish to address relates more to the speedy nature of the internet, and the endless fountain of information accessible through it. The internet is like the fast food chains and the automobile before it: it lets us get what we seek faster than ever before. I was watching TCM this evening, and it had a short segment on the introduction of cars, then began to play the Orson Welles flick The Magnificent Ambersons. Both the short and the beginning of the movie address the growing pace of life, and how, seemingly miraculously, people in the old days had time to do just about anything despite taking longer to get to it. Picnics, visiting, tea parties... and my mind immediately started to add things: reading, writing, exercising... living. These are things I have more of a problem with personally, so the nature of this post is very irregular for this blog, but I feel the need to talk about the negative effect the internet has had in eating up my free time, leaving me with pretty much nothing for the things I keep putting off, rather than just taking the time to finally do them.

Monday, June 8, 2009

I love Blu-Ray

That really says it all. I'm really not historically a big spender when it comes to media, typically adopting new standards late in the game, but with Blu-Ray (apparently officially abbreviated "BD", even though I hate that) I find I'm suddenly drawn to rabid consumerism with a ferocity not wholly apparent with any other medium in the past. Since buying a BD-ROM drive for my computer, and thus acquiring my first BD movie player, I've been hitting the format hard, getting every film available for it that I most desire. Since I got the drive in late April, I've purchased about 17 movies (I say "about" because The Ultimate Matrix Collection includes The Animatrix and the feature-length Matrix Revisited, which could practically count as movies on their own). That may not sound like much, since I'm quickly approaching the two-month mark, but given my lowly income, and the fact that during that time I felt the need to upgrade my monitor almost solely to accommodate the demanding BD image quality, I find it an astounding pace for someone like me.

Saturday, June 6, 2009

NVIDIA cards are overpriced

I think most people who've been following the industry will take one look at that title and think "duh". I admit I haven't been keeping up with graphics card prices lately, because I've been trying hard not to shop for one, even though the 8800GTX I'm using right now doesn't do Crysis enough justice by my standards; it's because of that that I've yet to beat the game, or buy its standalone expansion.

What brought on this observation was an investigative article on AnandTech that came across my feeds the other day about GTX 275 overclocking. It felt like a relevant read at the time, because shader vs. core scaling has been an interesting issue with NVIDIA cards since the G80, and also because I was bored. The article pointed to a preceding piece that investigated the same topic with the 4890, and I decided to look that one over as well, since I hadn't been keeping up with that card. I was surprised to find that the 4890 actually goes toe-to-toe with NVIDIA's current fastest single-GPU card, the GTX 285, in the most intensive games (read: the only games that matter to anyone shopping for a new GPU right now). Then it seems you can overclock the 4890 higher than the GTX 285, percentage-wise. Well, then I got curious about where they stand price-wise.
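To make the "percentage-wise" comparison concrete, here's the arithmetic as a quick sketch. The stock core clocks are the real reference clocks for these cards, but the overclocked values are placeholders I made up for illustration, not figures from the AnandTech articles:

```python
def oc_headroom(stock_mhz: float, oc_mhz: float) -> float:
    """Return overclock headroom as a percentage above the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# Stock clocks are the reference specs; OC clocks below are hypothetical.
cards = {
    "HD 4890 (core)": (850, 1000),   # 850 MHz stock; 1000 MHz is a made-up OC
    "GTX 285 (core)": (648, 700),    # 648 MHz stock; 700 MHz is a made-up OC
}

for name, (stock, oc) in cards.items():
    print(f"{name}: +{oc_headroom(stock, oc):.1f}% headroom")
```

With those placeholder numbers the 4890 comes out ahead on relative headroom, which is the shape of the comparison being made; the actual reviews' measured clocks would of course decide the real answer.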

Thursday, June 4, 2009

A busy week

Every day when I wake up, I turn on my computer and look at my feeds. I have a modest twelve subscriptions right now, and I find going through those on an average day laborious enough. I do it, though, to stay on top of things, because I've found that if there's one thing I enjoy, it's staying up-to-date with all the tech.

This week, however, has been an especially difficult one as far as that task goes. Two very major events are going on in fields I'm especially interested in: Computex and E3. A lot of interesting gadgets and games have been revealed as a result of these two shindigs, but one device I keep coming back to is this little thing.

Wednesday, June 3, 2009

A new home

Long ago, there was a page of the utmost simplicity, which I formed on the premise that I could say whatever I wanted and didn't give a hoot who saw it. It was a webpage; I adored it, and it adored me. We lived happily together for probably over a year. It was simple, not because I'm simple, but because my skills at the coding of HTML were of such limited breadth that I could not strain for complexity, out of the necessity of keeping things enjoyable for myself. I made articles for it, resembling blog posts, but of such overdrawn length and crude language that I preferred to call them ramblings. I generally confined my topics to those relating to the tech industry. I lamented blogs for their personal nature, and the idea of publishing such personal insights to the view of the public with the purpose of having others read them. I was content with the way things were, and updated quite infrequently, at the pace of leisurely detachment. Then, all at once, that page was gone. Attempts to unearth any remnants from Google's colossal archive of cached webpages proved fruitless. There was no hope of it coming back. My host had booted me out, and though it was free, I cursed at thee, and mourned the passing of the private little nook on the internet that I had all to myself.

Now, I find my fingers aching for the keys again, and for a platform to launch the fruits of their rhythmic typing that is as clean and private as my old webpage was. At first, I turned to the network that fuels much of my wasted time on the internet: IGN. They have a blogging system, but it's of such a simple and cluttered nature as to be rendered woefully unappealing. So now I turn to this one. The most popular blog site there is. It is here I will dump all forthcoming run-on sentences, cheesy adjective-chaining, and overall brutishly executed grammatical stylings that I can squeeze from the lowest bowels of my intellect. I am now, in fact, the biggest fucking hypocrite I could ever be.