The next-generation 6000 series is upon us, and it's another case of fact being less interesting than fiction. Rumors had the chip pegged as a re-architected shader setup with a Vec4 layout consisting of more capable and versatile SP units, coupled with an uncore clock domain system à la G80 that would clock the shaders higher than the rest of the chip. These are pretty interesting ideas that make sense in light of being stuck on the same 40nm manufacturing process as the previous generation. If you can't add transistors, find a way to push the existing ones harder.
Instead, AMD chose to trim off some transistors and just make a more economical chip. They call it "rebalancing" the architecture: apparently game graphics, stagnating with the success of the consoles, don't demand as much horsepower as AMD anticipated, and other than a revamped tessellation unit to save face after all their gloating over their experience with geometry subdivision, much remains the same. With fewer SPs and fewer texture units but a slightly higher clockspeed, AMD ended up with a marginally slower chip than their last-generation single-GPU flagship, but with a significantly smaller die. So instead of getting a high-end card first, we're getting an upper-midrange chip to replace the old $400 flagship, with the real high-end replacement still pending. Mum's the word on that card's specs, but preliminary rumors and benchmarks show significant improvements, so thankfully we're not left with blue balls this year.
The only thing screwed up is the naming scheme, and that's been the general consensus everywhere you look. While not on the level of NVIDIA's shameless botchery of the past, it still introduces confusion into the market, and coming from the company that started probably the best naming convention in graphics card history, it's rather disappointing. The x900 tier, previously reserved for dual-chip SKUs, now apparently denotes cards at $300 and up, while the x800 tier goes back to its $250 roots. That's fine and all, except when people compare the 6870 to the 5870, they're not looking at a faster card despite the name, and likewise when the 6970 comes out, it will not be faster than the 5970.
On top of that, rumors are building up full steam about the GTX 580, the successor to the massively disappointing GF100 flagship. So far, murmurs suggest a take two on what they were trying to do originally, essentially a GTX 480 "The Way It Was Meant to Be Made." Looking back, a historical analogy comes to mind: the FX 5800 Ultra debuted, let everyone down, and then a couple of years later the midrange 6600 GT came out with the same clocks and the same number of units, but fantastic performance. Had the 5800 Ultra simply had the holes in its design plugged, I imagine something very similar to the 6600 GT would have resulted. The 5800 Ultra was certainly intended to be a beast, and had it realized its potential, it would not only have decimated the competition but also remained relevant a good few years after its launch. But the 5800 Ultra was truly designed for the games of its era, and when the next generation of games started rolling out, the weaknesses of the architecture began to surface. Since then, NVIDIA has planned their designs with the future in mind, a strategy that paid off for them until Fermi, when forward thinking became presumption and all the emphasis on GPGPU became a major hindrance.
Supposedly Fermi is doing well in the HPC fields some might say it prioritized, and that's great; I'm happy for them. The problem is that gaming is still their bread and butter. Five hundred Tegra tablet prototypes don't add up to a business until they become shipping products; CUDA doesn't matter to gamers unless it makes games more fun, which PhysX failed to do; and the nForce chipset, once a household name, is now a distant memory. NVIDIA needs to pull their shit together, and soon. The 400 series didn't fully flesh out until just this month, while AMD managed the same within six months of launching their first DX11 card. The halo part, the pièce de résistance of the family, flopped on its face for the most part, and if the GTX 580 runs into similar problems, it'll be the start of a really bad trend that could lead to the deterioration of the company as a whole. I'm seriously concerned that NVIDIA, or at least its leadership, might be deluding itself: if you make bold claims that your new mobile chip venture will deliver a billion dollars in revenue in short order, and instead of slowly easing into that new direction you suddenly jump tracks and forget the markets that made you, you might just end up the next Creative Labs. It wouldn't be so bad if AMD weren't sitting there making graphics cards for gamers that are economical, quick to market, and extremely competitive. Get your priorities straight, NVIDIA, so you're still around in ten years!