Friday, November 11, 2011

AMD's Nervous Breakdown

So Bulldozer bombed. The biggest change in CPU architecture for AMD since the K7, and their one true hope for finally making up miles of lost ground on Intel in the processor performance race. Up in smoke.

Oh, it's a disappointment alright. On paper -- and granted, I'm not a whiz at processor architectures -- it sounded pretty darn good. Sure, two threads had to share a single FP unit inside each module, but that unit could do 256-bit vector operations. The general consensus is that the design of the chip, at a high level, was sound. But it hinged on something very important: clock speed. It was a beefier engine that needed more cycles to keep it fed, and the end product was simply starved of them. Unless you were following all the leaked benchmarks and performance indicators leading up to its launch, you were shocked. The world was shocked.
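For the curious, here's a minimal sketch (my own illustration, nothing from AMD) of what a 256-bit vector operation looks like in practice: one AVX instruction adding eight single-precision floats at once, which is the kind of work each module's shared FPU is built to chew through.

/* Hypothetical example: a single 256-bit AVX add.
   Compile with something like: gcc -mavx avx_demo.c */
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];

    __m256 va = _mm256_loadu_ps(a);    /* load 8 floats into a 256-bit register */
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vc = _mm256_add_ps(va, vb); /* one instruction, eight additions */
    _mm256_storeu_ps(out, vc);

    for (int i = 0; i < 8; i++)
        printf("%.1f ", out[i]);       /* prints 9.0 eight times */
    printf("\n");
    return 0;
}

Whether one shared 256-bit unit per module is enough to feed two threads is exactly the kind of trade-off the design was betting on.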

It seems like Rory Read was shocked too. Announced as AMD's new CEO in August of this year, he recently put into effect mass layoffs of more than 10% of the company's workforce, mere weeks after the first Bulldozer processors debuted. Rumors are circulating about what this might mean for AMD's future goals. Rory has issued an internal memo stating that this is the start of a new strategy for AMD, which many journalists and analysts believe means a shift in focus towards mobile markets and an emphasis on embedded solutions.

As AnandTech reported, among the many employees let go in the process was Carrell Killebrew, the Director of Product Planning for AMD's GPU division. Anand points out that this move is peculiar for one supposedly based on cost savings, and suggests a difference in philosophy between Carrell and the company.

"When reducing workforce to cut costs, you don't go after your product planners - unless their vision and your vision don't line up. [...] Carrell's vision saw the continued growth of the high-end GPU."

So suddenly this sounds like AMD may in fact be leaving the high-end GPU market, especially since, as XbitLabs reported, Jon Peddie said, "AMD was far from fat. And they were making a profit. I don't see any need for such a huge cutback."

And if they're leaving the high-end GPU market and actually moving more towards the mobile market, one could speculate that they might abandon the enthusiast market entirely, including CPUs. But we'll leave that one alone for now.

Needless to say, this brings with it about a thousand different questions pertaining to the consequences of such a decision. The vast majority of the innovations in the GPU market are driven by tough competition, as well as API advances brought about by Microsoft and Khro-...well no, just Microsoft. Without that competition, there won't be any hard push to improve the quality of the experience for end users (and professionals, for that matter). Most large companies aren't at their best without heavy competition, and the result might be similar to what we saw with the sound card market, where innovation gave way to incremental improvements that had little to do with real R&D and more to do with bloated software packages and marketing. The GPU market on the PC is largely a two-horse race, and without that second horse, the other one is free to trot along at its own pace, and unfortunately that also means inflated prices.

Of course, without any real advances in real-time rendering technology, developers don't have much incentive to keep pushing the boundaries either. That might have the side effect of letting them take better advantage of the hardware that's already available, but it also means we might be close to the ceiling of what computers are capable of in games. It could also mean the end of future console generations, without a significantly more powerful GPU to build them around.

There's no question that graphics has already hit a point of diminishing returns. With so much investment required to make any truly noticeable improvement in visual fidelity, a collapse of the high-end GPU market was always a possibility. That's why NVIDIA has started investing so heavily in the mobile market, as a hedge for when that happens. Instead of trying to pack more transistors into massive chips as process sizes shrink, we might see those advances used to squeeze today's cutting-edge graphics into smaller packages. Computing in general has been shrinking since its inception, and there's no doubt that devices like tablets and smartphones are the future for the majority of people's computing needs. That means not only providing all the features found on full desktops, but running them at a comparable level.

More mobilization means more consolidation of devices, too. Hardware that once required its own box, like DVRs or even game consoles, might find itself crammed inside TVs. It's the age of convenience, after all, and the easier it is to access the things you want, the more likely consumers are to eat it up. If the vast majority of what you want to do can be done on a tablet or your TV directly, why buy another device that does only a subset of those things?

AMD wants to ride this wave, and to do so it's going to need to focus more on its Bobcat architecture, and possibly something even smaller. It can wait for manufacturing processes to continue to scale its existing cores, but what if it were to go a different route? Would AMD rely exclusively on x86, where it's almost always been at a disadvantage to Intel, or turn to ARM instead and hope to gain a leg up in efficiency over Intel's Atom processors? ARM has the advantage of allowing licensees either to use ARM-designed cores or to license just the instruction set and implement it in an architecture of their own. It might be to AMD's advantage to broaden their portfolio with ARM technologies and pick up where Intel failed in that regard.

In the meantime, what's going to happen to desktops, or even laptop computers? As time goes by, the big hulking desktop tower looks increasingly obsolete sitting next to all the slim computers flourishing in the market. Laptops have made great strides to catch up, already capable of doing almost everything high-performance desktops can. According to Jon Peddie Research, discrete GPU sales have held steady in recent years, but as the PC market continues to grow, GPU sales aren't growing with it. Integrated graphics, however, is growing fast, quickly taking over laptops with the introduction of AMD's Fusion and Intel's latest HD Graphics processors. Built on the same die, with the same cutting-edge transistors as the CPU cores beside them, integrated GPUs are free to take off in performance, leaving low-end GPUs utterly obsolete and more mainstream chips straining to keep pace. It seems like only a matter of time before having a graphics card in any form is rendered unnecessary.

Still, one questions the reasoning behind leaving the enthusiast graphics market when AMD has been doing so well in it. Certainly their prowess in the ultra-mobile field is, to say the least, unproven, and when taking on a new venture, it's probably a good idea to have a back-up plan. So why fire your GPU product planner? AMD still has the Southern Islands family set to debut soon, and probably has the generation after that well in place already, but what comes after that, in 2014?

At this point it's not hard to imagine what would happen if they really did exit the performance GPU market, and if sales of discrete GPUs don't fall dramatically by 2014, perhaps they hope to be the ones to make that happen. We won't know AMD's plans concretely until their Financial Analyst Day early next year, and I'm sure consumers, and most of all stockholders, will be eagerly listening for whatever method might be behind the looming madness on the horizon. But if things materialize the way they appear to be, this could be one of the biggest changes to the PC market landscape in decades. Of course, I've been saying that a lot lately...