Sunday, May 22, 2016
The world of GPUs has always been a cutthroat one, and while no GPU designer ever intends to release a flop, in the high-pressure environment of this market some mistakes are inevitable. A GPU maker survives by launching competitive graphics solutions on time, straining to anticipate its rival's intentions and often stretched for time against ever-present problems. It's a complex undertaking: design takes place four or five years before the chip will be powering whatever crop of demanding games holds our attention at the time, so you have to anticipate what manufacturing will be capable of by then, and where software standards like Microsoft's DirectX will stand. Then when launch time comes, you have to be ready with enough chips hitting close enough to their performance goals, and feature-complete drivers polished and ready to spit out images error-free.
Wednesday, October 17, 2012
AMD's Nervous Breakdown Pt.2
One of the things I've loved most about AMD over the years is their forward-looking processors and platforms. In 2004 I upgraded my old Athlon XP system to a spiffy new Socket 939 Athlon 64, and was pretty happy with the improved performance. I had no idea how well the platform would pay off for me, though, when dual-core processors started to take off shortly after. By 2007 my PC was aging, but with a simple BIOS update, an Athlon 64 X2 was just a drop-in upgrade away, adding another two years of life to the system. I imagine it was a lot like how people felt when they put an i486 OverDrive chip into their 1992-era PCs.
Friday, November 11, 2011
AMD's Nervous Breakdown
So Bulldozer bombed. The biggest change in CPU architecture for AMD since the K7, and their one true hope for finally making up the miles' worth of lost ground to Intel in the processor performance race. Up in smoke.
Oh, it's a disappointment alright. On paper -- and granted, I'm not a whiz at processor architectures -- it sounded pretty darn good. Sure, two threads had to share a single FP unit inside each of its modules, but that unit could do 256-bit vector operations. The general consensus is that the design of the chip, at a high level, was sound. But it hinged on something very important: clock speed. It was a beefier engine that needed more cycles to keep it fed, and the end product was simply starved of them. Unless you'd been following all the leaked benchmarks and performance indicators in the run-up to launch, you were shocked. The world was shocked.
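To make that "256-bit vector operations" bit a little more concrete, here's a minimal sketch of my own (plain C with AVX intrinsics, nothing AMD-specific) of what a single 256-bit operation looks like from the software side. On Bulldozer, an instruction like this is reportedly executed across the two 128-bit FMAC units shared by a module, which is how two threads sharing one FP unit still get full-width AVX.

#include <immintrin.h>
#include <stdio.h>

int main(void) {
    /* Two vectors of eight single-precision floats each (8 x 32 bits = 256 bits). */
    __m256 a = _mm256_set1_ps(1.5f);
    __m256 b = _mm256_set1_ps(2.0f);

    /* One AVX instruction multiplies all eight lanes at once. */
    __m256 c = _mm256_mul_ps(a, b);

    float out[8];
    _mm256_storeu_ps(out, c);
    printf("lane 0 = %f\n", out[0]);  /* prints 3.000000 */
    return 0;
}

Build it with something like gcc -mavx and that's the whole story as far as software is concerned; how the hardware schedules it underneath is Bulldozer's problem.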
Tuesday, January 25, 2011
Sandy Meh
As expected, Sandy Bridge launched this month, amid a flurry of revelatory announcements, news, and rumors across the tech industry. Its launch was preceded by high expectations, following a string of products in recent years from a company known for delivering on them. In many ways Sandy Bridge continues that legacy, but in other ways it's largely forgettable, certainly the most forgettable generation since the Core 2 first launched.
As a new generation, you expect new features and a new architecture, much like the Core 2 was a complete change in philosophy from the Pentium 4, and the first-gen Core i7 was a complete change in cache hierarchy and the first consumer Intel chip to bring the northbridge on-die. So what does Sandy Bridge bring? An improved IGP and an integrated video encoder. Granted, there's a lot more low-level stuff, but none of it is really revolutionary like the last two generations, and as far as the end user is concerned, there isn't a whole lot of new stuff to be had. Of course, at some point you have to expect the advances to slow down. There were a lot of big-ticket items Intel needed to get out of the way in the move away from the P4, and in the competition against AMD. AMD rarely held the performance crown, but there were plenty of enviable innovations under their heatspreaders, and for the most part Intel has spent the last four years copying those ideas while also making them perform at a benchmark-winning level.
Wednesday, December 15, 2010
Opening a can of...wrist slapping
I put off writing another rambling in anticipation of the upcoming Cayman GPU release, figuring nothing else happening in the technology world was as worth writing about. Well, guess what? It released.
And it's about damn time, too. About a month delayed (plus an NDA lift postponed a few days), we have the 6900 family, a sub-group of the 6000 order catering exclusively to the high-end portion of the market. AMD can hardly be faulted for a comparatively minor delay next to NVIDIA's pitfalls, but when all is said and done, this amounts more to a swift kick to the shin than an ass whoopin'.
Tuesday, November 16, 2010
Bobcat versus the caterpillar
Previews are out for the new Bobcat platform from AMD; technically it's called the Brazos platform, and Bobcat is the architecture powering it. It sorta coincides with the Bulldozer architecture powering the Scorpius platform, although in the real world, Bobcat makes bulldozers, not the other way around.


Monday, October 25, 2010
5870 + 1000 naming points
The next-generation 6000 series is upon us, and it's another case of fact being less interesting than fiction. Rumors had the chip pegged as a re-architected shader setup with a Vec4 layout of more capable and versatile SP units, coupled with a separate shader clock domain a la G80 that would clock the shaders higher than the rest of the chip. Those are pretty interesting ideas, and they make sense in light of being stuck on the previous generation's 40nm manufacturing process: if you can't add transistors, find a way to push the existing ones harder.
Tuesday, August 31, 2010
Tech Updates
Before I commit the entire month of August to headphone rambling, here's a few things in the tech world I sort of skipped over since the last relevant post.
The biggest, I think, is AMD ditching the ATI name. A lot of computer websites, myself included, mostly referred to ATI products as AMD's anyway, but occasionally "ATI" would slip back into consciousness and find its way into our writing whenever the thought arose of who actually produces these products. Well, AMD hopes to dispel any such confusion in the future by erasing any question of the source of its graphics solutions, emphasizing the dissolution of any subdivisions into a unified, singular company. To some, this is a startling reminder of the magnitude of the event that took place now four years ago, one that carries some pretty heavy significance in the history of computing and in the minds of those who follow the industry. Those folks might have foreseen this eventuality from the outset, but the sudden immediacy of its arrival is no less sobering, especially for fans of ATI in years gone by. It seems hard to imagine a Radeon without an ATI, but we must be quick to remind ourselves that it isn't the name that made the Radeon, but the company: the engineers and innovators within who kept an underdog alive and even gave their opposition a bloody nose from time to time. Now more than ever they're on top of their game, and with NVIDIA struggling to flesh out its product line, their clockwork execution and unwavering onslaught have put them in a position few would have thought possible after the stumbling first couple of years following the acquisition.
Thursday, May 6, 2010
AMD Rising
So I'm sitting here with my 5870 churning away, I've gotten rid of the Catalyst Control Center entirely, updated the drivers, noticed some bugs getting fixed, and I'm just thinking, "you know, this card is pretty damn nice." It may not have had much wow-factor for me, and it may lack some features, but the performance is damn good, the quality is damn good, and as I adjust to the new equipment, I'm starting to realize how happy it's making me. AMD's got a good thing going here.
Monday, April 12, 2010
My ATI
The 5870 never really impressed me after its launch. Set with the simple goal of doubling everything in the previous flagship, architecturally it left little for the technology enthusiast to scrutinize, taking what we already knew and simply expanding it. The result was performance bound by the limitations of a graphics pipeline design dating back several years, and not a particularly forward-looking one even then. Certainly some tweaks were made, mainly to caches, registers, and low-level things of that nature, but the overall layout remained, itself strictly a product of a general analysis of current-day graphics engines.
Dampening all expectations was the realization of a meager 40% gain over the 4890, a number that has since ballooned to about a 58% average under a more modern selection of games and drivers. Clearly this was a card that was more a victim of the times than anything else: a ceiling imposed by the limited demands of contemporary software.
Monday, April 5, 2010
Taiwan Semi-(bad)Conduct
NVIDIA has a problem. It's got a bad chip launch on its hands: it's hot and uses a lot of power, it's huge and yields suck, and it hasn't come close to hitting its performance targets. Worst of all, they're stuck with that for a whole year before the option of a die shrink shows up to fix all their woes.
TSMC had been promising that 28nm would be well ready by this year. But as reported by The Inq and in a very good news article by Xbit Labs, it might barely make it out by the end of this year, with GPU-suitable versions only being ready by early next year. Also mentioned in the Xbit article (and here as well), 32nm has been scrapped, so that leaves NVIDIA no alternative but to wait, and thus so must we.
Friday, April 2, 2010
FX Reincarnated?
I held off writing another blog post for a couple of months just waiting for Fermi. I didn't have much else to write about, and as far as tech-related stuff goes, Fermi was the biggest thing on my mind. I'd already said pretty much everything I'd ever want to say about it, and there wouldn't be anything new to comment on until it finally released. (I did think about writing a rambling about Bioshock 2, but I didn't have anything to say that hadn't already been said elsewhere.) Once it released, it was just a matter of setting aside the time to write.
The last six months have been agonizing. I remember the day the 5870 came out. I had WoW running in the background and a webpage sitting in front of me with a NewEgg listing of all the 5870s available, all of them $380. I had to will myself not to pull the trigger on one, because NVIDIA might have something much better right around the corner, and it would be stupid not to at least wait and see. In the tech world that's usually the best policy, but this is one of the few times I'm kicking myself for not indulging in some impulse buying.
Friday, January 8, 2010
Updates, updates...
So CES is this week. Palm launched new stuff. Intel launched new stuff. AMD launched new stuff. More importantly, NVIDIA launched new stuff.
NVIDIA has succeeded again in releasing another SoC that everyone wants. Hopefully they succeed this time at actually delivering it to everyone who wants one. Last time, Tegra's only notable design win was the Zune HD, a largely forgettable media player that...well, everyone largely forgot about shortly after its release. But that was all it had. Earlier, at the start of this blog, I had gushed at the possibilities of its use in smartbooks, only to be disappointed at the close of the year by the absence of said smartbooks. Turns out Mobinnova (and others) were simply waiting for Tegra 2, and for good reason. Packing two out-of-order, dual-issue, FPU-equipped ARM Cortex-A9 cores, it beats the shit out of Tegra 1. Every demo of a tablet (I guess some are calling those "slate PCs" now) or smartbook using Tegra showed a sluggish system. The thing was simply not meant for full-sized computing endeavours, and let's face it, we're not even talking full-sized demands here. But Tegra 2 should have no problem handling any Firefox-browsing aspirations, and hell, even some HD media and gaming on the side. Cooler still, it's built on 40nm. Usually side products like this--chipsets, bridge chips, NVIO, whatever else NVIDIA makes that's not a GPU--get second-class manufacturing, but not this time. I guess it's a sign NVIDIA's really taking this seriously, and if worst comes to worst, I think they're banking on supporting themselves on this little "side product" if at all possible. Apparently they see the mobile SoC market as being worth billions, overshadowing any other market they've ever been in, so it could very well be the next big thing for them. Well, the only other big thing for them aside from GPUs. For now, let's hope Tegra 2 makes it into some kickass products that we can actually buy.
Labels: AMD, ARM, ATI, gaming, graphics cards, Intel, netbooks, NVIDIA, smartbooks, smartphones, tablets, Tegra
Wednesday, November 18, 2009
Graphics is my favorite subject
So the HD 5970 is out. I like the name. No suffixes whatsoever. Simple, clean, elegant, gets the point across. There's a prefix, but that's just to denote the broader range of GPUs it's a part of. Better than NVIDIA's prefixes, which are really suffixes just moved to the front.
I read a few reviews, and obviously the thing pulverizes the competition, but the competition is a dead horse anyway. Something frustrated me about most of the reviews, though: the game selection. It can't be helped, I suppose. Almost all of the games are console ports (some with minor enhancements) that never had any problem running in the first place. What's the point of benching games that absolutely no one would base their purchasing decision on? Nobody's thinking "oh man, I need to start doing research for a card that can play Borderlands." Fucking anything can play Borderlands. $100 cards used to be shit, but now that'll buy you a 9800GT or equivalent. That's like nothing for a video card budget, and we're talking a card just a smidge under the flagship performance of 2006 (which would normally make it pretty old, but not anymore). So yeah, anything north of toilet paper will run Borderlands, or any of the COD games... or Far Cry 2, or L4D2, or Resident Evil 5, or Batman: Arkham Asylum, or whatever the hell else.
Saturday, October 3, 2009
NVIDIA's Fermi
Fermi is the name of the architecture behind NVIDIA's next-gen (DX11) cards. It was announced ahead of any actual card announcements, or even information about gaming features. All that was talked about, in fact, was Tesla-related shit, but despite that I've read all kinds of bullcrap from people jumping to ridiculous conclusions about it.
Once again, this was an announcement for Tesla. Companies looking to make large investments in new servers and HPC systems need a lot of lead time to make decisions, and NVIDIA was trying to appeal to them, as well as to investors and stockholders, by proving that Fermi is real and that there are some really cool things to look forward to in it. AMD released their shit, so now NVIDIA wants to make some sort of response, even if it isn't actual hardware. This was an announcement to gain mindshare, nothing more.
Monday, July 27, 2009
On the precipice of battle, DX11 approaches
Sheesh, last time I complained it had been ten days since the previous rambling. Now here I am, damn near a month later, finally updating again.
It's almost August. Windows 7 has gone gold. In under three months, on October 22 of this year, it'll release to the hungry masses, bringing with it the next DirectX generation, DirectX 11. To coincide with the new API, new graphics card generations will be released, as they always have been. Since Intel is still at least a year away from a hard launch of Larrabee, that leaves two graphics card companies to think about: AMD and NVIDIA. It has been a long time since the last generation first launched, and the GPU designers, the public, and the press are all absolutely aching for the next battle.
Saturday, June 6, 2009
NVIDIA cards are overpriced
I think most people who've been following the industry will take one look at that title and think "duh". I admit I haven't been keeping up with graphics card prices lately, because I've been trying hard not to shop for one, even though the 8800GTX I'm using right now doesn't do Crysis enough justice by my standards, and it's because of that I've yet to beat the game, or buy the standalone expansion to it.
What brought on this observation was an investigative article on Anandtech that came across my feeds the other day about GTX 275 overclocking. I felt it was a relevant read because shader vs. core scaling has been an interesting issue with NVIDIA cards since the G80, and also because I was bored at the time. The article pointed to a preceding piece that investigated the same topic with the 4890, and I decided to look that one over as well since I hadn't been keeping up with that card. I was surprised to find that the 4890 actually stays toe-to-toe with NVIDIA's current fastest single-GPU card, the GTX 285, in the most intensive games (read: the only games that matter to anyone shopping for a new GPU right now). And it seems you can overclock the 4890 higher than the GTX 285, percentage-wise. Well, then I got curious about where they stand price-wise.
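For what it's worth, "percentage-wise" here just means overclocking headroom relative to the stock clock. A trivial sketch of the arithmetic in C, using made-up placeholder clocks rather than the actual figures from either article:

#include <stdio.h>

/* Overclocking headroom as a percentage of the stock clock. */
static double headroom_pct(double stock_mhz, double oc_mhz) {
    return (oc_mhz - stock_mhz) / stock_mhz * 100.0;
}

int main(void) {
    /* Hypothetical clocks for illustration only; see the linked articles for real numbers. */
    printf("Card A: %.1f%% headroom\n", headroom_pct(850.0, 950.0));  /* ~11.8% */
    printf("Card B: %.1f%% headroom\n", headroom_pct(650.0, 700.0));  /* ~7.7% */
    return 0;
}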