Showing posts with label Intel. Show all posts
Wednesday, October 17, 2012
AMD's Nervous Breakdown Pt.2
One of the things I've loved most about AMD over the years is their forward-looking processors and platforms. In 2004 I upgraded my old Athlon XP system to a spiffy new Socket 939 Athlon 64, and was pretty happy with the improved performance. I had no idea how well the platform would pay off for me, though, when dual-core processors started to take off shortly after. By 2007 my PC was aging, but with a simple BIOS update, an Athlon 64 X2 was just a drop-in upgrade away from adding another two years of life to the system. I imagine it was a lot like how people felt when they put an i486 OverDrive chip into their 1992-era PCs.
Friday, November 11, 2011
AMD's Nervous Breakdown
So Bulldozer bombed. The biggest change in CPU architecture for AMD since the K7, and their one true hope for finally making up the miles' worth of lost ground to Intel in the processor performance race. Up in smoke.
Oh, it's a disappointment alright. On paper -- and granted, I'm not a whiz at processor architectures -- it sounded pretty darn good. Sure, two threads had to share a single FP unit inside one of its modules, but that unit could do 256-bit vector operations. The general consensus is that the design of the chip, from a high level, was sound. But it hinged on something very important: clock speed. It was a beefier engine that needed more cycles to keep it fed, and the end product was simply starved of them. Unless you had been following all the leaked benchmarks and performance indicators leading up to its launch, you were shocked. The world was shocked.
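That clock-speed gamble can be sketched with a toy throughput model. To be clear, every IPC and clock number below is made up purely for illustration; these are not measured Bulldozer figures, just the shape of the trade-off:

```python
# Toy model: sustained throughput ~ instructions-per-cycle * clock.
# All IPC and clock values here are hypothetical, for illustration only.

def throughput(ipc: float, clock_ghz: float) -> float:
    """Crude billions-of-instructions-per-second estimate."""
    return ipc * clock_ghz

# An older, narrower core at a modest clock.
narrow = throughput(ipc=1.0, clock_ghz=3.4)

# A wider "beefier engine" gives up a little per-clock efficiency,
# betting on a much higher clock to come out ahead.
wide_at_target_clock = throughput(ipc=0.9, clock_ghz=4.5)
wide_clock_starved = throughput(ipc=0.9, clock_ghz=3.6)

assert wide_at_target_clock > narrow  # on paper, the bet pays off
assert wide_clock_starved < narrow    # starved of cycles, it loses
```

The point is simply that a design trading per-clock efficiency for frequency headroom lives or dies by whether the silicon actually reaches that frequency.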
Tuesday, January 25, 2011
Sandy Meh
As expected, Sandy Bridge launched this month, amid a flurry of revelatory announcements, news, and rumors across the tech industry. Its launch was preceded by great expectations, following a string of recent products from a company known for delivering on them. In many ways Sandy Bridge continues that legacy, but in many ways it's largely forgettable, certainly the most forgettable generation since the Core 2 first launched.
As a new generation, you expect new features and a new architecture, much like the Core 2 was a complete change in philosophy from the Pentium 4, and the first-gen Core i7 was a complete change in cache hierarchy and the first consumer Intel chip to bring the northbridge on-die. So what does Sandy Bridge bring? An improved IGP and an integrated video encoder. Granted, there's a lot more low-level stuff, but none of it is really revolutionary like the last two generations, and as far as the end user is concerned, there isn't a whole lot of new stuff to be had. Of course, at some point you have to expect the advances to slow down. There were a lot of big-ticket items Intel needed to get out of the way in the move away from the P4 and in the competition against AMD. AMD rarely held the performance crown, but there were plenty of enviable innovations under their heatspreaders, and for the most part, Intel has spent the last four years copying those ideas while also making them perform at a benchmark level.
Tuesday, January 11, 2011
ARMed for a Revolution
Back in 2009 I first wrote about ARM in a rambling I titled ARM's Ascension, in which I talked about the rising aspirations and potential of ARM processors in general computing. A lot of what I said still rings true, but a lot of what seemed apparent about the market's future back then has turned out not to be. Smartbooks were prototyped many times but never made it into shipping products. What happened instead was the iPad.
iPad ended up doing exactly what many other Apple products have done in the past. When Apple entered the portable media player market, it flourished. When they entered the smartphone market, it flourished. Now that they've entered the tablet market, or perhaps more accurately, kick-started it, that market is set to flourish too. At the forefront of this emerging form factor is ARM. No matter which SoC your product uses, be it Apple A4, Qualcomm Snapdragon, Samsung Hummingbird, or NVIDIA Tegra 2, ARM lies at the center of it. From the get-go ARM seems to have an iron grip on the market, leaving competitors, namely Intel, with a cliff face of an uphill climb if they want in on it.
Tuesday, November 16, 2010
Bobcat versus the caterpillar
Previews are out for the new Bobcat platform from AMD -- technically the platform is called Brazos, with Bobcat being the architecture powering it. It sort of coincides with the Bulldozer architecture powering the Scorpius platform, although in the real world, Bobcat makes bulldozers, not the other way around.


Tuesday, August 31, 2010
Tech Updates
Before I commit the entire month of August to headphone rambling, here's a few things in the tech world I sort of skipped over since the last relevant post.
The biggest, I think, is AMD ditching the ATI name. Many computer websites, myself included, mostly referred to ATI products as AMD's anyway, but occasionally "ATI" would slip back into consciousness and find its way into our writing whenever the thought arose of who actually produces these products. Well, AMD hopes to dispel any such confusion by erasing any question about the source of their graphics solutions, emphasizing the dissolution of any subdivisions into a unified, singular company. To some, this is a startling reminder of the magnitude of the event that took place four years ago now, one that carries some heavy significance in the history of computing and in the minds of those who follow the industry. Those folks might have foreseen this eventuality from the outset, but the sudden immediacy of its arrival is no less sobering, especially for longtime fans of ATI. It seems hard to imagine a Radeon without an ATI, but we must be quick to remind ourselves that it isn't the name that made the Radeon, but the company: the engineers and innovators within who kept an underdog alive and even gave their opposition a bloody nose from time to time. Now more than ever they're on top of their game, and with NVIDIA struggling to flesh out their product line, their clockwork execution and unwavering onslaught have put them in a position few would have thought possible after the stumbling first couple of years following the acquisition.
Sunday, June 13, 2010
M11x Revisited
I briefly mentioned the M11x once before, when Alienware first announced it, noting it was an interesting product. Then shortly after its debut, NVIDIA launched Optimus, which is the leader of the Automatic power-saving faction fighting the evil forces of the Deceptively power-hungry opposition. What the power savings actually are I'm not too sure, but I think the main point is that it's about twice as convenient as the manual switching of before, and it'll be getting all the driver support going forward. So current owners of manual-switching discrete-graphics laptops are pretty much screwed, and that includes early adopters of the M11x.
It was only natural that people started hoping for an updated M11x that supported the new technology and, while they're at it, the new Core i-series CULV CPUs, maybe with some tweaks to the aesthetics. But of course there was some fear that we'd have to wait until the next wave of products to see such changes, if they ever came at all. It seemed silly to release the M11x as it was, so close to the introduction of Optimus, when surely Alienware must have been informed by NVIDIA ahead of time of its approach; a lot of tech companies will do that to ensure early adoption of new products. It reminds me of a time several years ago when Alienware put a lot of R&D into a custom multi-GPU graphics system, complete with custom motherboards with multiple AGP slots and software hacks, touting it as the return of SLI. Then about a year later NVIDIA announced the actual return of SLI, coinciding with the launch of PCI Express. Alienware just has a history of doing things at just the wrong time.
Thursday, May 6, 2010
AMD Rising
So I'm sitting here with my 5870 churning away. I've gotten rid of the Catalyst Control Panel entirely, updated the drivers, noticed some bugs getting fixed, and I'm just thinking, "you know, this card is pretty damn nice." It may not have had much wow-factor for me, and it may lack some features, but the performance is damn good, the quality is damn good, and as I adjust to the new equipment, I'm starting to realize how happy it's making me. AMD's got a good thing going here.
Friday, January 8, 2010
Updates, updates...
So CES is this week. Palm launched new stuff. Intel launched new stuff. AMD launched new stuff. More importantly, NVIDIA launched new stuff.
NVIDIA has succeeded again in releasing another SoC that everyone wants. Hopefully they succeed this time at actually delivering it to everyone who wants one. Last time around, Tegra's only notable design win was the Zune HD, a largely forgettable media player that...well, everyone largely forgot about shortly after its release. But that was all it had. At the start of this blog I gushed at the possibilities of its use in smartbooks, only to be disappointed at the close of the year by the absence of said smartbooks. Turns out Mobinnova (and others) were simply waiting for Tegra 2, and for good reason. Packing two out-of-order, dual-issue, FPU-enabled ARM Cortex-A9s, it beats the shit out of Tegra 1. Every demo of a tablet (I guess some are calling those "slate PCs" now) or smartbook using Tegra showed a sluggish system. The thing simply wasn't meant for full-sized computing endeavours, and let's face it, we're not even talking full-sized demands here. But Tegra 2 should have no problem handling any Firefox-browsing aspirations, and hell, even some HD media and gaming on the side. Cooler still, it's built on 40nm. Usually side products like this -- chipsets, bridge chips, NVIO, whatever else NVIDIA makes that's not a GPU -- get second-class manufacturing, but not this time. I guess it's a sign NVIDIA is really taking this seriously, and if worst comes to worst, I think they're banking on supporting themselves on this little "side product" if at all possible. Apparently they see the mobile SoC market as being worth billions, overshadowing any other market they've ever been in, so it could very well be the next big thing for them. Well, the only other big thing for them aside from GPUs. For now let's hope Tegra 2 makes it into some kickass products that we can actually buy.
Friday, October 23, 2009
ARM's Ascension
NVIDIA hopes to grow their Tegra business to eventually make up 50% of their revenue. With a win scored in the Zune HD, possible spots in future Nintendo handhelds and Apple products, and countless other media, phone, and computing devices, it's no wonder their expectations are high. SoCs have always been very popular in the ultra-portable scene, and Tegra is among the many leading the way for the future of this technology sector. With hardware-accelerated Flash, video, graphics, and audio support, the capabilities of such SoCs have grown to the point of surpassing the smartphone form factor, encompassing a vast array of devices extending all the way up to notebook-like devices dubbed "smartbooks".
It's for this reason that ARM is becoming better positioned to take the computing world by storm in the near future. With their recent partnership with the newly formed GlobalFoundries manufacturing company, it's clear they intend to increase the capabilities of their chips beyond the scope of what they're best known for today.
Thursday, June 4, 2009
A busy week
Every day when I wake up, I turn on my computer and look at my feeds. There are a modest twelve subscriptions right now, and I find going through those on an average day laborious enough. I do it, though, to stay on top of things, because I've found that if there's one thing I enjoy, it's staying up-to-date with all the tech.
This week however has been an especially difficult one as far as that task goes. Two very major events are going on in fields I'm especially interested in: Computex, and E3. A lot of interesting gadgets and games have been revealed as a result of these two shindigs, but one device that I keep coming back to is this little thing.