Showing posts with label NVIDIA. Show all posts

Sunday, May 22, 2016

Top 10 Most Infamous Graphics Cards of All Time

The world of GPUs has always been a cutthroat one, and while no GPU designer ever intends to release a flop, in the high-pressure environment of this market some mistakes are inevitable. A GPU maker survives by timely launches of competitive graphics solutions, straining to anticipate its rival's intentions and often stretched for time against ever-present problems. It's a complex undertaking: design takes place four or five years before the chip will be powering whatever crop of demanding games currently holds our attention, which means anticipating both the manufacturing capabilities of the day and the state of software standards like Microsoft's DirectX. Then, when launch time comes, you have to be ready with enough chips hitting close enough to performance goals, and feature-complete drivers that are polished and ready to spit out images error-free.

Friday, May 18, 2012

Consoles in the Cloud

I meant to write about cloud gaming a long time ago, all the way back when OnLive first debuted. Back then, most people didn't think it was possible. Gamers who made a living managing networks would argue tirelessly on forums about how it wasn't physically possible: the latency would be too great, the image quality would suck ass, and it wouldn't be cost-effective. To their credit, they were partially right, except for the whole impossible thing.

So OnLive debuted, and it was actually shockingly new technology, enough so to make jaded technology buffs gape in awe. No, the latency wouldn't make twitch gaming very enjoyable, and the image quality was only slightly better than current-gen consoles, with compression artifacts that non-cloud gamers would never encounter. But it worked. You could in fact play games with all the electronic muscle hiding away behind miles of wiring. It was an extremely novel concept, and while it didn't take off with any great zeal, it was a step towards what some might say is the future.
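To put some rough numbers on the latency complaint (these are back-of-the-envelope assumptions, not measurements of OnLive or any other service), here's a sketch of how an input-to-photon budget might stack up for a streamed game:

```python
# Back-of-the-envelope cloud-gaming latency budget.
# Every figure below is an illustrative assumption, not a measured value.
latency_ms = {
    "input to server (network uplink)": 30,
    "server-side simulation + render": 17,   # roughly one 60 fps frame
    "video encode": 10,
    "server to client (network downlink)": 30,
    "video decode": 8,
    "display scanout/refresh": 16,
}

total = sum(latency_ms.values())
for stage, ms in latency_ms.items():
    print(f"{stage:40s} {ms:3d} ms")
print(f"{'total input-to-photon':40s} {total:3d} ms")

# A local machine skips the network and codec stages entirely, so its
# budget is closer to render + display (~35-50 ms). The extra ~80 ms is
# exactly what twitch gamers feel.
```

Under those assumptions the streamed path lands north of 100 ms before you even account for a bad Wi-Fi day, which is why the naysayers were partially right even though the "impossible" part wasn't.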

Why is that though? Why does cloud gaming necessarily have to be the future?

Friday, November 11, 2011

AMD's Nervous Breakdown

So Bulldozer bombed. The biggest change in CPU architecture for AMD since the K7, and their one true hope for finally making up the miles of lost ground to Intel in the processor performance race. Up in smoke.

Oh, it's a disappointment alright. On paper -- and granted, I'm not a whiz at processor architectures -- it sounded pretty darn good. Sure, two threads had to share a single FP unit inside one of its modules, but it could do 256-bit vector operations. The general consensus is that the design of the chip, from a high level, was sound. But it hinged on something very important: clockspeed. It was a beefier engine, it needed more cycles to keep it fed, and the end product was simply starved of those. Unless you had been following all the leaked benchmarks and performance indicators leading up to its launch, you were shocked. The world was shocked.
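To make the clockspeed dependence concrete, here's a rough sketch with entirely made-up IPC and clock figures (not measured numbers for Bulldozer or any real chip) showing how a lower-IPC-per-thread design only breaks even if the clocks climb high enough:

```python
# Rough sketch: single-thread performance ~ IPC * clock.
# All numbers are made-up assumptions for illustration only.

old_ipc, old_clock_ghz = 1.00, 3.4   # assumed baseline core
new_ipc = 0.80                       # assumed lower per-thread IPC from shared module resources

clock_needed = old_ipc * old_clock_ghz / new_ipc
print(f"clock needed just to match the old core: {clock_needed:.2f} GHz")

for shipped_clock in (3.6, 4.2, 4.7):
    relative = (new_ipc * shipped_clock) / (old_ipc * old_clock_ghz)
    print(f"at {shipped_clock:.1f} GHz, the new core does {relative:.0%} of the old one per thread")

# With these assumptions, anything much under ~4.3 GHz is a per-thread
# regression -- the "starved of cycles" problem in a nutshell.
```

The beefier engine and fancier vector units only pay off once the clock target is hit; ship below it and the paper advantage evaporates.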

Saturday, March 12, 2011

iPad redux

When the first iPad came about, I, like much of the internet media and public, rejected the idea as derivative and superfluous. I didn't see the market for an oversized iPod Touch, and didn't see Apple putting much thought into the design other than to say "it's magic". Well, I, like many others, was dead wrong, and the thing sold like crack candy. I should know better than to doubt the success of an Apple-branded consumer device. But really, for as little effort as Apple put into it, they touched on the desire for tablet computers way before anyone else, and because they were Apple, people flocked to it, eager to get a hold of the new form factor. Sometimes good timing is all you need.

So in the wake of an onslaught of competing tablets bursting at the seams for release, we have the iPad 2, destined to continue the success of the original, again with very little effort. It's the first iPad, but with a thinner enclosure, faster hardware, and the cameras finally glued in place. In a nod to Moore's Law, it debuts at the same price as its predecessor, and from that perspective you can say they're at least not gouging people on it. But the reality is Apple's up to their old tricks of arrogance with a pinch of innovation, except in the case of the tablet market, their only selling point is that they came first. That helps them win the popularity contest, but things are going to become a great deal more cutthroat in a short amount of time.

Tuesday, January 11, 2011

ARMed for a Revolution

Back in 2009 I first wrote about ARM in a rambling I titled ARM's Ascension, in which I talked about the rising aspirations and potential of ARM processors in the general computing field. A lot of what I said still rings true, but a lot of what seemed apparent about the market's future back then has since turned out not to be true. Smartbooks were prototyped many times, but never made it into shipping products. Instead what happened was the iPad.

The iPad ended up doing exactly what many other Apple products have done in the past. When Apple entered the portable media player market, it flourished. When they entered the smartphone market, it flourished. Now that they've entered the tablet market, or maybe better said, created the tablet market, that market is set to flourish also. At the forefront of this new emerging form factor is ARM. No matter what SoC your product is using, be it Apple A4, Qualcomm Snapdragon, Samsung Hummingbird, or NVIDIA Tegra 2, ARM lies at the center of it. From the get-go, it seems ARM has an iron grip on the market, leaving competitors, namely Intel, with a cliff face of an uphill climb if they want in on it.

Wednesday, December 15, 2010

Opening a can of...wrist slapping

I put off writing another rambling in anticipation of the upcoming Cayman GPU release, figuring nothing else happening in the technology world was as worth writing about. Well, guess what? It released.

And it's about damn time too. About a month delayed (plus an NDA lift postponed by a few days), we have the 6900 family, a sub-group of the 6000 order catering exclusively to the high-end portion of the market. AMD can hardly be faulted for a comparatively minor delay next to NVIDIA's pitfalls, but when all is said and done, this amounts more to a swift kick to the shin than an ass whoopin'.

Monday, October 25, 2010

5870 + 1000 naming points

The next-generation 6000 series is upon us, and it's another case of fact being less interesting than fiction. Rumors had the chip pegged as a re-architected shader setup with a Vec4 layout consisting of more capable and versatile SP units, coupled with a separate shader clock domain a la G80 that would run the shaders faster than the rest of the chip. They're pretty interesting ideas that make sense in light of being stuck on the previous generation's 40nm manufacturing process. If you can't add transistors, find a way to push the existing ones harder.
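Here's the quick arithmetic behind that appeal, using the 5870's published shader count and clock as the baseline and a purely hypothetical hot clock for the shader domain (the rumored design never materialized, so the second figure is an assumption):

```python
# Peak shader throughput = stream processors * 2 FLOPs (MADD) * clock.
# 1600 SPs at 850 MHz matches the 5870; the 1.5 GHz hot clock is hypothetical.

def peak_gflops(stream_processors, clock_ghz, flops_per_sp_per_cycle=2):
    return stream_processors * flops_per_sp_per_cycle * clock_ghz

single_clock_domain = peak_gflops(1600, 0.850)   # whole chip on one clock
hot_clocked_shaders = peak_gflops(1600, 1.500)   # shaders on their own faster domain

print(f"single clock domain: {single_clock_domain:6.0f} GFLOPS")
print(f"hot-clocked shaders: {hot_clocked_shaders:6.0f} GFLOPS")
# Same transistor budget, roughly 75% more theoretical throughput -- the
# whole point of pushing existing transistors harder.
```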

Tuesday, August 31, 2010

Tech Updates

Before I commit the entire month of August to headphone rambling, here are a few things in the tech world I sort of skipped over since the last relevant post.

The biggest, I think, is AMD ditching the ATI name. A lot of computer websites, myself included, mostly referred to ATI products as AMD anyway, but occasionally "ATI" would slip back into consciousness and find its way into our writings whenever the thought arose about who actually produces these products. Well, AMD hopes to dispel any such confusion by erasing any question about the source of their graphics solutions, emphasizing the dissolution of any subdivisions into a unified, singular company. To some, this is a startling reminder of the magnitude of the event that took place now four years ago, one that carries some pretty heavy significance in the history of computing, and in the minds of those who follow the industry. Those folks might have foreseen this eventuality from the outset, but the sudden immediacy of its arrival is no less sobering, especially for fans of ATI in years gone by. It seems hard to imagine a Radeon without an ATI, but we must be quick to remind ourselves that it isn't the name that made the Radeon, but the company -- the engineers and innovators within who kept an underdog alive, and even gave their opposition a bloody nose from time to time. Now more than ever they're on top of their game, and with NVIDIA struggling to flesh out their product line, their clockwork execution and unwavering onslaught have put them in a position few would have thought possible after the stumbling first couple of years following the acquisition.

Sunday, June 13, 2010

M11x Revisited

I had briefly mentioned the M11x once before when Alienware first announced it, noting it was an interesting product. Then, shortly after its debut, NVIDIA launched Optimus, which is the leader of the Automatic power-saving faction fighting the evil forces of the Deceptively power-hungry opposition. What the power savings actually are I'm not too sure, but I think the main point is that it's about twice as convenient as the manual switching types from before, and it'll be getting all the driver support going forward. So current owners of laptops with manually switched discrete graphics are pretty much screwed, and that includes early adopters of the M11x.

It was only natural that people started hoping for an updated M11x that supported the new technology, and while they were at it, one updated to the new Core i-series CULV CPUs and maybe given some tweaks to the aesthetics. But of course there was some fear that we'd have to wait until the next wave of products to see such changes, if they ever came at all. It seemed silly to release the M11x as it was, so close to the introduction of Optimus technology, when surely NVIDIA must have informed them of its approach ahead of time. A lot of tech companies will do that to ensure early adoption of new products. It reminds me of a time several years ago when Alienware put a lot of R&D into making a custom multi-GPU graphics system, complete with custom motherboards with multiple AGP slots and software hacks, touting it as the return of SLI. Then about a year later NVIDIA announced the actual return of SLI, coinciding with the launch of PCI Express. Alienware just has a history of doing things at the wrong time.

Thursday, May 6, 2010

AMD Rising

So I'm sitting here with my 5870 churning away, I've gotten rid of the Catalyst Control Panel entirely, updated the drivers, noticed some bugs getting fixed, and I'm just thinking, "you know, this card is pretty damn nice." It may not have had much wow-factor for me, and it may lack some features, but the performance is damn good, the quality is damn good, and as I start to adjust to the new equipment, I'm starting to realize how happy it's making me. AMD's got a good thing going here.

Despite a loss of market share, AMD is finally in the black again, thanks to the completion of their GlobalFoundries spinoff. They've just launched their 10xxT series, finally competitive with Intel's more enthusiast-range Core i-series CPUs. Their new Phenom II X4 and X3 chips for mobile platforms have just been announced in a multitude of Dell and HP notebooks, for the first time pulling them out of the Turion slump of the last several years. All this while they're steadily gaining market share in the world of graphics, with the only top-to-bottom DX11 lineup available and the only mobile DX11 solutions in existence.

Monday, April 12, 2010

My ATI

Since its launch, the 5870 has never really impressed me. Set with the simple goal of doubling everything in the previous flagship, architecturally it left little for the technology enthusiast to scrutinize, taking what we already knew and simply expanding it. This resulted in performance bound by limitations instilled in the design of a graphics pipeline dating back several years, and not a particularly forward-looking one even then. Certainly some tweaks were made, mainly to do with caches and registers and low-level things of that nature, but the overall layout remained, itself strictly a product of a general analysis of current-day graphics engines.

Dashing all expectations was the realization of a meager 40% gain over the 4890, a number that has since ballooned to about a 58% average under a more modern selection of games and drivers. Clearly this was a card that was more a victim of the times than anything else, its ceiling imposed by the limited demands of contemporary software.
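As a quick illustration of how the "average gain" depends on what you benchmark, here are some hypothetical per-game speedups (none of these are real measurements) that produce exactly that kind of drift:

```python
# Hypothetical 5870-vs-4890 per-game speedups, chosen only to illustrate
# how the benchmark suite moves the average. Not real data.
launch_era_suite = [1.25, 1.30, 1.45, 1.50, 1.50]   # lightweight, engine-bound ports
later_era_suite  = [1.45, 1.55, 1.60, 1.65, 1.65]   # more GPU-bound titles

def average_gain(speedups):
    return sum(speedups) / len(speedups) - 1.0

print(f"launch-era suite: +{average_gain(launch_era_suite):.0%}")
print(f"later-era suite:  +{average_gain(later_era_suite):.0%}")
# Same silicon either way; only the software ceiling moved.
```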

Monday, April 5, 2010

Taiwan Semi-(bad)Conduct

NVIDIA has a problem. It's got a bad chip launch on its hands: it's hot and uses a lot of power, it's huge and yields suck, and it hasn't nearly hit performance targets. Worst of all, they're stuck with that for a whole year before the option for a die shrink shows up to fix all their woes.

TSMC had been promising that 28nm would be well ready by this year. But as reported by The Inq and a very good news article by Xbit Labs, it might barely make it out by the end of this year, with GPU-suitable versions only being ready by early next year. As also mentioned in the Xbit article (and here as well), 32nm has been scrapped, which leaves NVIDIA no alternative but to wait, and thus so must we.

Friday, April 2, 2010

FX Reincarnated?

I held off writing another blog post for a couple of months just waiting for Fermi. I didn't have much else to write about, and as far as tech-related stuff goes, Fermi was the biggest thing on my mind. I've already said pretty much everything else I'd ever want to say about it, and there wouldn't be anything new to comment on until it finally released. (I did think about writing a rambling about BioShock 2, but I didn't have anything to say that hadn't already been said elsewhere.) Since its release, it was just a matter of setting aside the time to do it.

The last six months have been agonizing. I remember the day the 5870 came out. I had WoW running in the background and a webpage sitting there in front of me with a NewEgg listing of all the 5870s available, all of them $380. I had to will myself not to pull the trigger on one, because NVIDIA might have something much better right around the corner, and it might be stupid not to at least wait and see. In the tech world that's usually the best policy, but this is one of the few times I'm kicking myself for not indulging in some impulse buying.

Thursday, January 28, 2010

iPass

There's a lot to love about the tablet concept. People are moving towards smaller computers, and that means desktops are getting replaced by notebooks. And since notebooks are hot, bulky, and lose their charge real quick, they must be replaced with something that's easier to tote around. Netbooks are much easier to carry, and have good battery life, but you still need a place to put them when in use. Then you have smartphones, but generally they're too small to get any real work done.

So then the concept of the tablet comes in. Let me first say that I'm not talking about those laptops with swiveling touchscreens...hell no. I'm talking about the convergence of the strengths of smartphones and laptops into one device that's as easy to carry around the house as it is to carry around the world. Anand describes the idea well. It's a Star Trek-like device (as he puts it) built for a totally new and emerging usage model. Like him, it's the sort of thing I've been waiting a long time for. So when Apple announced the iPad, my interest was piqued.

Friday, January 8, 2010

Updates, updates...

So CES is this week. Palm launched new stuff. Intel launched new stuff. AMD launched new stuff. More importantly, NVIDIA launched new stuff.

NVIDIA has succeeded again in releasing another SoC that everyone wants. Hopefully they succeed this time at actually delivering it to everyone who wants one. Last time, Tegra's only notable design win was the Zune HD, a largely forgettable media player that...well, everyone largely forgot about shortly after its release. But that was all it had. Earlier, at the start of this blog, I had gushed at the possibilities of its use in smartbooks, only to be disappointed at the close of the year by the absence of said smartbooks. It turns out Mobinnova (and others) were simply waiting for Tegra 2, and for good reason. Packing two out-of-order, dual-issue, FPU-enabled ARM Cortex-A9s, it beats the shit out of Tegra 1. Every demo of a tablet (I guess some are calling those "slate PCs" now) or smartbook using Tegra showed a sluggish-running system. The thing was simply not meant for full-sized computing endeavours, and let's face it, we're not even talking full-sized demands here. But Tegra 2 should have no problem handling any Firefox-browsing aspirations, and hell, even some HD media and gaming on the side. Cooler still, it's built on 40nm. Usually side products like this -- chipsets, bridge chips, NVIO, whatever else NVIDIA makes that's not a GPU -- get second-class manufacturing, but not this time. I guess it's a sign NVIDIA's really taking this seriously, and if worst comes to worst, I think they're banking on supporting themselves on this little "side product" if at all possible. Apparently they see the mobile SoC market as being worth billions, overshadowing any other market they've ever been in, so it could very well be the next big thing for them. Well, the only other big thing for them aside from GPUs. For now let's hope Tegra 2 makes it into some kickass products that we can actually buy.

Wednesday, November 18, 2009

Graphics is my favorite subject

So the HD 5970 is out. I like the name. No suffixes whatsoever. Simple, clean, elegant, gets the point across. There's a prefix, but that's just to denote the broader range of GPUs it's a part of. Better than NVIDIA's prefixes, which are really suffixes just moved to the front.

I read a few reviews, and obviously the thing pulverizes the competition, but the competition is a dead horse anyway. Something frustrated me about most of the reviews though: the game selection. It can't be helped, I suppose. Almost all of them are console ports (some with minor enhancements) that never had any problem running in the first place. What's the point of benching those games if absolutely no one would base their purchasing decision on them? Nobody's thinking "oh man, I need to start doing research for a card that can play Borderlands". Fucking anything can play Borderlands. $100 cards used to be shit, but now that'll buy you a 9800GT or equivalent. That's like nothing for a video card budget, and we're talking a card just a smidge under the flagship performance of 2006 (which would normally make it pretty old, but not anymore). So yeah, anything north of toilet paper will run Borderlands, or any of the COD games... or Far Cry 2, or L4D2, or Resident Evil 5, or Batman: Arkham Asylum, or whatever the hell else.

Friday, October 23, 2009

ARM's Ascension

NVIDIA hopes to grow their Tegra business to eventually make up 50% of their revenue. By scoring a win with the Zune HD, possibly ending up in future Nintendo handhelds and Apple products, and countless other media, phone, and computing devices, it's no wonder their expectations might be high. SoCs have always been very popular in the ultra-portable scene, and Tegra is among the many leading the way for the future of this technology sector. With hardware-accelerated Flash, video, graphics, and audio support, the capabilities of such SoCs have grown to the point of outgrowing the smartphone form factor, encompassing a vast array of devices extending all the way up to notebook-like machines dubbed "smartbooks".

It's for this reason that ARM is becoming better positioned to take the computing world by storm in the near future. With their recent partnership with the newly formed GlobalFoundries manufacturing company, it's clear they intend to push the capabilities of their chips beyond the scope of what they're best known for today.

Saturday, October 3, 2009

NVIDIA's Fermi

Fermi is the name of the architecture behind NVIDIA's next-gen (DX11) cards. It was announced ahead of any actual card announcements, or even just information about gaming features. All that was talked about, in fact, was Tesla-related shit, but despite that I've read all kinds of bullcrap from people jumping to ridiculous conclusions about it.

Once again, this was an announcement for Tesla. Companies looking to make large investments in new servers and HPC systems need a lot of lead time to make decisions, and NVIDIA was trying to appeal to them, as well as to investors and stockholders, proving that Fermi is real and that there are some really cool things to look forward to about it. AMD released their shit, so now NVIDIA wants to make some sort of response, even if it isn't actual hardware. This was an announcement to gain mindshare, nothing more.

Monday, September 21, 2009

NVIDIA PhysX

PhysX has always been met with skepticism from all sides of the gaming industry. From its roots as a $250 add-in card to its current incarnation as the forefront of general purpose GPU processing for use in games, the value proposition has been one that few have been able to grasp despite all the promises and possibilities.

Monday, July 27, 2009

On the precipice of battle, DX11 approaches

Sheesh, last time I complained it had been ten days since the previous rambling. Now here I am, damn near a month later, finally updating again.

It's almost August. Windows 7 has gone gold. In under three months, on October 22 of this year, it'll release to the hungry masses, bringing with it the next DirectX generation, DirectX 11. To correspond with the new API update, new graphics card generations will be released, as they always have been. Since Intel is still at least a year away from a hard launch of Larrabee, that leaves two graphics card companies to think about: AMD and NVIDIA. It has been a long time since the last generation first launched. GPU designers, the public, and the press alike are absolutely aching for the next battle.