Wednesday, December 15, 2010

Opening a can of...wrist slapping

I put off writing another rambling in anticipation of the upcoming Cayman GPU release, figuring nothing else happening in the technology world was as worth writing about. Well, guess what? It released.

And it's about damn time, too. About a month delayed (plus an NDA lift postponed a few days), we have the 6900 family, a sub-group of the 6000 order tailored exclusively to the high-end portion of the market. AMD can hardly be faulted for a comparatively minor delay next to NVIDIA's pitfalls, but when all is said and done, this amounts more to a swift kick to the shin than an ass whoopin'.

Tuesday, November 16, 2010

Bobcat versus the caterpillar

Previews are out for the new Bobcat platform from AMD; technically it's called the Brazos platform, with Bobcat being the architecture powering it. It sorta coincides with the Bulldozer architecture powering the Scorpius platform, although in the real world, Bobcat makes bulldozers, not the other way around.

Monday, October 25, 2010

5870 + 1000 naming points

The next-generation 6000 series is upon us, and it's another case of fact being less interesting than fiction. Rumors had the chip pegged as having a re-architected shader setup with a Vec4 layout of more capable and versatile SP units, coupled with a separate clock domain a la G80 that would run the shaders faster than the rest of the chip. Those are pretty interesting ideas that make sense in light of being stuck on the previous generation's 40nm manufacturing process: if you can't add transistors, find a way to push the existing ones harder.
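To put some toy numbers behind that idea, here's a back-of-the-envelope sketch of why a narrower, symmetric shader layout could squeeze more work out of the same transistors. Everything here is invented for illustration: the ILP distribution, the slot rules, all of it; real shader compilers and hardware are vastly more complicated.

```python
import random

random.seed(1)

# Hypothetical distribution of how many independent ops a shader
# compiler can pack into one bundle (instruction-level parallelism).
# These weights are made up purely for illustration.
ilp_samples = [random.choice([2, 3, 3, 4, 4, 4, 5]) for _ in range(100_000)]

def slot_utilization(width, samples):
    # Each bundle issues min(width, ilp) ops in one cycle;
    # any leftover slots that cycle do no work.
    issued = sum(min(width, ilp) for ilp in samples)
    return issued / (width * len(samples))

print(f"5-wide VLIW slot utilization: {slot_utilization(5, ilp_samples):.0%}")
print(f"4-wide Vec4 slot utilization: {slot_utilization(4, ilp_samples):.0%}")
```

With this made-up mix, the 4-wide layout wastes far fewer slots per cycle, and a separate, faster shader clock would be the lever that turns that efficiency back into raw throughput.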

Tuesday, August 31, 2010

Tech Updates

Before I commit the entire month of August to headphone rambling, here are a few things in the tech world I've sort of skipped over since the last relevant post.

The biggest, I think, is AMD ditching the ATI name. A lot of computer websites, myself included, mostly referred to ATI products as AMD's anyway, but occasionally "ATI" would slip back into consciousness and find its way into our writings whenever the thought arose about who actually produces these products. Well, AMD hopes to dispel any such confusion in the future by erasing any question of the source of their graphics solutions, emphasizing the dissolution of any subdivisions into a unified, singular company. To some, this is a startling reminder of the magnitude of the event that took place now four years ago, one that carries some pretty heavy significance in the history of computing and in the minds of those who follow the industry. Those folks might have foreseen this eventuality from the outset, but the sudden immediacy of its arrival is no less sobering, especially for fans of ATI in years gone by.

It seems hard to imagine a Radeon without an ATI, but we must be quick to remind ourselves that it isn't the name that made the Radeon, but the company: the engineers and innovators within that kept an underdog alive, and even gave their opposition a bloody nose from time to time. Now more than ever they're on top of their game, and with NVIDIA struggling to flesh out their product line, their clockwork execution and unwavering onslaught have put them in a position few would have thought possible after the stumbling first couple of years following the acquisition.

Tuesday, August 24, 2010

Iso-orthodynamagnaplanarstats

One of the great things about headphones and any audio endeavor is the human ear. Well, the human perception of sound, I should say. A human's aural memory is quite fickle and volatile, especially when it comes to minute details and differences. You can barely remember what you heard one minute to the next, which can make comparisons between audio equipment a pain in the ass, but it has benefits when you have a really good audio setup. When I bought a new LCD, going from a TN panel type to an IPS type, I could immediately tell the difference, and it was striking. I could easily recall how the old screen looked, and even given the amount of time it took to set up the new screen, I had no trouble distinguishing all the improvements it made. This same ability to remember all things visual also meant that as time went on, a short amount of time actually, I got used to the IPS screen, and it was no longer breathtaking to look at. The brilliant colors, contrast, and image stability just became the norm, and the thrill was gone before a month had passed.

With audio, you can't remember shit. In the time it takes you to unplug one pair of headphones and plug in the next, which is less than a minute, you've already forgotten enough of the sound's little characteristics that you can't pinpoint the differences between the two, unless it truly is night-and-day (and we're talking a vast expanse between them). This is in sharp contrast to other animals, like birds, which use acute audio memory to discern the subtle differences of their mate's call from all the other birds in the flock (and in the case of domestic birds, sometimes the ability to imitate sounds with high accuracy). For this reason, every time you play music through your speakers, you're rediscovering them, and even after a long time they can still shock and amaze you, and thrill you all over again. It's one of the things that makes audio hobbies so much fun, and why they have such a following, while computer monitors and TVs...don't.

Wednesday, August 11, 2010

They can take me anywhere I want

Of the many earthly possessions I fill my life with to inject some level of enjoyment into it, headphones stand out as one of the most expensive, and possibly most controversial. Early on, headphones were something my dad and I shared when we'd make long trips away from home, visiting relatives for what would be weeks of pushing on our threshold of boredom. We'd find a moment to slip out from their company and make a stop by a department store of some sort, which substituted at the time for a fun getaway. Often we found ourselves looking at headphones, sometimes out of a need for a new set or just as a passing interest. We'd pick up a pair or two and bring them home, eagerly anticipating what fruits our discoveries might bring. Armed with portable CD players (remember those?), we'd compare and contrast between old and new, different models, different styles. "This has more bass," I'd comment, "but that has more loudness." My father might concur, adding, "you can hear the guitar strings better with that one." It would be a fun moment sitting on the couch, trading opinions, enjoying music, and bonding. Unfortunately my father and I don't share a lot of interests, and such occasions are dear in my memory.

So it seemed only natural that my interest in headphones would grow beyond those sparse excursions away from home into something all its own: a fascination with the quality of audio derived from headsets of unknown characteristics, a sort of treasure hunt within the vast array of choices hanging from the walls and racks of various retailers. Those were all fairly inexpensive though, and it wasn't long before I found myself unsatisfied with what was available at those prices. I wanted to know what lay beyond, what you could really achieve with the headphone form factor. That inevitably led me to the internet, where searches uncovered a vast wealth of brands and price ranges, and the huge communities that had sprung up around them. I soon came to know the name Sennheiser, and before long I spent some hard-saved money on a model costing nearly $100. Quite an expensive initial entry into the world of boutique audio headgear, considering my then-fledgling enthusiasm. Upon their arrival, I knew instantly there was more to headphones than I had ever known, and my fascination was greatly spurred.

Sunday, June 13, 2010

M11x Revisited

I had briefly mentioned the M11x once before when Alienware first announced it, noting it was an interesting product. Then shortly after its debut, NVIDIA launched Optimus, the leader of the Automatic power-saving faction fighting the evil forces of the Deceptively power-hungry opposition. What the power savings actually are I'm not too sure, but I think the main point is that it's about twice as convenient as the manual switching schemes from before, and it'll be getting all the driver support going forward. So current owners of manually switched discrete-graphics laptops are pretty much screwed, and that includes early adopters of the M11x.

It was only natural that people started hoping for an updated M11x that supported the new technology, and while they're at it, one updated to the new Core i-series CULV CPUs and maybe given some tweaks to the aesthetics. But of course there was some fear that we'd have to wait until the next wave of products to see such changes, if they ever came at all. It seemed silly to release the M11x as it was, so close to the introduction of Optimus, when NVIDIA surely must have informed them ahead of time of its approach; a lot of tech companies will do that to ensure early adoption of new products. It reminds me of a time several years ago when Alienware put a lot of R&D into making a custom multi-GPU graphics system, complete with custom motherboards and software hacks, touting it as the return of SLI. Then about a year later NVIDIA announced the actual return of SLI, coinciding with the launch of PCI Express. Alienware just has a history of doing things at exactly the wrong time.

Tuesday, June 1, 2010

Those IPS Reports

Remember when LCDs first came out? They all had like 16ms response times, they all topped out at 1280x1024, they all cost like $500...and they looked like crap. That was when they first started hitting it big, sort of like where solid-state drives are right now, and for some people it was enough that they were flat and didn't take up a quarter of your desk or produce as much heat as your tower. For most people, though, the fact that you couldn't (or perhaps just shouldn't) change resolutions, that there was a crap-ton of ghosting, and that the colors and viewing angles sucked, all for like twice the price of a more capable CRT, was enough to deter any thoughts of early adoption.

Funnily enough, most of those disadvantages are still there, but LCDs have caught on nonetheless. Manufacturers must have seen the potential, because despite the lack of initial demand they continued investing in and advancing the technology, and eventually we got to a point where you could have a high-res LCD for as much as a CRT or less (right before CRTs completely disappeared from the face of the planet). The contrast, colors, and response time were good enough that people were satisfied once they factored in the other advantages LCDs have. And at least they finally found an aspect ratio that makes sense.

Thursday, May 6, 2010

AMD Rising

So I'm sitting here with my 5870 churning away. I've gotten rid of the Catalyst Control Panel entirely, updated the drivers, noticed some bugs getting fixed, and I'm just thinking, "you know, this card is pretty damn nice." It may not have had much wow-factor for me, and it may lack some features, but the performance is damn good, the quality is damn good, and as I adjust to the new equipment, I'm starting to realize how happy it's making me. AMD's got a good thing going here.

Despite a loss of market share, AMD is finally in the black again, thanks to the completion of their GlobalFoundries spinoff. They've just launched their 10xxT series, finally competitive with Intel's more enthusiast-range Core i-series CPUs. Their new mobile Phenom II X4 and X3 chips have just been announced in a multitude of Dell and HP notebooks, for the first time pulling them out of the Turion slump of the last several years. All this while they're steadily gaining market share in the world of graphics, with the only top-to-bottom DX11 lineup available and the only mobile DX11 solutions in existence.

Monday, April 12, 2010

My ATI

Ever since its launch, the 5870 never really impressed me. Set with the simple goal of doubling everything in the previous flagship, architecturally it left little for the technology enthusiast to scrutinize, taking what we already knew and just expanding it. The result was performance bound by the limitations of a graphics pipeline design dating back several years, and not a particularly forward-looking one even then. Certainly some tweaks were made, mainly to do with caches and registers and low-level things of that nature, but the overall layout remained, itself strictly a product of a general analysis of current-day graphics engines.

What withered all expectations was the realization of a meager 40% gain over the 4890, a number that has since ballooned to about a 58% average under a more modern selection of games and drivers. Clearly this was a card that was more a victim of the times than anything else, its ceiling imposed by the limited demands of contemporary software.
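As an aside, here's roughly how an "average gain" figure like that gets computed from a pile of per-game benchmarks: average the per-game framerate ratios, typically with a geometric mean. The framerates below are hypothetical stand-ins, not real review numbers.

```python
from math import prod

# Hypothetical per-game framerates for two cards; these are made-up
# numbers standing in for a real benchmark suite.
fps_4890 = {"Game A": 45.0, "Game B": 60.0, "Game C": 38.0, "Game D": 52.0}
fps_5870 = {"Game A": 70.0, "Game B": 95.0, "Game C": 61.0, "Game D": 82.0}

# Per-game speedup ratios of the new card over the old one.
ratios = [fps_5870[game] / fps_4890[game] for game in fps_4890]

# The geometric mean is the usual way to average ratios, since it
# weights each game's relative gain equally.
geomean = prod(ratios) ** (1 / len(ratios))
print(f"average gain: {geomean - 1:.0%}")  # -> average gain: 58%
```

An arithmetic mean of the ratios would work too, but it lets a single outlier title drag the figure around, which is why reviewers tend to prefer the geometric mean.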

Monday, April 5, 2010

Taiwan Semi-(bad)Conduct

NVIDIA has a problem. It's got a bad chip launch on its hands: the chip is hot and uses a lot of power, it's huge and yields suck, and it hasn't come close to its performance targets. Worst of all, they're stuck with it for a whole year before the option of a die shrink shows up to fix all their woes.

TSMC had been promising that 28nm would be good and ready by this year. But as reported by The Inq and in a very good news article by Xbit Labs, they might barely get it out by the end of this year, with GPU-suitable versions only ready by early next year. As also mentioned in the Xbit article (and here as well), 32nm has been scrapped, so that leaves NVIDIA no alternative but to wait, and thus so must we.

Friday, April 2, 2010

FX Reincarnated?

I held off writing another blog post for a couple of months just waiting for Fermi. I didn't have much else to write about, and as far as tech-related stuff goes, Fermi was the biggest thing on my mind. I'd already said pretty much everything I'd ever want to say about it, and there wouldn't be anything new to comment on until it finally released. (I did think about writing a rambling on BioShock 2, but I didn't have anything to say about it that hadn't already been said elsewhere.) Once it released, it was just a matter of setting aside the time to do it.

The last six months have been agonizing. I remember the day the 5870 came out. I had WoW running in the background and a webpage sitting in front of me with a NewEgg listing of all the 5870s available, all of them $380. I had to will myself not to pull the trigger on one, because NVIDIA might have something much better right around the corner, and it might be stupid not to at least wait and see. In the tech world that's usually the best policy, but this is one of the few times I'm kicking myself for not indulging in some impulse buying.

Thursday, January 28, 2010

iPass

There's a lot to love about the tablet concept. People are moving towards smaller computers, and that means desktops are getting replaced by notebooks. And since notebooks are hot, bulky, and lose their charge real quick, they must in turn be replaced with something that's easier to tote around. Netbooks are much easier to carry and have good battery life, but you still need a place to put them when in use. Then you have smartphones, but generally they're too small to get any real work done.

So then the concept of the tablet comes in. Let me first say that I'm not talking about those laptops with swiveling touchscreens...hell no. I'm talking about the convergence of the strengths of smartphones and laptops into one device that's as easy to carry around the house as it is to carry around the world. Anand describes the idea well. It's a Star Trek-like device (as he puts it) built for a totally new and emerging usage model. Like him, I've been waiting a long time for this sort of thing. So when Apple announced the iPad, my interest was piqued.

Friday, January 8, 2010

Updates, updates...

So CES is this week. Palm launched new stuff. Intel launched new stuff. AMD launched new stuff. More importantly, NVIDIA launched new stuff.

NVIDIA has succeeded again in releasing another SoC that everyone wants. Hopefully they succeed this time at actually delivering it to everyone who wants one. Last time, Tegra's only notable design win was the Zune HD, a largely forgettable media player that...well, everyone largely forgot about shortly after its release. But that was all it had. Back at the start of this blog I had gushed about the possibilities of its use in smartbooks, only to be disappointed at the close of the year by the absence of said smartbooks.

Turns out Mobinnova (and others) were simply waiting for Tegra 2, and for good reason. Packing two out-of-order, dual-issue, FPU-enabled ARM Cortex A9s, it beats the shit out of Tegra 1. Every demo of a tablet (I guess some are calling those "slate PCs" now) or smartbook using Tegra showed a sluggish-running system. The thing was simply not meant for full-sized computing endeavours, and let's face it, we're not even talking full-sized demands here. But Tegra 2 should have no problem handling any Firefox-browsing aspirations, and hell, even some HD media and gaming on the side.

Cooler still, it's built on 40nm. Usually side products like this--chipsets, bridge chips, NVIO, whatever else NVIDIA makes that's not a GPU--get second-class manufacturing, but not this time. I guess it's a sign NVIDIA's really taking this seriously, and if worst comes to worst, I think they're banking on supporting themselves on this little "side product" if at all possible. Apparently they see the mobile SoC market as being worth billions, overshadowing any other market they've ever been in, so it could very well be the next big thing for them. Well, the only other big thing for them aside from GPUs. For now let's hope Tegra 2 makes it into some kickass products that we can actually buy.