The Original Logitech G15 Gaming Keyboard (2005)
Don't get me wrong, I'm a fan of Logitech's newest G15 gaming keyboard.
It's the original version that left something to be desired: the black paint on the back-lit keys quickly wore off after only moderate use. The only reason you can read the keys with white letters in this picture is that they're covered by stickers I bought on eBay, which block the illuminated effect that sold me on the product in the first place. If Logitech sold replacement keys, I would have bought them, but it's the company's policy not to. While it's true that Logitech sent out replacements to anyone who complained early on, I bought later, missed the RMA window, and was left with a shoddy-looking jumble of illegible keys.
ATI Radeon HD 2900 XT (2007)
Unfortunately, Nvidia's GeForce FX 5800 wasn't the only graphics card to fall flat on its face. In 2007, ATI struggled with its Radeon HD 2900 XT. (Ed.: I'd even argue that there were more epic failures than these two choices. Remember S3's Savage 2000 with its broken T&L engine?)
ATI's flagship DirectX 10 card was slower than Nvidia's GeForce 8800 GTX, and it even lost to the more mid-range GeForce 8800 GT. To make matters worse, the Radeon was power-hungry and loud, too.
ATI managed to tweak its VLIW architecture into better products that emerged as the Radeon HD 3800 family, but the Radeon HD 2900 XT is still remembered as a letdown by graphics card enthusiasts.
Ageia PhysX Card (2006)
When the Ageia PhysX hardware accelerator card launched in 2006, it sold for $300. After paying that painfully high price, buyers got a handful of pretty weak physics effects in an even smaller line-up of games. Fair enough, right? You bought the card thinking it'd be even more useful in the future.
Nvidia bought Ageia in 2008. By 2009, the newest PhysX drivers disabled the technology if a Radeon graphics card was detected. In other words, your $300 Ageia PhysX card was useless unless it ran next to a GeForce board. To add insult to injury, Nvidia's own GPUs were already accelerating the middleware SDK by that point. All of this went down right around the time the first interesting PhysX-enabled titles were emerging, too (Batman: Arkham Asylum and Mirror's Edge).
In the end, everyone who paid for the original physics processing unit was hung out to dry. Nvidia dropped driver support for boards based on Ageia's PPU entirely in 2010.
AMD Phenom CPU (2007)
The first Phenom CPU promised to be the first x86 quad-core chip built on a single die, but it performed poorly compared to Intel's Core 2 at launch. Its introduction was also marred by a translation lookaside buffer (TLB) bug discovered right after launch that could cause a crash under certain conditions. Most motherboard vendors offered a BIOS workaround to circumvent the issue, but it imposed up to a 10% performance hit.
AMD fixed the bug in hardware with its B3 stepping, but the design still never managed to challenge Intel's Core 2-based chips. AMD didn't have a competitive product until the Phenom II family arrived in early 2009, and the original Phenoms were quickly phased out.
Killer Gaming Network Cards (2009)
Killer's line of gamer-oriented network cards boasts impressive specifications: a Freescale-based network processing unit, on-board RAM, and a software suite responsible for extensive customization of network settings. As far as our testing has shown, though, none of that improves actual game performance.
At best, the company's network management software can be commended for prioritizing gaming traffic over traffic from other applications, which may improve latency if you're competing online while feeding a peer-to-peer network. But wouldn't it make more sense to just pause those downloads before you fire up your favorite shooter? That sure would have beaten paying almost $300 for Killer's technology when it first emerged.
AMD's FX CPU (2011)
That's the Zambezi-based FX, not the Sledgehammer-based one, which we actually rocked back in the day.
We waited a long time for AMD's next-generation CPU architecture, and, on paper, it looked great. Unfortunately, in its initial incarnation, the expected performance isn't there, power consumption is super-high, and pricing isn't even all that compelling.
AMD managed to improve some aspects of the new CPU over the Phenom II generation. The buzz from Microsoft is that Windows 8 will handle the Bulldozer module concept more elegantly, and AMD promises that Bulldozer's follow-on, code-named Piledriver, will introduce a number of fixes. For now, though, we can't help but think this isn't how AMD's architects envisioned things going down.