AMD Releasing Radeon HD 6950 1GB for $279?

Although AMD has yet to issue an official press release, X-bit Labs claims that the company plans to release a Radeon HD 6950 1 GB graphics card for $279 here in the States next month. The card will be AMD's answer to Nvidia's GeForce GTX 560 Ti, which is set to make its debut next week at a similar price point.

Backing up X-bit's claim, another report last week indicated that Hightech Information System (HIS) would be one of the first manufacturers to launch a Radeon HD 6950 1 GB graphics card. Adding "Fan Edition" to the name, the discrete card will offer a (stock) Cayman GPU clocked at 800 MHz and 5000 MHz GDDR5 memory. It will also come with CrossFireX and Eyefinity support, two DVI ports, one HDMI port, and two mini-DisplayPort connectors. Launch date and pricing were not provided; however, the card was speculated to cost below $300 USD.

Gigabyte and Sapphire have also revealed their cards ahead of schedule (thanks to HIS, it seems). The Gigabyte model, codenamed GV-R695OC-1GD, will sport an Ultra Durable design (2 oz copper PCB, solid capacitors), the three-fan WindForce 3X cooler, and a GPU overclocked to 870 MHz. PowerColor and XFX are also lined up to offer their versions when the Radeon HD 6950 1 GB supposedly goes official in mid-February.

Given that we have a couple of weeks before AMD makes the Radeon HD 6950 1 GB card official, the company may decide to lower the price tag even more. As it stands now, the upcoming 1 GB model will be only $20 cheaper than the 2 GB model already on the market ($299). Although consumers would save a few Hamiltons, would two extra $10 bills in your pocket be worth the performance hit from halving the memory? Probably not, especially for high-resolution PC gaming.

  • alextheblue
    Yeah but if it gets down to the $250 mark it might be worth it. We'll see what happens.
    Reply
  • sseyler
    alextheblue: Yeah but if it gets down to the $250 mark it might be worth it. We'll see what happens.
    True. The HD 5870 with a custom cooler for low $200s is the best deal right now.
    Reply
  • liquidsnake718
    Is this similar to the 5850 or 5750 in performance?
    Reply
  • aznguy0028
    liquidsnake718: this is similar to the 5850 or 5750 in performance?
    It's faster. The 6870's performance was between the 5850 and 5870.
    Reply
  • JohnnyLucky
    Hmmm....I wonder how reducing the amount of memory will affect performance.
    Reply
  • 4745454b
    Not much, which is why the 1GB card is supposed to sell for only $20 less. Most of us are still on 1680x1050 or 1920x1080 (or 1200) screens, so the missing 1GB of VRAM means nothing to us. I'd only worry about having 2GB of VRAM if you are on an Eyefinity-type setup. Then it will matter. The 5870 for low to mid $200s is a good deal, but those should be drying up.

    I'm not sure I would have gone this route. I probably would have just price-adjusted the 6870, unless its performance is that far behind the GTX 560's.
    Reply
  • sykozis
    With games using more and more VRAM, that 1GB card is going to suffer in the not-so-distant future. World of Warcraft can already use upwards of 850 MB of VRAM according to MSI Afterburner, and I would expect this to increase as graphics updates are released. 1GB of VRAM is quickly becoming a minimum; anything less really limits graphics detail, especially at higher resolutions (1920x1080 and above).
    Reply
  • joelmartinez
    As said previously, if this card hits $250 we'll have a good one. As it stands, a 6950 2GB would be a better choice, unless you're gaming below 1920x1080.
    Reply
  • DXRick
    Given that the 5870 2G version is $100 more than the 1G version, and the 6950 is selling for $300 on Newegg, we can extrapolate the price of the 6950 1G at around $229 (factoring the coefficient of the price difference and multiplying that by Pi).
    Reply
  • rohitbaran
    Well, I read someplace else that it would be Radeon 6970 that would be launched with 1 GB RAM and at a price tag of $279. It seemed too good to be true. Looks like it was too good to be true.
    Reply