GTX 295 Preview

Heeyaheeya-wooohaheeya... it's unexpected preview time, ladies and gentlemen. It's been a while since ATI introduced their Radeon 4870 X2 graphics card, and well... NVIDIA just didn't have a product around to counter that steamy X2.

Today is the day that we can lift a big chunk of the mystery that you guys have heard about in the rumor channel for a while now: the GeForce GTX 295. As usual, the rumors were pretty accurate. The GeForce GTX 295 will be an all-new dual-GPU graphics card from NVIDIA based on the 55nm GeForce GTX 200 derivatives.

Let's browse to the next page, but not before I show you a glimpse of what we are testing today...

Nvidia really hasn't done much about the Radeon threat in 2008, save updating the GeForce GTX 260 to compete more aggressively against AMD's 1 GB HD 4870. According to the company, all GTX 260s will employ the 216-shader-processor configuration in the future. The 192-SP card will disappear as the channel exhausts the remaining inventory. Given a similar price point, Nvidia expects that the new GTX 260 standard will be enough to overcome the appeal of AMD's current value offering.

But more pertinent to today's piece, Nvidia wants its single-card performance crown back and has invested substantial effort in making sure that happens. What we have here is a preview of hardware Nvidia plans to launch during CES. The GeForce GTX 295 is Nvidia's answer to AMD's Radeon HD 4870 X2, and it employs a construction similar to the company's GeForce 9800 GX2.
  1. I was just gonna post :d you got me into it :D
    It's a very good card; it performs better than the HD 4870 X2 in most games (but IMO the difference isn't very much).

    Here is another Preview too
  2. I'm thinking that after dining on steak for months with a 4870 X2, people are most likely too full for a sandwich this late at dinner.

    Just as I expected: nothing new or great, hurried, slapped together to fend off the inevitable. Going smaller and using multiple dies, isn't that ATI's claim to fame? At least it's another option for competition on the high end, which is always welcome.
  3. meh!

    After my experiences with quad SLI... no thank you :D
  4. Better late than never, right jaydee? I mean, they have shrunk the die, and it does perform better... I'm sure this will cause a lot of price drops [thank god].
  5. That sounds awfully biased there, jaydee. I don't think one X2 solution is any wiser than the other. I know someone is going to spout "I never had trouble with mine," but by and large that isn't the norm.
    It's a stupid idea lol. I don't like any X2 solution I have had my hands on from either company. I lucked into a 4870 X2 shortly after release, got it cheap because he couldn't power it, played with it for a few days, then sold it for a meager profit. Fast, yes; hot, yes; impressed, kind of. I just don't like the implementations of them right now.
  6. L1qu1d said:

    After my experiences with quad SLI... no thank you :D

    This is where ATI deserves much due credit: they do a good job supporting quad CrossFire. Nvidia's quad-SLI support was s**t. Abandoning that support for 9800 GX2 owners will hurt them in trying to sell these, even IF they got it right.
  7. Don't get me wrong, dual-GPU cards are amazing, but when adding more than one, most setups only get around a 10-20% increase.

    I still think tri 280s and dual 4870 X2s will top this card in quad.

    I would buy a dual-GPU card to save room, or if I had a single-slot motherboard :)

    No more SLI for me. And yes, you're right, ATI overcame the immature drivers much quicker than the 9800 GX2 did, for instance.

    But as I saw from the previews, the 295 GTX isn't much faster than one 280 GTX (and definitely not much faster than the 4870 X2). Its min frame rates still had some terrible drops, which is a pattern that I mostly see in dual-GPU cards, and not in SLI for some reason.
  8. Releases like this are good for everybody; they bring high-end prices down a bit toward the mainstream.
  9. Never say never, L1quid. Multi-GPU/SLI may not be ready for prime time now, but it is the wave of the future, with DirectX 11 and other improvements coming.
  10. Exactly^

    Judging by my quick calculations, in a perfect world where more games scale like COD 4 and 5,

    We have

    295 GTX @ 90 fps
    280 GTX @ 60 fps

    Now again, in a perfect world, let's say scaling is 100%,

    so two 295 GTXs will do 180 fps
    and 3 280 GTXs will do 180 fps

    OFC, give or take, given how both quad and tri actually scale...

    Really, this card won't be a fair trade-off for a number of users.

    I mean, someone with an 8800 GT SLI setup is already pretty much faster than one 280 GTX...

    Again, a card that just isn't bringing to the table what it's supposed to... way too late... this card should've been out with the 4870 X2.

    Releasing it about 6 months later and expecting praise is stupid! Unless it was single-GPU; then I'd be quick on that, even though it would mean going from tri to pretty much dual power... at least it's one GPU.

    It looks like whoever invested in the 4870 X2... it's still paying off.
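    The perfect-scaling arithmetic above can be sketched quickly. A minimal sketch, assuming the poster's hypothetical 90 fps and 60 fps figures (these are not measured numbers):

    ```python
    def scaled_fps(base_fps, num_cards, scaling=1.0):
        """Estimate multi-card frame rate, where each added card
        contributes `scaling` x the base frame rate (1.0 = perfect scaling)."""
        return base_fps * (1 + (num_cards - 1) * scaling)

    # Hypothetical perfect-world figures from the comment above:
    two_295s = scaled_fps(90, 2)        # 180.0 fps at perfect scaling
    three_280s = scaled_fps(60, 3)      # 180.0 fps at perfect scaling

    # Real quad/tri setups scale well below 100%; e.g. an assumed 70%:
    realistic = scaled_fps(60, 3, 0.7)  # 144.0 fps
    ```

    The point the comment makes survives the arithmetic: under any scaling assumption short of perfect, the quad and tri configurations land in the same ballpark.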
  11. It's a joke, fellas. I personally don't like the sandwich design, and it's a nightmare for WC.
    I'd like to point here
    for some open and honest previewing. Love this site, and I like their attitude towards benching overall. The minimum fps for this card sucks; all due to drivers, I'm hoping.
    Great to have some competition; it means prices will be good for us, and I'm sure ATI has gotten their money's worth with the X2, so it'll come down in price.
  12. Wake me up when a new single-GPU card is out :D
  13. You guys remember the tri-SLI/quadfire review Tom's did when the i7 came out? Previously, adding the 3rd card more often than not decreased performance... The new CPU allows tri-SLI to scale beyond anything before it. I think it will be the same thing for two of these in quad SLI: you will need an i7 to really let it stretch its legs.

    Link to the multi-GPU i7 article.
  14. Time will tell. It depends on how well nVidia has optimized the drivers for this card.
  15. ATI/AMD were never truly leading; that was an illusion dreamt up by the AMD fankids. The GeForce GTX 295 absolutely CRUSHES!!!! AMD's weak 4870 X2 by quite some margin. This new GeForce GTX 295 will go quite nicely with my Core i7 system.

    AMD continues on this path of being a laughing stock by releasing the Phenom II, which doesn't seem to be able to keep pace with a Q6600. AMD Spider = sucky combination.
  16. Rofl, bring in the fans :D

    Give it up; it's 6 months late with pretty much the same frames and more heat exhaust :D

    This card will go nicely with your A/C... believe me, it will need it :D
  17. troll stroll?
  18. Sheesh, how can you not be impressed with the preliminary numbers rolling out for the Phenom II? I haven't even seen the head-to-heads and I've already ordered a board lol
  19. Be an open-minded fanboy at least.

    There is a mainstream market too, you know. Not everyone can afford to buy a CPU + video card instead of a car :D

    Now tell me, what does this card actually crush other than your wallet? It's pretty much the same as the 4870 X2.

    I'll agree with jaydee: the sandwich of 2 PCBs is really lazy; at least put some effort into it :)

    Good day, troll :D

    Not directed @ jaydee and roofus :) Just in case you don't get that either :D
  20. Yeah, I hate the sandwich idea. Some guys have great success with them, but look in any tech forum and the bad experiences outweigh the good.
  21. I wonder if those performed as well as Intel's double cheeseburger.
  22. jaydee, thanks for the link you gave. I have a question: why is the average FPS of the GTX 295 high in that link, but the min FPS sometimes too low, even lower than its competitors'?
  23. It looks to be just drivers, as this is a beta being used for this card.
  24. Meh, not interested in multiple GPUs until microstuttering is COMPLETELY solved. Until then, I'm more interested in single-GPU performance increases.

    It will be nice to see the GTX 280 drop in price. It would be nice to see a 55nm GTX 280, too.
  25. If you've followed the drivers and their improvements on all X2 cards, ATI or nVidia, the min fps suffers the most. My guess is that unlike a true SLI/CF scenario, with it being a little faster (communication between cores), it causes some problems in CPU-GPU coordination. So I think the drivers have to be tweaked even more on these cards. It's part of the AFR setup, the way it's done, plus other problems.
  26. I don't agree that SLI/CF are the future, though. I think the design the HD 4870X2 used is, but with a wider/faster sideport that's actually used, and both GPUs working together much more closely. SLI/CF on two boards, whether in one slot or not will never come close to the scaling that one board + massive bandwidth can provide (estimated), and solving micro-stuttering lies in the GPUs working together much more closely.
  27. We'll have to see how DX11 and other architecture changes work out in the future. Multi is the future, just like with CPUs; it's the current implementation that's showing its age, and it has to change.
  28. The chart didn't seem to have the power requirements for the GTX 295 :/

    What I am really curious about is: do you think I will be able to safely run it on my power supply? I have an OCZ 600W StealthXStream (4x 18A rails). I am currently running a Q9450 (2.66GHz quad-core), 8GB RAM, and a GTX 260 Core 216... plus 3 HDDs and 1 optical drive.

    This card comes out just days before my step-up program expires, and I really want to upgrade, so long as my PSU will run it safely! Thoughts?
  29. 18 amps per rail is cutting it close. I've seen somewhere that the usage is 289 watts, or 20 more than the 4870 X2; not sure though, as it's unofficial.
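    A back-of-the-envelope sketch of that rail math; the 289 W figure is the unofficial number mentioned above, and the 75 W slot budget is the standard PCIe slot limit, so treat the split as an assumption, not a spec:

    ```python
    def rail_watts(amps, volts=12.0):
        """Power available on one 12 V rail."""
        return amps * volts

    per_rail = rail_watts(18)    # OCZ StealthXStream: 4 rails x 18 A = 216 W each
    total_12v = 4 * per_rail     # 864 W combined ceiling on paper (real limit is lower)

    # Rumored GTX 295 draw from this thread (unofficial): ~289 W, split between
    # the PCIe slot (up to 75 W) and two auxiliary power connectors.
    card_draw = 289
    aux_draw = card_draw - 75    # ~214 W left for the two connectors
    per_connector = aux_draw / 2 # ~107 W per connector path
    ```

    With roughly 107 W per connector, each 18 A rail has headroom on paper; what actually gets tight on a 600 W unit is the combined 12 V capacity once the quad-core, RAM, and three drives are added.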
  30. My pair of 280's arrives tomorrow. Since this news, I'm returning them both to Newegg unopened. I can wait a few more weeks and pay a few more dollars to get a couple of 295's instead. With my new i7 OC'ed to 4.5GHz, I think it can handle a pair of these cards.

    Wish I would have known earlier this week when I ordered, but I guess that answers the question of why the 280's were so cheap, on sale and with a rebate.
  31. Dude, keep the 280's. Don't be a beta tester on this one. Nvidia has a lot of improving to do on this specific platform.
  32. roofus said:
    Dude, keep the 280's. Don't be a beta tester on this one. Nvidia has a lot of improving to do on this specific platform.

    Judging by those numbers, if beta 295's are already taking the 4870 X2's lunch money, I can live with that, as they can only improve with time.
  33. I'm really impressed with how much power the die shrink saved for the 200 series. I think this was another case of something that was meant for 55nm but forced into 65nm (like the 2900, should have been 55nm, but forced into 80nm). I don't really like the dual card, but if I was buying right now I think I'd have to pick the 55nm 260 over the 4800 (but, I'm quite happy with my 4850. The only card purchase I may do anytime soon is a second 4850).
  34. If they do launch this for $499 USD, it's a very good buy.
  35. This card is good only because it will drive down the price of 4870/50 X2s, so they will be an even better deal. So it takes the lead; that's good, but it's not like you'll complain about, or probably even notice, 90 fps vs 110 fps.
  36. Oh come on, seriously, did you guys think the 4870 X2 would remain the highest-performing card till eternity? Obviously Nvidia has released a card which is faster than the 4870 X2; otherwise it wouldn't make sense if it was slower... but what the Tom's article doesn't say clearly is that the 4870 X2 has dropped to around $460 after MIR on Newegg at this very moment... so it's no longer ~$500.
  37. I wonder why Nvidia is going to sell this for $499 USD. Could it be they lost some respect in the 4870 X2 vs. 280 battle and are looking to score some brownie points?
  38. Ha, I'll believe $499 when I see it. What is the cost right now of 2xGTX260?
  39. Regular 260's are $220.00, so $440. 216's are $249, so $498. Seems like the same performance for the same price, but it allows quad SLI, which they stopped supporting? =x

    I'm afraid to buy a new GPU right now; I'm building my i7 with just 1 GTX 260 and not buying anything for a month or so until everything is figured out.
  40. I'm more curious to see what the 280 GTX will go down to...?
  41. What do you think ATI's next move is? As far as this new GTX 295 goes, it looks good, and I am sure NVIDIA will produce more 55nm parts, but more mainstream, as they have done before. This is to show who has the top guns :o Looking at what ATI has done so far would lead me to believe they would do a die shrink: 2000's to 3000's, die shrink; 3000's to 4000's, beef the meat up.
  42. And this is new to you why? LOL :P
  43. LOL because I am 10
  44. The sandwich idea is rather lazy?! Now it's literally internal SLI; they stuffed that connector in it lawls
  45. Well, the sandwich is easier, quicker, and cheaper. Less innovation means less effort, time, and cost :D

    2 times to make sure ;) The Nvidia Way!
  46. L1qu1d said:
    Well, the sandwich is easier, quicker, and cheaper. Less innovation means less effort, time, and cost :D

    2 times to make sure ;) The Nvidia Way!

    Good point, but then how much more efficient is it compared to a two-slot configuration?
  47. L1qu1d said:
    Well, the sandwich is easier, quicker, and cheaper. Less innovation means less effort, time, and cost :D

    2 times to make sure ;) The Nvidia Way!

    don't even talk to me about the nvidia way
  48. It can go shoot itself, "the nvidia way," as a statement