
Quad SLI 295 GTX: the most powerful setup available

January 8, 2009 4:07:43 PM

I think he was excited about getting to do the review =P
(I would have been too=)
What a great result though, some impressive numbers.
January 8, 2009 4:11:58 PM

Yup, but very unnecessary numbers ATM... Crysis doesn't use more than 3 GPUs, so we'll see nothing new there. As for every other game (well, most), you won't need anything more than one 4870 or 260 GTX (depending on the resolution, even lower).

My laptop is running a video card almost as good as a 9600 GT and I can max out most games. There's a handful that need some tweaking :)

But if ppl like big e-penises :p  :D 


And yes, he was very excited... I would be too :D
January 8, 2009 4:14:20 PM

Thanks for the link L1qu1d
January 8, 2009 4:22:57 PM

No Problemo:) 
January 8, 2009 5:32:02 PM

It's just too expensive, L1qu1d. Well, it is in the UK: £392, which is about $595. God, I hate it when Nvidia are on top, they just take the piss. On a side note, you can pick up the 4850 X2 for £259 and the 4870 X2 for £339, and if what I've been hearing is true they have put up the price of the 260 since the release of their Big Bang drivers, and to me that is just unacceptable. I also find the price of the 4870 X2 unacceptable.
January 8, 2009 5:45:37 PM

Good point, rangers. I was rooting for ATI ATM as well. Now Nvidia prices might go up again, just like they did when the 260 GTX+ was introduced... the 280 GTX went from $400 here to $449.99...

I said it before and I'll say it again: dual-GPU cards don't impress me anymore. After having a mouthful of problems with my old 9800 GX2s (I hope I never see them again), I'm not going to switch to duals ever. I think two 260s would be smoother than one 295 GTX, just like two 4870s will be smoother than one 4870 X2.

But that's my opinion. Quad setups just can't see the light yet.

I'd say 3 GPUs max... let's see what the 280 revisions bring.


I think everyone will be pleased when the HD 5000s and the 300 GTXs are out with DX11.

But we won't be impressed again, because the dual-GPU cards ruined it for us... we already know what high power is, and it's just going to be the 280 GTX vs the 9800 GX2 again for both companies :)

All this applies to my MAIN computer. On my secondary I don't really care about minimum framerates; it's just a test computer :) and a LAN party computer.
January 8, 2009 5:51:16 PM

I might be selling my soul to the devil next week for this.

Anyway, to be more realistic, I can't help but ask: they (not Guru3D) said that a single 295 is a pair of cards with performance between a 260-216 and a 280 on a smaller die size. Will they sell it as a single?
January 8, 2009 5:56:18 PM

You shouldn't have to pay more than £299 for a top-end card, but we live in bizarro world where greed is the norm.
January 8, 2009 7:06:58 PM

wh3resmycar said:
I might be selling my soul to the devil next week for this.

Anyway, to be more realistic, I can't help but ask: they (not Guru3D) said that a single 295 is a pair of cards with performance between a 260-216 and a 280 on a smaller die size. Will they sell it as a single?


I doubt it. The GTX 295 uses the full 240-core GPU (that the GTX 280 uses) and combines it with a 448-bit bus (that the GTX 260 uses) in order to cut down on the PCB cost. But there will be 55nm GTX 280s and GTX 260s coming.

rangers said:
You shouldn't have to pay more than £299 for a top-end card, but we live in bizarro world where greed is the norm.


On the plus side, this could start another pricing war; ATI will probably drop the cost of the HD 4870 X2 to be more attractive, and Nvidia may follow suit.
January 8, 2009 7:43:57 PM

You say that about all the drivers hahaha... it won't push it above the 295 GTX, unfortunately, since it's almost two 280s put together... though I'm surprised that the 280s actually did significantly better sometimes.
January 8, 2009 8:03:18 PM

Did you read my link?
January 8, 2009 8:28:57 PM

A 4870 being better than a 280 GTX... which is a flagship single GPU?

I'll wait for some reliable sources... I remember the gains the 8.11 drivers were supposed to bring, and they brought almost nothing, yet these are supposed to bring us about a 22% increase in Crysis, an Nvidia game...

I say no... I think this time I won't get my hopes up about these drivers like I did with the 8.12 betas for my 4870 X2.
January 8, 2009 8:45:44 PM

If you followed those gains, you'd see they were only in a few games, just like BB2 was, and as Anand showed, the BB2 drivers were actually worse in other games, outside the selected five. Look for these drivers, as the X2 will love them.
January 8, 2009 8:48:14 PM

I did, and the 4870 X2 comes out on top in some of the tests; the 260 X2 comes out on top in most.
January 8, 2009 8:53:55 PM

Anyways, this is the site here: http://64.233.187.100/translate_c?hl=en&sl=fr&tl=en&u=h...
Realise that it's a cat-and-mouse game for these cards: as one makes a move, the other counters. There is no domination here. If people are so excited about a simple driver release, why aren't they excited about HW? Like DX10.1 capabilities? No driver can change that, only HW. Drivers can be rewritten; HW you need to start all over for. Also, some power control is coming in future drivers as well. Just a heads up is all. And also, pointing out that the 295, which is loud and expensive, has come too late to the party and doesn't perform exceptionally well over its competition, but it is a compelling answer for those that only see green.
January 8, 2009 8:56:46 PM

Like I said, when I see AnandTech or any other credible site do a review, I'll start getting my hopes up. If I learned anything from the past couple of years with both Nvidia and ATI, it's hope for a lot but expect little :)

Anyways, I'm off to play some PS3, bye bye :D
January 8, 2009 9:00:45 PM

At a $100 difference, the card doesn't make me want to go and grab it. If it totally dominated and had clear wins by good margins, then yes, it's a go, but it isn't like that; it's a better G216 vs a 4870 is all, and it'll win a few more than it loses, but nothing earthshaking. I just wish nVidia had taken its time to do it even better and get away from their sandwich card design. It's an OK card, but too expensive for its performance.
January 8, 2009 9:14:00 PM

Yeah, you can really tell that the 295 was just thrown together to retain the performance crown. Not that it isn't a good card, it's just too much hardware mashed together to be affordable. I wonder what NVidia will do for their next real series? They now respect ATI after the 4000s, so will they plan dual GPU from the start, or will they assume an even more humongous chip will beat the 5000s? Should be interesting at least.
January 8, 2009 9:33:09 PM

Rumors have it that it's a larger chip design. That isn't a total failure of a design approach, it just makes it hard to do it ATI's way, and if that's better for a halo product, I'm afraid we will continue to see this as nVidia's approach to its halo product.
January 8, 2009 9:39:10 PM

Well, the large chip design is good as long as they use a small manufacturing process. The GTX 280 should have been 55nm from the start. Looks like they learned that lesson though, as they are going straight to 40nm.
January 8, 2009 10:12:12 PM

I think I read somewhere that their next card is going to be DX11, and I don't think Nvidia will pull it off. They couldn't even get a proper DX10 card out; Microsoft had to relax its DX spec. I think you can chalk the next one up for ATI.
January 8, 2009 10:25:29 PM

I already have DX11 on my computer :D lol, just no card ahaha :p
January 8, 2009 10:33:30 PM

Not heard from you for some time, L1qu1d. What's my nemesis been up to then (world domination)? LOL
January 8, 2009 10:37:43 PM

It'll be interesting, as I'm sure ATI hasn't just included their tessellation unit without testing it some, and they may have an advantage there as well. Programmable shaders, and their effect on the design, may prove to bring changes for both, or give one an advantage.
January 8, 2009 10:55:54 PM

rangers said:
Not heard from you for some time, L1qu1d. What's my nemesis been up to then (world domination)? LOL



Busting my ass at school and playing as much as I can in between :) My birthday is coming up in 12 days... and it's sad to say it's about the only thing to look forward to :)

I leave my global domination to Pinky and the Brain :)
January 8, 2009 10:59:13 PM

I detest the multi-GPU cards (no discrimination, I hate them equally). CF or SLI I am OK with, so I guess I am a hypocrite. It just seems that if you have trouble, you only risk losing a single card and still have a PC, not to mention they are much easier to cool.
January 8, 2009 10:59:24 PM

If DX11 stops Nvidia sacrificing image quality for FPS I'll be a happy man. That is one of the benefits of DX11, am I right? Think I came across it somewhere.
January 8, 2009 11:02:47 PM

Link?
January 8, 2009 11:06:21 PM

Had words with someone else on here, it wasn't the same. Came away feeling dirty, it felt like I was cheating, LOL
January 8, 2009 11:08:23 PM

Don't ask, JayDee, can't remember.
January 8, 2009 11:09:35 PM

rangers said:
If DX11 stops Nvidia sacrificing image quality for FPS I'll be a happy man. That is one of the benefits of DX11, am I right? Think I came across it somewhere.


During the 5, 6 and 7 series this was a valid complaint. With the 8 series Nvidia actually pulled ahead of ATI in terms of image quality. Right now there's probably no noticeable difference; I haven't seen any recent reviews complaining about either, at least.
January 8, 2009 11:10:30 PM

lol :) Well, I was skiing in Quebec and Vermont for a month :)

But to your whole quality thing, yes, I remember Nvidia pulling it with the 7 series... and they cheated in 3DMark06 with the 8 series, but now they are even according to reviews... people couldn't tell the difference other than brightness, contrast, etc. But I think AnandTech did a review about this and it came out even for both companies :)

Honestly, I am happy with PS3 graphics... I think I'm getting less picky when it comes to graphics ahahah :)

Right now I'm actually on the fence between Phenom II and Intel i7 :) I think CPUs are now more important than quad SLI/CrossFire cards :D
January 8, 2009 11:52:27 PM

Look, the way I see it, Nvidia will never, NEVER play fair. They have to have the fastest card at all costs, and it will take Microsoft to stop it. You buy a high-end card so you can have all the eye candy on; you don't need Nvidia messing with your enjoyment.
January 9, 2009 12:07:09 AM

Both are over a year old :) lol

"This certainly seems like more than just an accidental driver issue." - techpower

Like I said, I heard something about the 8 series here and there... but it was only the drivers at that time, nothing after :) and it was only 1 or 2 drivers.

Let's not forget the 2900 XT, which really wasn't a choice to go with anyways at the time... so really this didn't hurt anyone :)

January 9, 2009 12:34:26 AM

I would think that ATI would already move on to DDR6, lol. They are always one DDR ahead and one frame behind :p
January 9, 2009 12:42:59 AM

Does DDR6 exist?
January 9, 2009 1:11:29 AM

Not that I know of :p
January 9, 2009 1:15:29 AM

rangers said:
Look, the way I see it, Nvidia will never, NEVER play fair. They have to have the fastest card at all costs, and it will take Microsoft to stop it. You buy a high-end card so you can have all the eye candy on; you don't need Nvidia messing with your enjoyment.


ATI has done the same thing in the past; I remember that ATI added some "optimizations" to help Quake 3 performance for the 8500 at the cost of image quality. All the reviewers had to do was rename Quake3.exe to Quack3.exe, and the image quality would go back to normal and it would run slower. Both companies have done this to an extent; Nvidia has just gotten kind of a reputation for it since the 5 series.

L1qu1d said:
I would think that ATI would already move on to DDR6, lol. They are always one DDR ahead and one frame behind :p


GDDR5 will probably last a while; there's already GDDR5 that can reach above 4GHz effective, so it doesn't look like we'll hit the ceiling anytime soon.

Edit: Actually they can get it up to 7GHz so far - http://www.reghardware.co.uk/2008/11/25/hynix_7ghz_gddr...

kelfen said:
does DDR6 exist?


Nope, I don't think they've even come up with the specification for GDDR6 yet. Also keep in mind that the specifications for GDDR are a little different from DDR; DDR4 doesn't exist yet but GDDR4 does, so the generations don't really match up.
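
To put those effective data rates in perspective, here's a rough back-of-the-envelope calc (just a sketch in Python; the 256-bit bus and the two clocks below are assumed example figures, not specs from any particular card):

# Rough peak-bandwidth sketch: effective transfer rate x bus width.
# Example/assumed numbers only, not pulled from a specific card.
def bandwidth_gb_s(effective_rate_ghz, bus_width_bits):
    # GHz = billions of transfers per second; bus_width_bits / 8 = bytes per transfer
    return effective_rate_ghz * (bus_width_bits / 8)

print(bandwidth_gb_s(4.0, 256))  # 128.0 GB/s for 4GHz-effective GDDR5 on a 256-bit bus
print(bandwidth_gb_s(7.0, 256))  # 224.0 GB/s if that 7GHz memory shipped on the same bus

So even without a wider bus, going from 4GHz to 7GHz effective is a huge jump in bandwidth headroom.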
January 9, 2009 1:18:02 AM

I'm just curious; there is a GDDR4, I remember, because one of the X1900s uses it. I wonder why ATI chose to skip it afterwards, and why Nvidia didn't bother with even one model; they just went to GDDR5 (well, for the next 300 series).
January 9, 2009 1:25:24 AM

ATI found the returns just weren't there, and nVidia couldn't use it, as there was too much signal noise from their wires, and it would have created a lot of changes.
January 9, 2009 1:39:15 AM

I'm not entirely sure, but GDDR4 didn't offer that many advantages over GDDR3. I remember reading somewhere that GDDR4 offered little to no benefit on Nvidia's design, meanwhile ATI's cards could take better advantage of it.

After some quick googling: http://www.maximumpc.com/article/features/the_evolution...

Quote:
GDDR4’s 8-bit burst length might be one reason Nvidia ultimately passed on this type of memory: Nvidia’s processors support only 4-bit burst lengths.


So Nvidia couldn't really use it even if they wanted to. ATI probably stopped using it because it wasn't really worth the extra cost over GDDR3, plus they had the option to use the superior GDDR5.
January 9, 2009 2:27:56 AM

Makes sense. Let's hope GDDR5 will be used fully soon :)

January 9, 2009 3:11:33 AM

JAYDEEJOHN said:
http://translate.google.com/translate?hl=en&u=http%3A%2...
Hmmm, those Cat 9.1 drivers look pretty good


Speaking of the drivers, does CCC have "multi-core CPU support" (or was it optimization)? Because I remember I saw that option back when I was on a GeForce.

Sorry, didn't read the link, I'm in a hurry.