Nvidia GeForce 7950GX2 Launches

June 5, 2006 7:49:37 PM

Nvidia takes what it created in the form of the GeForce 7900GX2 and shrinks it down for the rest of the world. What do you think of Nvidia's latest creation?

Discuss the article and this topic here.

http://www.tomshardware.com/2006/06/05/geforce_7950_gx2...
June 5, 2006 7:50:55 PM

Wow, let's add another thread to the two already posted!
June 5, 2006 7:51:16 PM

First of all, this has been posted a million times, we don't need more.

It's just two 7900GTs on one PCI-E board, it's nothing revolutionary. No new technologies are used, besides the possibility of having quad SLI, which is a total waste.
June 5, 2006 7:59:44 PM

Guys, Darren Polkowski is the Tom's Hardware resident graphics card guy... he writes all the tasty reviews on new hardware.

So a little respect is in order at least, eh fellows?
June 5, 2006 8:01:57 PM

Well, sorry for that first post, D. However, this isn't what people want, in my opinion.

Let's just make this the GX2 discussion....
June 5, 2006 8:03:31 PM

Shouldn't this discussion be linked to an article though? You guys do plan to do a full blown article on the GX2, right? If so, this discussion/thread should have waited until then.

Just my opinion though. :-)
June 5, 2006 8:09:34 PM

I think Darren will link the article to this, he just posted the thread before the article went live so that the article wouldn't link to nowhere.

One of the things we requested that they do at Tom's is to link articles to discussion threads so we can talk about the reviews... Darren is just trying to comply with our request. :) 
June 5, 2006 8:39:27 PM

With DirectX 10 coming out soon, is it worth buying this card? Could this be a signal that DirectX 9 cards will be supported by Vista???
June 5, 2006 8:44:54 PM

I wouldn't call it a "signal."

But rest assured, DX9 will work fine with Vista and all new games. And I still think M$ will be forced to release (some form of) DX10 on XP... no matter what they say.
June 5, 2006 9:16:14 PM

Quote:
I wouldn't call it a "signal."

But rest assured, DX9 will work fine with Vista and all new games. And I still think M$ will be forced to release (some form of) DX10 on XP... no matter what they say.

You serious? They will never do that, since they can make DX10 Vista-only and pay game makers to make games for Vista and DX10 only, so people would be forced to buy Vista. Therefore, M$ makes more money.
June 5, 2006 9:22:51 PM

I'm still waiting for Matrox to bring out new cards :lol:  :lol:  :lol: 
June 5, 2006 9:38:33 PM

What, there's a company called Matrox? ...lol
June 5, 2006 10:09:42 PM

Matrox is out of the card making business.

They are still doing analog and digital devices, but not graphics cards.
June 5, 2006 10:14:20 PM

Quote:
Guys, Darren Polkowski is the Tom's Hardware resident graphics card guy... he writes all the tasty reviews on new hardware.

So a little respect is in order at least, eh fellows?


Nah man, he should link to his own review or else be treated like the rest of the plebs! :twisted:

The least he coulda done was make his 'the article' a hyperlink. :wink:

Die n00b!e DIE! :tongue:
June 5, 2006 10:19:40 PM

I love you too Grape.

I needed to post a forum link and then it got added to the bottom of the page of the article.

Thanks for the hazing. Please sir, may I have another.

:p 
June 5, 2006 10:25:48 PM

Quote:
Matrox is out of the card making business.

They are still doing analog and digital devices but not graphics cards.


Dude, they still sell graphics cards; a little research, eh!
It's not all about gaming, despite what most of us focus on. :roll:

They don't have newly launched products like ATi and nV, but they still sell their current line-up of P and G series cards, which are still pretty much the tops in 2D and multi-monitor support.

Considering they shipped more than 75,000 cards in the first quarter of this year, that's pretty good, and more than 3DLabs, who arguably have a more recently refreshed line.

And while niche to say the least, they still make cards, so don't hold the wake before they're actually gone! :evil: 
June 5, 2006 10:31:11 PM

Quote:
I love you too Grape.

I needed to post a forum link and then it got added to the bottom of the page of the article.


Oh I know why it's here, it just made me laugh how it appeared and the reception it got, and I'll pile on of course because you slight my beloved Matrox! :tongue:

Quote:
Thanks for the hazing. Please sir, may I have another.

:p 


LOL! You got busted by the filter!

You're welcome, as always, I'll edit if need be, just piling on of course. Just made me laugh, it's all tongue in cheek. :twisted:
June 5, 2006 10:33:07 PM

Okay... guess I need to rephrase that...

"They don't sell graphics cards most people in this forum would ever care to own."

Granted, for multi-monitor support they are a card to reckon with. But I highly doubt that even you would need support for 4 monitors, up to 16 monitors via four plug-in PCI cards or a PCIe x1 card, for your personal use. However, I could be wrong. Some people do the craziest things.
June 5, 2006 10:39:18 PM

Actually, I loved Matrox. I still have some of their cards of old. Being a pack rat of old PC items I have a bunch of old things. I still use an S3 Virge/DX sometimes. Ironically there have been motherboards from Asus, MSI, Gigabyte, and a host of others that were anal about needing a PCI card to boot, even when the PCIe selection was made in the BIOS. Weird BIOS issues from NF4 early days. It just proves that old is not bad... just old.
June 5, 2006 11:03:38 PM

That was an Oblivion benchmark, and Oblivion runs better on ATI cards.
June 5, 2006 11:06:43 PM

I take it you care about Oblivion performance.

For the testing I use one of the most insane outdoor scenes I could find. It is the transition from day to night and with the settings as such.

Resolution: 1024x768, 1280x1024, 1600x1200, 2048x1536, 2560x1600
Brightness: Default (Leave this where the game starts it at)
Texture Size: Large
Tree Fade: Max (All the way to the right)
Actor Fade: Max
Item Fade: Max
Object Fade: Max
Grass Distance: Max
View Distance: Max
Distant Land: On
Distant Buildings: On
Distant Trees: On
Int. Shadows: Max
Ext. Shadows: Max
Self Shadows: Off (These look like crap and just aren’t working right yet. Makes everyone look like they have beards… even the ladies… *shivers*)
Shadows on Grass: On
Tree Canopy Shadows: On
Shadow Filtering: High
Specular Distance: Max
HDR Lighting: Off (HDR and AA cannot be done together in the game with Nvidia. ATI can but we will revert to the lowest common denominator)
Bloom Lighting: On
Water Detail: High
Water Reflections: On
Water Ripples: On
Window Reflections: On
Blood Decals: High
Anti-aliasing: 4x

Right now the only cards that can handle Oblivion at high settings are in the cost range of what Kelt Reeves at Falcon Northwest calls the "Elite Bastards Group." Basically those with significant dollars to spend on high-end hardware. Besides... even if you want to run at the resolutions that less than 10% play at, you need to spend some serious coin on the monitor.
June 5, 2006 11:13:58 PM

Quote:
I take it you care about Oblivion performance.

For the testing I use one of the most insane outdoor scenes I could find. It is the transition from day to night and with the settings as such. [full settings list snipped]

Right now the only cards that can handle Oblivion at high settings are in the cost range of what Kelt Reeves at Falcon Northwest calls the "Elite Bastards Group." Basically those with significant dollars to spend on high-end hardware. Besides... even if you want to run at the resolutions that less than 10% play at, you need to spend some serious coin on the monitor.
No, the only thing that can handle Oblivion is the 360, for $399 with my monitor, instead of paying over $600 to get 35 fps. They have the technology, but want the consumers' money first. If you're happy with that, there's something wrong with you.
June 5, 2006 11:25:32 PM

Quote:

Granted things for multi-monitor support they are a card to reckon with. But I highly doubt that even you would need a 4 monitor support up to 16 monitors by using from 4 plug in PCI cards or a PCIex1 card for your personal use. However, I could be wrong. Some people do the craziest things.


Nah, I still just like their SurroundView concept, and while initially card-dependent with the Parhelia, with the launch of the TripleHead2Go it's available to all the gaming community in one form or another. However, my support of them is for their 2D, but even that is being challenged by ATi.


PS, FYI the Oblivion self-shadow is fixed with the Bethesda Beta patch. Looks really good actually.

And PPS, LB, this isn't an Xbox thread. Xbox =/= PC; leave it at that. For whatever reasons you promote Xbox, I can give you as many promoting PC, and vice versa even. Both have their strengths, and it's far from trouble-free on the Xbox. :roll:
June 5, 2006 11:26:32 PM

Quote:
Matrox is out of the card making business.

They are still doing analog and digital devices but not graphics cards.



wow....no one got the joke......


sigh
June 5, 2006 11:30:58 PM

Quote:
Matrox is out of the card making business.

They are still doing analog and digital devices but not graphics cards.


wow....no one got the joke......


It didn't sound or look like he was making a joke, since a lot of their focus has changed to their analogue and digital editing devices; that would seem like a direct statement, not a play on words equating their remaining cards to simple 'devices'.
June 5, 2006 11:33:32 PM

No, I said "I'm still waiting on Matrox."

He replied with the analog device statement.

I meant no one got my "I'm waiting on Matrox" joke.
June 6, 2006 12:17:12 AM

Aha, yes, I got that part; I think we've discussed the impending mega DX11 card of theirs before. 8)

Missed what you meant by... missed the joke, and that one I did get, smiling and passing on.

Ma faute! :?
June 6, 2006 1:07:25 AM

This review was pure farce. It should have been a preview, 'cause that's basically all we got.

This card should be compared to all combinations in the same price range. It doesn't really matter if it's one card or two. It matters that it costs $600.

Thus, it should be compared to CrossFire and SLI combinations that cost ABOUT $600. Or, to be more thorough, THG should compare the scores with any SLI combination, since this card is aimed at the hardcore gamer who's willing to spend big bucks for graphics.
June 6, 2006 5:04:57 AM

Sooo... I have a GTX right now; would it be an upgrade or not to go with a GX2??? Considering it has twice as much memory I thought it was a no-brainer, but after seeing some of the benchmarks I'm not so sure. I just barely purchased the GTX, so I could take it back and upgrade for around $70. I do have an SLI board, so my next question is: would 2 GTXs be better than 2 GX2s?? LOL!! Thanks in advance for the input.
June 6, 2006 8:43:52 AM

It's obvious that the FX-60 can't handle the GPUs on this graphics card!! In DOOM 3 at NoAA 8xAF, frame rates are just stuck around the 136 mark from 1024x768 to 1600x1200... A more powerful processor is needed to show what this dual-VGA-as-one beast can do!!!
But not that a lot of people are gonna go out and buy one of these anyway!!
This card will have a month or two before it's branded as OLD TECHNOLOGY by the same company that advertised it as the KING OF VGA!!!!
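The "stuck at 136 fps across resolutions" observation is the classic sign of a CPU bottleneck, and you can express the rule of thumb in a few lines. This is a rough heuristic of my own, not anything from the article; the 5% tolerance is arbitrary.

```python
# Heuristic sketch (my own, not from the article): if average fps barely
# changes as resolution rises, the CPU, not the GPU, is the limiting factor.
def is_cpu_bound(fps_by_resolution, tolerance=0.05):
    """fps_by_resolution: dict mapping resolution string -> average fps."""
    rates = list(fps_by_resolution.values())
    spread = (max(rates) - min(rates)) / max(rates)
    return spread < tolerance

# Numbers like those described above: ~136 fps from 1024x768 to 1600x1200.
doom3 = {"1024x768": 136.4, "1280x1024": 136.1, "1600x1200": 135.8}
print(is_cpu_bound(doom3))  # True: the CPU is the bottleneck here
```

A GPU-bound result, by contrast, would show fps falling steeply as resolution climbs, and the function would return False.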
June 6, 2006 12:49:43 PM

I would like to see the PCI-E slot on the front board rather than the back board so I can get my damned PCI slot back. I've got a Micro ATX mobo with 2 PCI slots and a PCI-E. There's a mount above the PCI-E slot that can't be used because of the orientation of video cards, so why don't they just flip it so we can get our damned PCI slots back?
June 6, 2006 1:05:32 PM

I noticed a discrepancy between the THG article and the one on Firingsquad.

You say:
Quote:
As with Quad SLI, we do not see high volume sales in the future for Nvidia, but this could let people upgrade to a dual GPU platform more easily, as they would not have to change to an SLI motherboard.


Firingsquad says:
Quote:
Keep in mind that you’ll still need an nForce SLI motherboard to support SLI with the GeForce 7950 GX2. If you plug a GeForce 7950 GX2 card into an nForce4 Ultra or nForce 570 Ultra motherboard, the card will work, but only one of the GPUs on the board will function. You’ll need to upgrade to an SLI motherboard if you want both GPUs on your GeForce 7950 GX2 card to operate in SLI mode.


So which one is true?
June 6, 2006 1:31:58 PM

From what I have read, the GX2 doesn't need an SLI motherboard. This is a reason that Nvidia came out with the GX2: to let users that don't have an SLI mobo upgrade and have about the same if not better performance.

Quote:
The GeForce 7950 GX2 will work on any motherboard with a PCIe graphics slot. SLI motherboards are not required. The system will detect it as a pair of graphics cards installed, but the drivers will see it as only one card, with no messy SLI profiles or settings required.


Hardware Zone

The above link is a direct link to the information provided above.

This is another source that proves no, you don't need an SLI mobo.
June 6, 2006 1:52:17 PM

Considering the X1900XTX and 7900GTX used to cost near $600 or more, for the money the card isn't badly priced. I mean, you're getting two cards in only a single PCI-E slot, and you'll be able to use your PCI slots under the 2nd PCI-E x16 slot. Granted, a CrossFire X1900XT(X) or SLI 7900GTX setup might perform a little better, but they also cost a minimum of $150 more and need a dual-slot motherboard.

Really, it's not that bad of a deal. I just wish ATI would get down to 80-90nm so their cards run cooler, and release the same thing. I'd buy it. And since their physics setup is supposedly going to need 3 video cards, it'd mean I only need two PCI-E x16 slots instead of three. Of course, I don't understand why we can't have one X1900XT(X) for graphics and one X1600 for physics. Why do we have to have 3 freakin' cards? That's a lot of heat.
June 6, 2006 2:18:42 PM

I think it is GREAT that this is so much cheaper than buying two 7900 cards! I think it is great you could use this in quad mode.

I agree with the post below: I would wait for a DirectX 10 card of this type to come out, though if I bought a card today, considering this is not much more than a standard 7900 card, it would be a very nice choice.

My BIGGEST concern is that I probably wouldn't be able to replace the stock fans with quieter Zalmans or other quieter coolers.

I would hope that ATI makes a similar card but with an option for replacing the coolers. I'm wondering, if ATI made such a card, whether they could get better efficiencies to leverage the second card than they do with their current CrossFire limitations.
June 6, 2006 5:20:09 PM

Quote:
I love you too Grape.

I needed to post a forum link and then it got added to the bottom of the page of the article.

Thanks for the hazing. Please sir, may I have another.

:p 


Careful what you ask for. You might end up missing both eyes like Action Man.
June 6, 2006 5:23:30 PM

Quote:
Considering the X1900XTX and 7900GTX used to cost near $600 or more for the money the card isn't badly priced. I mean you're getting two cards with only a single PCI-E slot and you'll be able to use your PCI slots under the 2nd PCI-E x16 slot. Granted a Crossfire X1900XT(X) or 7900GTX setup might perform a little better but they also cost a minimum of $150 more and need a dual slot motherboard.


I would have liked to have seen CrossFired X1900XT, SLI 7900GTX and quad-SLI benchmarks tossed up alongside the data shown in this article, just to put a frame around the ballpark this stuff competes in. (Interpretation: I'm too lazy to go dig it up right now, and besides, I like to see it all plotted together.)
June 6, 2006 6:59:29 PM

Beware of seriously long post:

This new card has to be seen as a SINGLE card. It does not need to be in SLI mode. Here is a list of motherboards that support Nvidia's 7950GX2 cards. Obviously, there are chipsets that are neither SLI certified nor have two PCIe slots but can still support the 7950GX2. I spoke to Bryan DelRizzo at Nvidia today, and there are boards beyond this list that might support the 7950GX2 without an updated BIOS. One reader asked us about the Abit AA8XE, and I have a message in to Nvidia's support test labs to find out. What I think Brandon means is that when you want quad (i.e. running two of these together) you need to have an SLI motherboard. It is something you should ask Brandon, as I cannot put words into his mouth.

You can find Nvidia's latest listing on their website: http://www.nvidia.com/content/geforce_gx2_sbios/us.asp


Manufacturer | Model | Chipset | Required System BIOS (or higher)
ABIT | AA8 Duramax | Intel 925X | 2.4
ABIT | AW8 | Intel 955X | 1.4
ABIT | AN8 SLI | NVIDIA nForce4 SLI | 1.9
ABIT | K8N SLI | NVIDIA nForce4 SLI | 17
Albatron | K8SLI | NVIDIA nForce4 SLI | R1.12
ASUS | A8N32-SLI Deluxe | NVIDIA nForce4 SLI | 1205
ASUS | A8N5X | NVIDIA nForce4 | 0902
ASUS | A8N-E | NVIDIA nForce4 Ultra | 1013
ASUS | A8N-SLI Premium | NVIDIA nForce4 SLI | 1013 (available soon)
ASUS | A8R32-MVP Deluxe | ATI CrossFire Xpress 3200 | 0404
ASUS | A8V-E Deluxe | VIA K8T890/VT8237R | 1005
ASUS | M2N32-SLI Deluxe | NVIDIA nForce 590 | 0404
ASUS | P5LD2 | Intel 945P | 1103
ASUS | P5LD2-VM | Intel 945G | 0508
ASUS | P5ND2-SLI | NVIDIA nForce4 SLI Intel Ed. | 0304
Biostar | i945 G-M7 | Intel 945G | 24F
DFI | Infinity NF4 SLI | NVIDIA nForce4 SLI | 2006/04/10
ECS | KN1 SLI Lite | NVIDIA nForce4 Ultra | 1.1d
ECS | nForce4-A939 | NVIDIA nForce4 | 1.1g
Foxconn | NF4SLI7AA-8EKRS2 | NVIDIA nForce4 SLI Intel Ed. | 537F1P34
Foxconn | C51XEM2AA-8EKRS2H | NVIDIA nForce 590 | 612W1P19
Gigabyte | GA-8I945P Pro | Intel 945P | F5
Gigabyte | GA-8N-SLI-Quad Royal | NVIDIA nForce4 SLI Intel Ed. | F5
Gigabyte | GA-K8N Pro-SLI | NVIDIA nForce4 SLI | F4
Gigabyte | GA-K8N51GMF | NVIDIA nForce 410/GeForce 6100 | F7
Gigabyte | GA-K8N51PVM9-RH | NVIDIA nForce 430/GeForce 6150 | F1
Gigabyte | GA-K8NF-9 | NVIDIA nForce4 | F10
Gigabyte | GA-K8N-SLI | NVIDIA nForce4 SLI | F9
Gigabyte | GA-K8NXP-9 | NVIDIA nForce4 Ultra | F9
Gigabyte | GA-K8NXP-SLI | NVIDIA nForce4 SLI | F11
Intel | D955XBK | Intel 955X | 2036
Intel | D975XBX | Intel 975X | 1073
Intel | SE7525GP2 | Intel E7525 | P08
iWill | DK8EW 59102 | NVIDIA nForce4 Pro 2200 | V130
MSI | K8N Diamond Plus | NVIDIA nForce4 SLI | A7220NZ1 v1.22 4/18/06 (available soon)
MSI | K8N Neo4 Platinum v1 | NVIDIA nForce4 Ultra | 1.D
MSI | 945P Neo2-F | Intel 945P | A7176IMS v3.2 12/12/05
Supermicro | X6DA8 | Intel E7525 | 1.0B
Tyan | S2895 D/T | NVIDIA nForce 2200/2050 | 2895_103
June 6, 2006 7:23:25 PM

I know that people want to see CrossFire and SLI against this card, as it is practically SLI on a stick. However, since it does not require a special motherboard and it is being positioned as a single card, we should treat it that way. Nvidia even claims that SLI and CrossFire should beat out the GX2, as it is underclocked compared to the other cards clock for clock. This is where they are making their move to multi-GPU on a single card. It kind of reminds me of the 3dfx days, when Voodoo2 SLI came back in the Voodoo 5.

We are working on a full rundown of the 7000 series cards as well as an ATI variant, just to get a full perspective on each family.

What things would you like to see? shoot me an email at darren@tomshardware.com
June 6, 2006 8:03:24 PM

Darren is there a way you could find out about the Biostar NF4SLI-A9 motherboard supporting this GX2?
June 6, 2006 8:10:54 PM

Quote:
From what I have read, the GX2 doesn't need an SLI motherboard. This is a reason that Nvidia came out with the GX2: to let users that don't have an SLI mobo upgrade and have about the same if not better performance.

The GeForce 7950 GX2 will work on any motherboard with a PCIe graphics slot. SLI motherboards are not required. The system will detect it as a pair of graphics cards installed, but the drivers will see it as only one card, with no messy SLI profiles or settings required.


Hardware Zone

The above link is a direct link to the information provided above.

This is another source that proves no, you don't need an SLI mobo.

I think you are right on this. I think they are pushing this as a quick fix for Conroe until the nForce 590 Intel SLI motherboards come out. And even then, nForce 590 supports DDR2-667, whereas the 965 chipset supports DDR2-800 (but is not SLI ready).
June 6, 2006 8:43:39 PM

I have another message in to Nvidia and one with Biostar. Computex is going on, so I am not sure how fast I will get a response back from Biostar.

We shall see.
June 6, 2006 10:03:37 PM

Quote:
The GeForce 7950 GX2 will work on any motherboard with a PCIe graphics slot. SLI motherboards are not required. The system will detect it as a pair of graphics cards installed, but the drivers will see it as only one card, with no messy SLI profiles or settings required.


OK, the thing that's WEIRD about that is: how does the app/driver know which rendering type (AFR/SFR) to use without all the 'messy SLI' stuff?

Does the card default to AFR for all situations and do the load balancing on the card?

It seems a little simplistic, to say the least, to just say 'it's detected as one card', because the mechanics behind how SLI works depend a lot on how the system or card handles the interaction of those two cards, and technically they aren't using the SLI bridge on the card itself; they are communicating across the lanes.
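For readers unfamiliar with the two modes being discussed: AFR hands alternate whole frames to each GPU, while SFR splits each frame into bands whose sizes the driver rebalances based on load. This is a toy illustration of that distinction only; the function names and structure are mine, not Nvidia's driver internals.

```python
# Illustrative sketch of the two SLI work-splitting modes discussed above.
def afr_schedule(num_frames, num_gpus=2):
    """Alternate Frame Rendering: whole frames assigned round-robin to GPUs."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

def sfr_split(frame_height, load_ratio=0.5):
    """Split Frame Rendering: one frame divided into two horizontal bands.
    In a real driver, load_ratio is adjusted dynamically by the load balancer
    so the GPU rendering the heavier region gets fewer scanlines."""
    boundary = int(frame_height * load_ratio)
    return [(0, boundary), (boundary, frame_height)]  # (top, bottom) per GPU

print(afr_schedule(4))       # {0: 0, 1: 1, 2: 0, 3: 1}
print(sfr_split(768, 0.6))   # [(0, 460), (460, 768)]
```

The question in the post stands either way: whichever mode is chosen, something, driver or card, still has to pick the mode and balance the load, whether or not the user ever sees an SLI profile.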
June 6, 2006 11:03:35 PM

Got word back from Nvidia on that Biostar motherboard. It has not been tested by Nvidia's labs for compatibility, but that does not mean the board will not work with the GeForce 7950GX2. I would suggest that you wait until you know for sure if you are thinking about purchasing that card. Nvidia thinks that your board may work with it without an update. I would send a personal message to Biostar to get their answer. (I will wait for my answer from Biostar.) If you have purchased one of these cards and, when you get it, the card does not work, contact Biostar and demand a BIOS update.
June 7, 2006 12:08:34 AM

Could the character on the box look any stupider?
June 7, 2006 2:50:16 AM

Quote:
Could the character on the box look any stupider?


Hahah, I know. Looks like a character from a bad episode of Mighty Morphin Power Rangers... or a TMNT reject... BeBop's cousin, perhaps? :lol: 
June 7, 2006 2:05:34 PM

Thanks a lot, Darren, for checking on that. I will send word to Biostar and see if it's compatible, and if not I will demand they test it then! LOL

Thanks again, Darren.
June 7, 2006 3:19:38 PM

Okay, this is cool, but the 7950GX2 cores have been slowed down to not much more than the 7900GT, a whopping 200MHz less than the 7900GTX.

As the header to this image says, "Nvidia keeps the core and memory running a bit slower than the GeForce 7900GTX due to heat issues."

I believe Mr. Polkowski made a slight understatement with the above quoted text from his article.

Here is the spec list:
http://images.tomshardware.com/2006/06/05/geforce_7950_...

I think this is a complete waste of money if you can't run the cores at least at the same clocks as the 7900GTX. If heat weren't an issue, those cores should run at least at 7900GTX speed, and it would blow the ATI out of the water completely.

And the heat sinks look really weak also; no wonder they have to "under-clock" the cores.

Lame... like having an 8-cylinder engine running on only 6 cylinders.

Is there any way you guys could run this test on a water-cooled 7950GX2 at full clock?

-AM
June 7, 2006 3:43:25 PM

I have to say I'm a bit disappointed. I was expecting the card to be a dual-core GPU (not 2x GPUs, but 2 cores on 1 chip, like dual-core Pentiums), and what we got was basically 2 cards strapped together.

The benchies look fairly impressive but not mind-blowing; I think I will be waiting a few more years for this technology to mature before purchasing it.