
x1950pro vs. x1950xt

Last response: in Graphics & Displays
August 8, 2007 6:02:20 PM

I am considering getting a new card soon because I am getting impatient with the current generation of midrange cards. My setup is:

1 320GB HDD (not sure of the model)
2GB (2x1GB) OCZ 800MHz DDR2 RAM
Pentium 4 3 GHz single core w/ Hyper-Threading
Hauppauge WinTV 1600
x300se
Sound Blaster X-Fi card
350W power supply (not sure about the rails)

1. If I want to upgrade to one of these cards, will my PSU be able to support it?
2. Is one of these chipsets significantly more power efficient?
3. I know that the new chips (HD2X00) are better at hardware decoding of video, but do these chips help lessen the load on the CPU significantly due to AVIVO?


August 8, 2007 7:15:33 PM

1. You will need a better PSU to support either card.
2. The x1950pro consumes far less power than the XT, but it is still quite a bit.
3. Not sure about that one.

Your Pentium 4 3 GHz processor will bottleneck both cards, though.
August 8, 2007 7:20:37 PM

Had the same question (1950 Pro vs 1950 XT). From the responses: the XT has higher performance; the Pro has lower performance, slightly lower power consumption, and slightly lower temps. I went for the 1950XT.

Ref PSU: No, a 350 watt PSU will not cut it. Some have indicated they have it running on around a 450 watt PSU, but since you have to get a new PSU anyway, I would recommend a good 550 to 650 watt unit. It is overkill for your current system, but it will prevent having to buy again when you upgrade.

For what it is worth, the increase in power consumption going from idle to gaming is about 110 watts for the x1950XT.
August 8, 2007 7:25:47 PM

Wow I didn't realize the power consumption numbers were that high. Thanks for the info.
August 8, 2007 7:49:18 PM

Forgot to factor out PSU efficiency. Make that an increase of 85 to 90 watts
when going from idle to gaming (approximately a 7.5 amp increase in +12 V current).
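A quick sanity check of that figure: the 7.5 A value follows directly from the ~90 W delta, assuming (as the post does) that the entire increase lands on the +12 V rail.

```python
# Rough check: convert a power delta to a current delta on the +12 V rail.
# Assumes the whole idle-to-gaming increase is drawn from +12 V; real cards
# also pull a small amount from the +3.3 V slot supply.

def rail_current_delta(power_delta_w: float, rail_v: float = 12.0) -> float:
    """Return the current increase in amps for a given power increase."""
    return power_delta_w / rail_v

print(rail_current_delta(90))  # 90 W / 12 V = 7.5 A
```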
August 8, 2007 7:51:43 PM

You might be able to get away with putting an 8600GTS in there, but there is no way any x1950 is going to run reliably on a 350W supply. It will likely run for you, but I guarantee it will BSOD left and right.

The 8600GTS is pretty even with an x1950pro, and is a lot less power-hungry.
August 8, 2007 7:52:42 PM

This is somewhat off topic on my own thread, but will a 650W PSU cover just about any reasonable PC, assuming 1 GPU, 1 CPU, 2-3 PCI cards (sound, TV capture, etc.), and maybe 2-3 HDDs?
August 8, 2007 8:06:28 PM

shadowmaster625 said:
but no way any x1950 is going to run reliably on a 350W supply. It will likely run for you, but I guarantee it will BSOD left and right.


Maybe, maybe not; it depends on the make and quality of the PSU. I have my x1950pro running on a 380 watt supply with just 22 amps on the +12V rail (even though the suggested minimum is 30). You can see the rest of my setup by clicking the little computer icon at the top of my post.

prodystopian said:
This is somewhat OT on my own thread, but is a 650W PSU going to cover just about any reasonable PC assuming 1 GPU, 1 cpu, 2-3 PCI cards (sound, TV capture, etc.) and maybe 2-3 HDD?


Yes, a good 650 watt PSU will do you just fine, for now and for the future.

August 8, 2007 8:27:08 PM

My system: C2D E6400 @ 3.2 GHz, 2 GB DDR2-800, X1950XT, 4 SATA HDDs, and 2 DVD writers.
At idle: 1.6 A x 120 V = 192 watts
With 3DMark06 running: 2.6 A x 120 V = 312 watts

This would lead one to think a 350 would work (it will not). I had a 600 watt iGreen which would work, but it only provided 8 amps to the 6-pin PCI-e connector. That was not sufficient to raise the clock even 20 MHz.

If you are not planning on running SLI, a "good" 600 or 650 watt unit - look at the +12V rails - will work just fine.

Ref shadowmaster: Agree, the 8600 GTS is only about 10% lower than the 1950Pro, plus as you stated less power drain, plus it is DX10 - a plus for Vista.
August 11, 2007 11:07:20 PM

I just bought the Gecube X1950XT to max out my system before I have to upgrade the motherboard to PCI-e.

Unfortunately I can't get it to work. I get the blue screen instead of the Vista logo when I boot up. Boots fine in safe mode.

Anyone got any ideas why?

System:
Motherboard: Gigabyte K8 Triton GA-K8NSC-939
Processor: AMD 64 Dual Core 4200
RAM: 4GB
PSU: 1200W

I'm wondering if it's the nForce3 onboard graphics causing the trouble.
August 12, 2007 1:24:31 AM

Idlehands said:
I just bought the Gecube X1950XT to max out my system before I have to upgrade the motherboard to PCI-e.

Unfortunately I can't get it to work. I get the blue screen instead of the Vista logo when I boot up. Boots fine in safe mode.

Anyone got any ideas why?

System:
Motherboard: Gigabyte K8 Triton GA-K8NSC-939
Processor: AMD 64 Dual Core 4200
RAM: 4GB
PSU: 1200W

I'm wondering if it's the nForce3 onboard graphics causing the trouble.


a 1200W PSU? Woah!

The onboard graphics could be causing problems, make sure that they are disabled in the bios and that you have uninstalled any related drivers before installing the X1950XT.
August 12, 2007 9:43:33 AM

turboflame said:
a 1200W PSU? Woah!

The onboard graphics could be causing problems, make sure that they are disabled in the bios and that you have uninstalled any related drivers before installing the X1950XT.


I was having PSU problems when I upgraded to Vista, so I decided to go the whole hog on power to avoid having to upgrade again any time soon. As the best ones were £200+, I was a bit reluctant until I stumbled across this:

http://www.3dgameman.com/content/view/8956/103/

Best bit of IT equipment I've ever bought - especially as my old case made the noise of a small jet! With the PSUs included it became a bargain that killed two birds with one stone, and it's the first bit of kit that has ever exceeded my expectations.

The X1950XT fits in no trouble - the case could take a card twice as large.

I can't find a way to turn off the onboard graphics on my motherboard. The manual claims it automatically detects an external card and disables itself. It doesn't appear in Device Manager either, so I can't turn it off manually.

I'm stuck.



August 12, 2007 10:10:28 AM

RetiredChief said:
Forgot to factor out PSU eff. Make that an increase of 85 to 90 watts
when going from Idle to Gaming (Approx a 7.5 Amp increase in +12 V current)

??? PSU efficiency affects what it pulls from the wall socket; it has nothing to do with how much power it outputs. A PSU with 10% efficiency under a 200W load will still supply 200W (assuming it has the capacity to do so), but it will pull about 2kW from the wall socket (and burn up in seconds from the wasted heat, but that's beside the point).

EDIT: Fixed typos
August 12, 2007 11:26:20 AM

To the OP: 3. Yes, the 2xxx series will lessen the load on the CPU, as will the 8xxx series.
August 12, 2007 1:20:27 PM

You need a PCI-E 6-pin cable to go into the XT, and probably the Pro as well, to provide enough power to them.
August 12, 2007 2:12:20 PM

Hatman said:
You need a PCI-E 6-pin cable to go into the XT, and probably the Pro as well, to provide enough power to them.



It comes with an adapter that plugs two Molex connectors into the 6-pin socket. You need a third Molex to power up the loose Molex connector too.
August 12, 2007 2:56:37 PM

Randomizer: PSU efficiency relates to losses in the PSU.
Using an efficiency of 80% and a current draw, at the outlet, of 1.6 amps:
1.6 A x 120 V = 192 watts (total power)
192 x 0.2 = 38.4 watts consumed by the PSU
192 x 0.8 = 153.6 W delivered to the load
Then if the mains current increases to 2.6 amps:
2.6 A x 120 V = 312 W (total power)
312 W x 0.2 = 62.4 W (PSU losses)
312 W x 0.8 = 249.6 W for the load

Net change for PSU plus load: 312 W - 192 W = 120 W.
Net change for the load alone: 249.6 W - 153.6 W = 96 W.
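The arithmetic above can be sketched as a short script. The 80% efficiency and 120 V mains figures are the post's assumptions; a real PSU's efficiency also varies with load, so this is a simplification.

```python
# Split wall-socket power into PSU losses and power delivered to the load,
# assuming a fixed 80% efficiency (a simplification; efficiency varies with load).

MAINS_V = 120.0
EFFICIENCY = 0.80

def load_power(outlet_amps: float) -> float:
    """Power actually delivered to the components, in watts."""
    return outlet_amps * MAINS_V * EFFICIENCY

idle = load_power(1.6)     # 153.6 W to the load at idle
gaming = load_power(2.6)   # 249.6 W to the load under 3DMark06
print(gaming - idle)       # 96.0 W net increase seen by the components
```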
August 12, 2007 3:37:40 PM

shadowmaster625 said:
You might be able to get away with putting an 8600GTS in there, but there is no way any x1950 is going to run reliably on a 350W supply. It will likely run for you, but I guarantee it will BSOD left and right.

The 8600GTS is pretty even with an x1950pro, and is a lot less power-hungry.


An 8600GTS is not "pretty even" with an X1950Pro; it trails significantly in just about everything, and more so at higher resolutions.
August 12, 2007 3:39:03 PM

It turns out that if you want the X1950XT to run in Vista, it must be on a motherboard on Microsoft's Vista Hardware Compatibility List.

https://support.ati.com/ics/support/default.asp?deptID=...

Mine isn't.

Are they just slow in bringing out new drivers? I can't find any Vista drivers on the Gigabyte site - or am I just screwed?!
August 12, 2007 11:20:23 PM

RetiredChief said:
Randomizer: PSU efficiency relates to losses in the PSU.
Using an efficiency of 80% and a current draw, at the outlet, of 1.6 amps:
1.6 A x 120 V = 192 watts (total power)
192 x 0.2 = 38.4 watts consumed by the PSU
192 x 0.8 = 153.6 W delivered to the load
Then if the mains current increases to 2.6 amps:
2.6 A x 120 V = 312 W (total power)
312 W x 0.2 = 62.4 W (PSU losses)
312 W x 0.8 = 249.6 W for the load

Net change for PSU plus load: 312 W - 192 W = 120 W.
Net change for the load alone: 249.6 W - 153.6 W = 96 W.

I disagree. If that were the case, you would constantly underpower everything. The PSU HAS to supply what is required (even cheap, crappy ones will, for a short time, until *poof*), but to offset the inefficiencies of current circuit designs, it has to draw more power from the outlet to compensate.
August 13, 2007 2:26:10 AM

Randomizer:
I think you are misinterpreting what I said. Power supplies have an efficiency rating, which is simply the power used by the load (i.e., the computer) divided by the total input power. Note in my example: 249.6 watts (power consumed by the computer) / 312 watts (input power to the PSU) = 0.8 (80%), which is the efficiency rating; I used a PSU with 80%. You are correct that a PSU will provide what is needed (as long as max ratings are not exceeded); what changes is the input power, as a function of efficiency.
August 13, 2007 2:27:59 AM

Sorry, yes I did misinterpret it. I agree with you now :D 
August 13, 2007 2:45:53 AM

No problem, I probably should have stated it better. Now I have a problem: what I measured using a Fluke multimeter (an expensive model) does not jibe with what ATI Tool reports as the change in GPU current going from idle to displaying a 3D image. I trust my measurements and calculations over an unknown algorithm, and I don't think I want to cut my three +12V rails and insert a current meter to find the answer.
August 13, 2007 2:47:33 AM

We are talking about software here; it's never that accurate.
August 13, 2007 2:54:01 AM

You have that right! Although I'm an electronics tech, I have had to write some programs. I wrote one converting readings from 24 K-type thermistors into displayed temperatures on a spaceflight unit undergoing thermal vacuum testing.