redmanc

Distinguished
Mar 11, 2006
59
0
18,630
OK, everybody is going on about DX10. Is it really such a big thing?

I'm looking at maybe updating my PC when Conroe is available, which I believe is very soon, and with it my graphics card.

I was looking at buying a 7900GT 256MB, but I'm not sure, the way everybody is going on about DX10.

1. When is DX10 supposed to be coming out?
2. Would a 7900GT card not be able to play DX10 stuff with a BIOS/driver update? Surely it must.
3. When should I update my rig? I'm not bothered if it's 2, 6 or 9 months.
 

dmdallas

Distinguished
Apr 2, 2006
59
0
18,630
Conroe is coming out on the 27th. It is a CPU that is most definitely worth upgrading for (I know I am going to), but DX10 cards probably won't be out until Vista comes out (the only OS that will be able to use DX10 fully). DirectX 10 is supposed to be revolutionary, though I don't know how "revolutionary" it will really be. If I were you, I would still buy a new Conroe rig now and hold off on a video card until the next generation of video cards comes out. If they really aren't worth it, then at least the current-gen cards will be a lot cheaper. Unless your current video card is really dated (i.e. older than an Nvidia 6600 series or an ATI X800 series), in which case upgrade to something cheap like an Nvidia 7300-7600; I'm not too familiar with current mid-range ATI GPUs.
 

elpresidente2075

Distinguished
May 29, 2006
851
0
18,980
Even if it is revolutionary, nothing will be able to take full advantage of it for some time. I mean, look at now: it has been three generations since DX9 came out, and only in the last year, maybe less, has it really been utilized. I say, if you can, wait until Vista and the DX10 cards actually come out before you buy. Then buy a current-generation card.

This course of action will be great, because the current crop of DX9 cards kicks butt and will be, like dmdallas said, much cheaper. Not to mention, most games that do use DX10 will have to have DX9 backwards compatibility; otherwise they would be totally alienating most of their user base, i.e. everyone who can't afford or doesn't want to upgrade to Vista.

Short answer: wait 9 months, then make your decision. Research everything in the meantime.
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
1. DX10 is Vista-only. It'll be available when Vista comes out.
2. A 7900GT will run DX10 games fine; it just won't be able to use some features that DX10 offers.
3. You can get a Conroe, with 2GB of RAM, and a good video card. With this, you're set until games need Vista and DX10, which could be a year, two, or a couple of months from now.
 

s3anister

Distinguished
May 18, 2006
679
2
19,060
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU, with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
> s3anister wrote: "personally, i am NOT going to be buying a DX10 card because of its rediclous wattage requirements, like you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card [...]"

Laf, OK, well first off, you will never need a 1000W PSU for any single GFX card; soon enough ATi or nVidia will start to employ power-saving technology of some sort. But that's not even the case at all: only top-of-the-line dual GFX cards with more than 2 HDs and DVD drives would require a 600W+ PSU. Your claim is ridiculous. The wattage "requirements" will never be that high; only the amp requirement will start to get higher.
 

Adamant

Distinguished
Mar 22, 2006
14
0
18,510
> s3anister wrote: "personally, i am NOT going to be buying a DX10 card because of its rediclous wattage requirements, like you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card [...]"

> IcY18 wrote: "laf, ok well first off you will never need a 1000W psu for any single gfx card [...] the wattage "requirements" will never be that high only the Amp requirement will start to get higher"

Amps and watts go hand in hand: if the PSU has more amps, then the watts will be higher as well. You are correct in saying that the next gen won't require a 1000-watt PSU for a single card, but for two next-gen cards it could come close to that if they are doing something like dual-PCB or dual-core. To get a rough idea of how many amps a PSU has on the 12V rail, you divide that rail's watts by 12. Of course, it is most important that most of the amps and watts be on the 12V line.

Cheers.
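That divide-by-12 rule of thumb can be sketched in a couple of lines (a rough estimate only: it assumes the rated wattage in question all sits on the 12V rail, which real PSUs don't guarantee; the 400 W figure below is just an illustrative example):

```python
def estimated_12v_amps(rail_watts: float) -> float:
    """Rough 12 V rail amperage from that rail's rated wattage (I = P / V)."""
    return rail_watts / 12.0

# e.g. a PSU rated for 400 W on its 12 V rail:
print(estimated_12v_amps(400))  # roughly 33.3 A
```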
 

K1LLTACULAR

Distinguished
Jul 12, 2006
31
0
18,530
> s3anister wrote: "personally, i am NOT going to be buying a DX10 card because of its rediclous wattage requirements, like you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card [...]"

> IcY18 wrote: "laf, ok well first off you will never need a 1000W psu for any single gfx card [...]"

> Adamant wrote: "Amps and watts go hand in hand. [...] To get a rough idea of how many amps a psu has you divide the watts by 12."

Might want to pay more attention in physics, as watts and amps are not directly proportional. Watts = Amps x Volts, so an increase in voltage will yield an increase in wattage but not necessarily an increase in amperes.

If an electrical system is 3,000,000 watts at 3,000,000 volts, that is only one ampere. And 6 watts at 3 volts works out to 2 amps.
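As a quick numerical check of Watts = Amps x Volts, rearranged for current (the figures are just the illustrative ones from this thread, not real hardware ratings):

```python
def amps(watts: float, volts: float) -> float:
    """Current in amperes from power and voltage (I = P / V)."""
    return watts / volts

print(amps(3_000_000, 3_000_000))  # 1.0 A
print(amps(6, 3))                  # 2.0 A
```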
 

Adamant

Distinguished
Mar 22, 2006
14
0
18,510
> K1LLTACULAR wrote: "Might want to pay more attention in physics as watts and amps are not directly proportional. Watts = Amps x Volts so an increase in voltage will yield an increase in wattage but not necessarily an increase in amperes."

I said it was to get a rough idea of how many amps it had. When it comes to PSUs, you can basically divide the watts on the 12V rail by 12 and figure out how many amps are on it. I wasn't trying to get into the entire formula because it isn't necessary when it comes to PSUs.
 

Cheese

Distinguished
Jan 15, 2005
122
0
18,680
I am an alien from the future. 3 years from now the 9000 series nvidia cards will come out along with Vista (it will be delayed Dec. 2006).

The 9000 series will require a 5000W PSU and will only fit in AGP2 slots. Also, they will support DirectX 9, 10, and 10.5.
They will also weigh 50lbs and require rails to hold them in the case.

AMD will be bought out by VIA and VIA will release Cyrix processors again. Intel will change their name to 3Dfx and release Voodoo CPUs. Why? Because they can.

After that, Israel will fire missiles into every country on Earth because they can, and World War III will start.

Linux will buy out Microsoft after Seattle is atomized.

There. NOW STFU ABOUT DX10
 

redmanc

Distinguished
Mar 11, 2006
59
0
18,630
OK, this is going way off topic; all I want is a straight answer to my 3 questions.

I'm not interested in watts, amps or bloody 2010. I'm talking about the next few months.
 

Cheese

Distinguished
Jan 15, 2005
122
0
18,680
How dare you use the word "bloody" to me. Cussing is not allowed on these forums you asshole.

I DEMAND your account be banned immediately as I am emailing all the moderators as we speak. I am also going to use the Death Star on your home planet AND write a letter to my Galactic Congressman.

I demand immediate justice!

BTW DX10 is gonna rock.
 

Fagaru

Distinguished
Jun 15, 2006
238
0
18,680
Why? I am sick of hearing about Conroe... "I have an FX-55, but I'm gonna get an E6300 when Conroe comes out and I'm hoping it will be better... I have more crap than brains"... stupid n00bs.
 

qwazzy

Distinguished
Jun 27, 2006
649
0
18,990
> Cheese wrote: "I am an alien from the future. 3 years from now the 9000 series nvidia cards will come out along with Vista [...] Linux will buy out Microsoft after Seattle is atomized. There. NOW STFU ABOUT DX10"

Hehe. I live in Seattle. lol
 
> 1. When is DX10 suppose to be coming out ??

When Windows Vista comes out. DX10 will not be supported in Windows XP.

> 2. Would a 7900GT card not be able to play DX10 stuff, with a bios/driver update, surely it must.

Any current DX9 card should be able to play a DX10 game, because those games will also be compatible with DX9. It will be a few years before DX9 support is dropped. Support for DX8.1 has only recently started to drop; Oblivion is a prime example, as you must have a DX9 card to play it (though there are hacks to get a DX8.1 card to run Oblivion).

But other recent games still support DX7 cards. Star Wars: Empire at War is an example. I loaded the demo onto my IBM T40 laptop with an integrated Radeon 7500 IGP just for the hell of it. It is definitely playable, but looks better on a DX9 GPU (naturally).

Support for DX9 cards won't begin to die until DX11 becomes available or is announced. Therefore, a 7900GT or X1900XT will last you a few years, or until you decide they are too slow for your "needs".

Getting new hardware is sometimes great, and sometimes it's not. The nVidia GeForce FX series was the first line of DX9 cards released. However, they turned out to be really, really bad at DX9 games; basically, the GeForce FX series was one of the worst products ever inflicted on gamers. I think the Radeon 9700 had some initial problems too, but that could have just been rumors.

> 3. When should i update my rig, not bothered if its 2,6 or 9 months ?

Update your rig whenever you think the performance is no longer good enough for you. Upgrade to either the AM2 Athlon or Conroe. Conroe performs better, but if you hate Intel then go for the Athlon. The longer you wait, the more likely a more powerful CPU model will come out. If you can wait until Q3 2007, then maybe by then the desktop K8L Athlon will be out. The 45nm Conroe CPUs should also be out by Q3 2007, unless Intel runs into problems.
 

redmanc

Distinguished
Mar 11, 2006
59
0
18,630
Thanks for your comments. I am an Intel man and do prefer them to AMDs. I have a P4 2.53GHz chip at the moment, OC'd to 3.0GHz, so yes, I need to upgrade. The only reason I'm waiting for Conroe is to see what the price is; if it's too unreasonable then I'll be getting a Pentium D, which I presume will drop in price. I've decided I'm going to wait until November time (I get a pay rise). At that point I'll see what I can buy at the best price, and we'll have more info regarding Vista and what's recommended to run it.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
> redmanc wrote: "Ok everybody is going on about DX10, is it really such a big thing ?? [...] 1. When is DX10 suppose to be coming out ?? 2. Would a 7900GT card not be able to play DX10 stuff, with a bios/driver update, surely it must. 3. When should i update my rig, not bothered if its 2,6 or 9 months ?"
1. DirectX 10, as mentioned, will come out with the release of Windows Vista, projected, at this point, to be sometime in the spring of 2007. (I think) it will not be compatible with Windows XP of any form.

2. Compatibility with DirectX 10 will be just as with older versions of DirectX: DX8 cards like the GeForce 4 Ti still work just fine, they just don't provide support for DirectX 9 features, namely Shader Model 2.0 or 3.0. In this way, a 7900GT will work fine, with no effort, in Vista/DX10 (it does already with the beta versions). It will work just as before with all DirectX 9.0c and earlier games, as well as with some DirectX 10 games. It will, however, not work with any settings that employ the GRAPHICAL part of DirectX 10, namely Shader Model 4.0. This is just like the GeForce 4 Ti, a DirectX 8.0/SM 1.3 part, not working with Oblivion, which uses DirectX 9/SM 2.0 for graphics. Thus far, though, the only planned DirectX 10-graphics game is the PC port of Halo 2.

3. That depends entirely on what you plan on doing with it. As mentioned, Halo 2 for PC is pretty much the only game that will have use for a DirectX 10 card in 2007, and possibly even in 2008. And in most cases, one could very likely live without DirectX 10 graphics support for years, just like one can live without DirectX 9.0c cards today (and hence HDR support in many games). Personally, I might not upgrade now, but perhaps in early 2007, when there's a clearer view of Vista. (pun intended)
> s3anister wrote: "personally, i am NOT going to be buying a DX10 card because of its rediclous wattage requirements, like you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card [...]"
A 1,000-watt PSU won't be required, much to the disappointment of PC Power & Cooling, makers of what appears to be the world's only ATX/BTX-compatible 1-kilowatt PSU.

Single GPUs won't require anything truly different when it comes to power: about 100-120 watts for a top-of-the-line GPU. Of course, dual-GPU boards are another matter, and using two of those at once...

Let's also not forget that there will be more than absurdly expensive DX10 cards out. I'd expect both ATi and nVidia to hold off their next series until they can get DX10 support. So it would be the GeForce 8/Radeon X2000, or possibly the GeForce 9/Radeon X3000, that brings support. And you can rest assured that there will be "600" (mid-range) and even "300" (low-end) variations. After all, nVidia has sold more 6600GTs than any other card in recent memory.
> IcY18 wrote: "laf, ok well first off you will never need a 1000W psu for any single gfx card [...] the wattage "requirements" will never be that high only the Amp requirement will start to get higher"
How can the amp requirements rise without the wattage requirements rising? Since video cards have moved to using the +12V rail almost exclusively (instead of the +5V rail), I highly doubt they'd reverse direction and use a lower voltage but higher amperage.
 

theaxemaster

Distinguished
Feb 23, 2006
375
0
18,780
> Adamant wrote: "I said to get a rough idea of how many amps it had. When it comes to psu's you can basically divide the watts on the 12v rail by 12 and figure out how many amps are on it."

Not commenting on your math, just your logic: most 5V rails I've seen run somewhere around 30-35 amps, so you're discounting a whole 150+ watts in your equation.

That said, both ATi and nVidia have said that the upcoming generation of chips will be the most power-hungry cards to date, and these will be the "Vista" cards. I don't know about you, but I don't want to need a 750W PSU just for that. Full DX10 compliance won't be necessary for at least a year unless you just want Halo 2. So get the 7900GT now, and upgrade to the generation after the G80/Rxxx that's coming in the next few months.
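The per-rail point above can be sketched by summing each rail's contribution with P = V x I (the rail figures below are hypothetical, just in the ranges mentioned in this thread, not a real PSU's label):

```python
def rail_watts(volts: float, amps: float) -> float:
    """Wattage one PSU rail can deliver (P = V * I)."""
    return volts * amps

# Hypothetical PSU: a 12 V rail at 30 A plus a 5 V rail at 32 A.
total = rail_watts(12, 30) + rail_watts(5, 32)
print(total)  # 520 W combined; the 5 V rail alone accounts for 160 W
```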
 

redmanc

Distinguished
Mar 11, 2006
59
0
18,630
OK, I was asking about the GC, but it's not just about upgrading that; I'm upgrading my whole rig: mobo, CPU, GC and mem.

Like I say, there are major developments regarding Vista, DX10 and new CPUs. I just want people's opinions on when best to do it. I think I'll be getting the 7900GT card as it's good value, but it's a matter of when?

Like I said, November may be a good time for me!
 

ethel

Distinguished
May 20, 2006
1,130
0
19,290
If your current card is not good enough to play the games you want as prettily as you'd like, buy a GFX card now. Vista is gonna be at least Feb 2007; what's the point of waiting that long?

IMO there's no point in waiting for the Next Big Thing with PCs, unless it's just around the corner (like Conroe).
 
> nottheking wrote: "It will, however, not work with any settings that employ the GRAPHICAL part of DirectX 10, namely Shader Model 4.0. [...] Thus far, though, the only planned DirectX-10 graphics game is the PC port of Halo 2."

However, the benefits in Crysis may be enough to make the DX10 bonuses worth it. Not required, but definitely enjoyed. I think Halo 2 may be the only DX10-only app (by design) until about 2008. Even UT2K7 will supposedly have a DX9 fp24 fallback, so not even just DX9.0c, but about the same level of fallback as Oblivion (although I suspect it might play worse on an R9600 or X700 than Oblivion does).

> nottheking wrote: "That would depend all on what you plan on doing with it. As mentioned, Halo 2 for PC is pretty much the only game that will have use for a DirectX 10 card in 2007, and possibly even in 2008."

Use and need are different animals; like I mentioned, UT2K7 and Crysis will have a use for it, just not a 'NEED'.

> nottheking wrote: "How can the amp requirements raise without the wattage requirements? [...] I highly doubt they'd reverse direction and use a lower voltage, but higher amperage."

I agree; if anything, they'd move to those wall-socket solutions we've seen from ASUS.
 

Adamant

Distinguished
Mar 22, 2006
14
0
18,510
> theaxemaster wrote: "Not commenting on your math, just your logic. Most 5v rails I've seen run somewhere around 30-35 amps, so you're discounting a whole 150+ watts in your equation."

My logic is as follows: graphics cards draw their current from the 12V rail. So when it comes to graphics cards, I don't care if the 5V rail has 1000 watts on it; I was pointing out the watts and amps that matter when it comes to graphics cards.
 

clue69less

Splendid
Mar 2, 2006
3,622
0
22,780
> "Use and need are different animals, like I mentioned UT2K7 and Crysis will have a use for it, just not a 'NEED'."

My problem is that my Need Dog keeps trying to overeat and then my Use Dog has to work all night to digest it all. Then I get to clean up all the dog poop.