
DX10 !

July 15, 2006 10:36:12 PM

OK, everybody is going on about DX10, but is it really such a big thing?

I'm looking at maybe updating my PC when Conroe is available, which I believe is very soon, and with it my GC.

I was looking at buying a 7900GT 256MB, but I'm not so sure now, the way everybody is going on about DX10.

1. When is DX10 supposed to be coming out?
2. Would a 7900GT card be able to play DX10 stuff with a BIOS/driver update? Surely it must.
3. When should I update my rig? I'm not bothered if it's 2, 6, or 9 months.


July 15, 2006 10:58:27 PM

Conroe is coming out on the 27th. It is a CPU that is most definitely worth upgrading for (I know I am going to), but DX10 cards probably won't be out until Vista comes out (the only OS that will be able to use DX10 fully). DirectX 10 is supposed to be revolutionary, though I don't know how "revolutionary" it will really be. If I were you, I would still buy a new Conroe rig now and hold off on a video card until the next generation of video cards comes out; if they really aren't worth it, then at least the current-gen cards will be a lot cheaper. Unless your current video card is really dated, i.e. older than an nVidia 6600 series or an ATI X800 series, in which case upgrade to something cheap like an nVidia 7300-7600 (I'm not too familiar with current mid-range ATI GPUs).
July 15, 2006 11:10:20 PM

Even if it is revolutionary, nothing will be able to take full advantage of it for some time. I mean, look at now: it has been three generations since DX9 came out, and only in the last year, maybe less, has it really been utilized. I say, if you can, wait until Vista and the DX10 cards actually do come out before you buy. Then buy a current-generation card.

This course of action will be great, because the current crop of DX9 cards kicks butt and will be, like dmdallas said, much cheaper. Not to mention, most games that do use DX10 will have to have DX9 backwards compatibility; otherwise they would be totally alienating most of their user base, i.e. everyone who can't afford or doesn't want to upgrade to Vista.

Short answer: Wait 9 months, then make your decision. Research everything in the meantime.
July 16, 2006 12:56:18 AM

1. DX10 is Vista only. It'll be available when Vista comes out.
2. A 7900GT will run DX10-era games fine, but it will not be able to use some of the new features that DX10 offers.
3. You can get a Conroe with 2GB of RAM and a good video card, and with this you're set until games need Vista and DX10, which could be a couple of months, a year, or two years from now.
July 16, 2006 2:19:29 AM

Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.
July 16, 2006 3:03:33 AM

Quote:
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.


Laf, OK, well first off, you will never need a 1000W PSU for any single GFX card; soon enough ATi or nVidia will start to employ power-saving technology of some sort. But that's not even the point: only top-of-the-line dual GFX cards with more than 2 HDs and DVD drives would require a 600W+ PSU... Your claim is ridiculous. The wattage "requirements" will never be that high; only the amp requirement will start to get higher.
July 16, 2006 3:31:01 AM

The new DirectX 10 cards should consume no more power than current SLI and Crossfire setups.
July 16, 2006 3:33:16 AM

Quote:
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.


Laf, OK, well first off, you will never need a 1000W PSU for any single GFX card; soon enough ATi or nVidia will start to employ power-saving technology of some sort. But that's not even the point: only top-of-the-line dual GFX cards with more than 2 HDs and DVD drives would require a 600W+ PSU... Your claim is ridiculous. The wattage "requirements" will never be that high; only the amp requirement will start to get higher.

Amps and watts go hand in hand: if the PSU has more amps, then the watts will be higher as well. You are correct in saying that the next gen won't require a 1000-watt PSU for a single card, but for two next-gen cards it could come close to that if they are doing something like dual PCB or dual core. To get a rough idea of how many amps a PSU has on the 12V rail, you divide the watts by 12. Of course, it is most important that most of the amps and watts be on the 12V line.

Cheers.
July 16, 2006 4:07:49 AM

Quote:
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.


Laf, OK, well first off, you will never need a 1000W PSU for any single GFX card; soon enough ATi or nVidia will start to employ power-saving technology of some sort. But that's not even the point: only top-of-the-line dual GFX cards with more than 2 HDs and DVD drives would require a 600W+ PSU... Your claim is ridiculous. The wattage "requirements" will never be that high; only the amp requirement will start to get higher.

Amps and watts go hand in hand: if the PSU has more amps, then the watts will be higher as well. You are correct in saying that the next gen won't require a 1000-watt PSU for a single card, but for two next-gen cards it could come close to that if they are doing something like dual PCB or dual core. To get a rough idea of how many amps a PSU has on the 12V rail, you divide the watts by 12. Of course, it is most important that most of the amps and watts be on the 12V line.

Cheers.

You might want to pay more attention in physics, as watts and amps are not directly proportional. Watts = Amps x Volts, so an increase in voltage will yield an increase in wattage but not necessarily an increase in amperes.

If an electrical system is 3,000,000 watts at 3,000,000 volts, that is only one ampere. 3 watts at 6 volts would be only half an amp.
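For anyone who wants to sanity-check the numbers being thrown around, here is the same arithmetic worked out with throwaway values (none of these are real PSU specs, just illustrations of the formula):

Code:
# I = P / V: amps are just watts divided by volts
def amps(watts, volts):
    return watts / volts

print(amps(3000000, 3000000))  # 1.0 A (the extreme example above)
print(amps(3, 6))              # 0.5 A (half an amp)
print(amps(450, 12))           # 37.5 A (the divide-by-12 rule of thumb on a 450 W 12 V rail)
print(amps(150, 5))            # 30.0 A (a 30 A 5 V rail carries roughly 150 W)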
July 16, 2006 4:38:23 AM

It boils down to this: when there are killer DX10 games available, we'll all buy DX10 hardware. Until then, these threads need to die...

kthx
July 16, 2006 4:49:42 AM

Quote:
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.


Laf, OK, well first off, you will never need a 1000W PSU for any single GFX card; soon enough ATi or nVidia will start to employ power-saving technology of some sort. But that's not even the point: only top-of-the-line dual GFX cards with more than 2 HDs and DVD drives would require a 600W+ PSU... Your claim is ridiculous. The wattage "requirements" will never be that high; only the amp requirement will start to get higher.

Amps and watts go hand in hand: if the PSU has more amps, then the watts will be higher as well. You are correct in saying that the next gen won't require a 1000-watt PSU for a single card, but for two next-gen cards it could come close to that if they are doing something like dual PCB or dual core. To get a rough idea of how many amps a PSU has on the 12V rail, you divide the watts by 12. Of course, it is most important that most of the amps and watts be on the 12V line.

Cheers.

You might want to pay more attention in physics, as watts and amps are not directly proportional. Watts = Amps x Volts, so an increase in voltage will yield an increase in wattage but not necessarily an increase in amperes.

If an electrical system is 3,000,000 watts at 3,000,000 volts, that is only one ampere. 3 watts at 6 volts would be only half an amp.

I said it was to get a rough idea of how many amps it had. When it comes to PSUs, you can basically divide the watts on the 12V rail by 12 and figure out how many amps are on it. I wasn't trying to get into the entire formula, because it isn't necessary when it comes to PSUs.
July 16, 2006 12:05:45 PM

I am an alien from the future. 3 years from now the 9000 series nvidia cards will come out along with Vista (it will be delayed Dec. 2006).

The 9000 series will require a 5000W PSU and will only fit in AGP2 slots. Also, they will support DirectX 9, 10, and 10.5.
They will also weigh 50lbs and require rails to hold them in the case.

AMD will be bought out by VIA and VIA will release Cyrix processors again. Intel will change their name to 3Dfx and release Voodoo CPUs. Why? Because they can.

After that Israel will fire missiles into every country on earth because they can and World War III will start.

Linux will buy out Microsoft after Seattle is atomized.

There. NOW STFU ABOUT DX10
July 16, 2006 1:31:43 PM

OK, this is going way off topic; all I want is a straight answer to my 3 questions.

I'm not interested in watts, amps, or bloody 2010. I'm talking about the next few months.
July 16, 2006 1:40:54 PM

How dare you use the word "bloody" to me. Cussing is not allowed on these forums you asshole.

I DEMAND your account be banned immediately as I am emailing all the moderators as we speak. I am also going to use the Death Star on your home planet AND write a letter to my Galactic Congressman.

I demand immediate justice!

BTW DX10 is gonna rock.
July 17, 2006 2:13:41 AM

Why? I am sick of hearing about Conroe... "I have an FX-55, but I'm gonna get an E6300 when Conroe comes out and I'm hoping it will be better... I have more crap than brains"... Stupid n00bs.
July 17, 2006 5:52:11 PM

Quote:
I am an alien from the future. 3 years from now the 9000 series nvidia cards will come out along with Vista (it will be delayed Dec. 2006).

The 9000 series will require a 5000W PSU and will only fit in AGP2 slots. Also, they will support DirectX 9, 10, and 10.5.
They will also weigh 50lbs and require rails to hold them in the case.

AMD will be bought out by VIA and VIA will release Cyrix processors again. Intel will change their name to 3Dfx and release Voodoo CPUs. Why? Because they can.

After that Israel will fire missiles into every country on earth because they can and World War III will start.

Linux will buy out Microsoft after Seattle is atomized.

There. NOW STFU ABOUT DX10


Hehe. I live in Seattle. lol
July 17, 2006 7:21:11 PM

Quote:

1. When is DX10 suppose to be coming out ??


When Windows Vista comes out. DX10 will not be supported in Windows XP.

Quote:

2. Would a 7900GT card not be able to play DX10 stuff, with a bios/driver update, surely it must.


Any current DX9 card should be able to play a DX10 game, because those games will also be compatible with DX9. It will be a few years before DX9 support is dropped; support for DX8.1 has only recently started to drop. Oblivion is a prime example: you must have a DX9 card to play it (though there are hacks to get a DX8.1 card to run Oblivion).

But other recent games still support DX7 cards. Star Wars: Empire at War is an example. I loaded the demo onto my IBM T40 laptop with an integrated Radeon 7500 IGP just for the hell of it. It is definitely playable, but it looks better on a DX9 GPU (naturally).

Support for DX9 cards won't begin to die until DX11 becomes available or is announced. Therefore, a 7900GT or X1900XT will last you a few years, or until you decide they are too slow for your "needs".

Getting new hardware right away is sometimes great, and sometimes it's not. The nVidia GeForce FX series was the first line of DX9 cards released. However, they turned out to be really, really bad at DX9 games; basically, the GeForce FX series was the worst product ever inflicted on gamers. I think the Radeon 9700 had some initial problems too, but that could have just been rumors.

Quote:

3. When should i update my rig, not bothered if its 2,6 or 9 months ?


Update your rig whenever you think the performance is no longer good enough for you. Upgrade to either the AM2 Athlon or Conroe. Conroe performs better, but if you hate Intel then go for the Athlon. The longer you wait, the more likely it is that a more powerful CPU model will have come out. If you can wait until Q3 2007, then maybe by then the Athlon K8L desktop CPU will be out. The 45nm Conroe CPUs should also be out by Q3 2007, unless Intel runs into problems.
July 17, 2006 9:27:01 PM

Thanks for your comments. I am an Intel man and do prefer them to AMD. I have a P4 2.53GHz chip at the moment, OC'd to 3.0GHz, so yes, I need to upgrade. The only reason I'm waiting for Conroe is to see what the price is; if it's too unreasonable then I'll be getting a Pentium D, which I presume will drop in price. I've decided I'm going to wait until November (I get a pay rise then). At that point I'll see what I can buy at the best price, and we'll have more info regarding Vista and what's recommended to run it.
July 17, 2006 9:49:51 PM

Quote:
OK, everybody is going on about DX10, but is it really such a big thing?

I'm looking at maybe updating my PC when Conroe is available, which I believe is very soon, and with it my GC.

I was looking at buying a 7900GT 256MB, but I'm not so sure now, the way everybody is going on about DX10.

1. When is DX10 supposed to be coming out?
2. Would a 7900GT card be able to play DX10 stuff with a BIOS/driver update? Surely it must.
3. When should I update my rig? I'm not bothered if it's 2, 6, or 9 months.
1. DirectX 10, as mentioned, will come out with the release of Windows Vista, projected, at this point, to be sometime in the spring of 2007. (I think) it will not be compatible with Windows XP in any form.

2. Compatibility with DirectX 10 would be just as with older versions of DirectX: DX8 cards like the GeForce 4 Ti still work just fine, they just don't provide support for DirectX 9 features, namely Shader Model 2.0 or 3.0. In this way, a 7900GT will work fine, with no effort, in Vista/DX10 (it does already with the beta versions). It will work just as before with all DirectX 9.0c and earlier games, as well as with some DirectX 10 games. It will, however, not work with any settings that employ the GRAPHICAL part of DirectX 10, namely Shader Model 4.0. This would be just like the GeForce 4 Ti, a DirectX 8.0/SM 1.3 part, not working with Oblivion, which uses DirectX 9/SM 2.0 for graphics; see the rough sketch after this list. Thus far, though, the only planned DirectX 10 graphics game is the PC port of Halo 2.

3. That would depend entirely on what you plan on doing with it. As mentioned, Halo 2 for PC is pretty much the only game that will have a use for a DirectX 10 card in 2007, and possibly even in 2008. And in most cases, one could very likely live without DirectX 10 graphics support for years, just like one can live without DirectX 9.0c cards today (and hence HDR support in many games). Personally, I might not upgrade now, but perhaps in early 2007, when there's a clearer view of Vista (pun intended).
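To make the fallback idea concrete, here is a purely illustrative sketch (hypothetical logic and thresholds, not taken from any real engine) of how a game can pick a render path based on the highest shader model the card reports:

Code:
# Hypothetical sketch: choose a render path from the card's top shader model.
def pick_render_path(max_shader_model):
    if max_shader_model >= 4.0:
        return "DX10 path (SM 4.0 effects enabled)"
    if max_shader_model >= 2.0:
        return "DX9 fallback (SM 2.0/3.0, no SM 4.0-only effects)"
    return "DX8 fallback, or an 'unsupported hardware' message"

print(pick_render_path(3.0))  # roughly what a 7900GT-class card would get
print(pick_render_path(4.0))  # what a true DX10 part would get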
Quote:
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.

A 1,000-watt PSU won't be required, much to the disappointment of PC Power & Cooling, makers of what appears to be the world's only ATX/BTX-compatible 1-kilowatt PSU.

Single GPUs won't require anything truly different when it comes to power: about 100-120 watts for a top-of-the-line GPU. Of course, dual-GPU boards are another matter, and using two of those at once...

Let's also not forget that there will be more than just absurdly expensive DX10 cards out. I'd expect both ATi and nVidia to hold off their next series until they can get DX10 support, so it would be the GeForce 8/Radeon X2000, or possibly the GeForce 9/Radeon X3000, that brings support. And you can rest assured that there will be "600" (mid-range) and even "300" (low-end) variations. After all, nVidia has sold more 6600GTs than any other card in recent memory.

Quote:
Laf, OK, well first off, you will never need a 1000W PSU for any single GFX card; soon enough ATi or nVidia will start to employ power-saving technology of some sort. But that's not even the point: only top-of-the-line dual GFX cards with more than 2 HDs and DVD drives would require a 600W+ PSU... Your claim is ridiculous. The wattage "requirements" will never be that high; only the amp requirement will start to get higher.

How can the amp requirements rise without the wattage requirements rising? Since video cards have moved to using the +12V rail almost exclusively (instead of the +5V rail), I highly doubt they'd reverse direction and use a lower voltage but higher amperage.
July 17, 2006 9:54:53 PM

Quote:
I said it was to get a rough idea of how many amps it had. When it comes to PSUs, you can basically divide the watts on the 12V rail by 12 and figure out how many amps are on it. I wasn't trying to get into the entire formula, because it isn't necessary when it comes to PSUs.


Not commenting on your math, just your logic. Most 5V rails I've seen run somewhere around 30-35 amps, so you're discounting a whole 150+ watts in your equation.

That said, both ATi and nVidia have said that the upcoming generation of chips will be the most power-hungry cards to date, and these will be the "Vista" cards. I don't know about you, but I don't want to have to have a 750W PSU just for that. Full DX10 compliance won't be necessary for at least a year unless you just want Halo 2. So get the 7900GT now, and upgrade to the generation after the G80/Rxxx that's coming in the next few months.
July 17, 2006 10:21:04 PM

OK, I was asking about the GC, but it's not just about upgrading that; I'm upgrading my whole rig: mobo, CPU, GC, and memory.

Like I say, there are major developments regarding Vista, DX10, and new CPUs. I just want people's opinions on when it's best to do it. I think I'll be getting the 7900GT card as it's good value, but it's a matter of when.

Like I said, November may be a good time for me!
July 17, 2006 10:59:48 PM

If your current card is not good enough to play the games you want to as prettily as you'd like, buy a gfx card now - Vista is gonna be at least Feb 2007 - what's the point of waiting that long?

IMO there's no point in waiting for the Next Big Thing with PCs, unless it's just around the corner (like Conroe).
July 17, 2006 11:02:17 PM

Quote:
It will, however, not work with any settings that employ the GRAPHICAL part of DirectX 10, namely Shader Model 4.0. This would be just like the GeForce 4 Ti, a DirectX 8.0/SM 1.3 part, not working with Oblivion, which uses DirectX 9/SM 2.0 for graphics. Thus far, though, the only planned DirectX-10 graphics game is the PC port of Halo 2.


However, the benefits in Crysis may be enough to make the DX10 bonuses worth it. Not required, but definitely enjoyed. I think Halo 2 may be the only DX10-only app (by design) until about 2008. Even UT2K7 will supposedly have a DX9 FP24 fallback, so not even just DX9.0c, but about the same level of fallback as Oblivion (although I suspect it might play worse on an R9600 or X700 than Oblivion does).


Quote:
That would depend all on what you plan on doing with it. As mentioned, Halo 2 for PC is pretty much the only game that will have use for a DirectX 10 card in 2007, and possibly even in 2008.


Use and need are different animals; like I mentioned, UT2K7 and Crysis will have a use for it, just not a 'NEED'.

Quote:
How can the amp requirements raise without the wattage requirements? Since video cards have moved to using the +12v rail almost exclusively, (instead of the +5v rail) I highly doubt they'd reverse direction and use a lower voltage, but higher amperage.


I agree; if anything, they'd move to those wall-socket solutions we've seen from ASUS.
July 18, 2006 8:32:57 PM

Quote:
I said it was to get a rough idea of how many amps it had. When it comes to PSUs, you can basically divide the watts on the 12V rail by 12 and figure out how many amps are on it. I wasn't trying to get into the entire formula, because it isn't necessary when it comes to PSUs.


Not commenting on your math, just your logic. Most 5V rails I've seen run somewhere around 30-35 amps, so you're discounting a whole 150+ watts in your equation.

That said, both ATi and nVidia have said that the upcoming generation of chips will be the most power-hungry cards to date, and these will be the "Vista" cards. I don't know about you, but I don't want to have to have a 750W PSU just for that. Full DX10 compliance won't be necessary for at least a year unless you just want Halo 2. So get the 7900GT now, and upgrade to the generation after the G80/Rxxx that's coming in the next few months.

My logic is as follows: graphics cards draw their current from the 12V rail. So when it comes to graphics cards, I don't care if the 5V rail has 1000 watts on it. I was pointing out the watts and amps that matter when it comes to graphics cards.
July 18, 2006 9:25:49 PM

Quote:
Use and need are different animals, like I mentioned UT2K7 and Crysis will have a use for it, just not a 'NEED'.


My problem is that my Need Dog keeps trying to overeat and then my Use Dog has to work all night to digest it all. Then I get to clean up all the dog poop.
July 18, 2006 10:19:03 PM

Won't DX10 cards come out before Vista?
July 18, 2006 11:05:36 PM

Quote:
Won't DX10 cards come out before Vista?


I have no idea. But even if they did, how much advantage would they have?
July 19, 2006 12:30:14 AM

Well how much advantage did the R9700 have over the GF4/R8500?

GF6 over FX?

Seriously it's not just about the DX support but the raw horsepower of the new cards.
July 19, 2006 1:21:07 AM

Quote:
Well how much advantage did the R9700 have over the GF4/R8500?

GF6 over FX?

Seriously it's not just about the DX support but the raw horsepower of the new cards.


That's why I asked. I haven't seen any of the new cards, or any benchmarks. What have you seen?
July 19, 2006 1:36:03 AM

Well the only DX10 card I have seen is not impressive, and I doubt it'd challenge a GF7600 or X1650 (maybe an X1600).

However, based on the specs we've got to chat about, and the statements of ATi and nVidia, I have a feeling that we're in store for two nice boosts: one when the cards initially launch and just give us more raw fill rate, etc., thanks to more components and faster speeds, and then a second boost once the DX10 benefits kick in.

Hey I could be wrong, but it's unlikely, since there'd be far less motivation to fork over money if an R600/G80 can't beat an aging GF7950GX2.
July 19, 2006 1:49:50 AM

Quote:
Hey I could be wrong, but it's unlikely, since there'd be far less motivation to fork over money if an R600/G80 can't beat an aging GF7950GX2.


Sure, my assumption is that the next gen will be a solid step up. But I've been a little let down a time or two in the past when expecting great things so I'm progressing in my research to get a PhD in cynicism.
July 19, 2006 2:11:19 AM

Quote:
Well the only DX10 card I have seen is not impressive, and I doubt it'd challenge a GF7600 or X1650 (maybe an X1600).
What card is that? :?
July 19, 2006 5:49:06 PM

Quote:

Sure, my assumption is that the next gen will be a solid step up. But I've been a little let down a time or two in the past when expecting great things so I'm progressing in my research to get a PhD in cynicism.


Oh, I agree; to me there have been a few letdowns, and I suspect that the R600/G80 will not be giant leaps in performance yet.

It's funny, I think we are let down by the comparison to our current expectations. The R9700P wasn't that impressive compared to the GF4 Ti with weak titles at 800x600 or 1024x768, but as the resolutions and workloads went up, the R9700P shone with its greater power on both the core and the memory. Same thing with the X800/GF6800 and the GF7800/X1800.

True DX10 titles (built for DX10, with no fallbacks) will likely run like crap on the new hardware, because by then they will be pushing the envelope of some other, newer cards. But the new cards should perform better than current cards in DX9; how much better, who knows. Probably like 25% if history keeps its performance trend, but we could get some surprises, and I think the biggest unknown, and bonus, is how they will handle things once they can benefit from DX10 optimizations.

The G80 might be the next FX, but like the high-end FXs, people probably won't care until we are 2-3 years past its launch, when the hybrid design might finally show a weakness. Even then, I doubt it'll be the disaster that the FX was, because it should do very well with current titles and not require tricks to perform better. I think no matter what, both will be solid performers in DX9 and should outperform anything we have to date.
July 19, 2006 5:54:26 PM

Here's a hint, and a scavenger hunt.

Start @ HKEPC, then look for the 3rd thing you see about DX10. :twisted:
July 19, 2006 8:16:00 PM

Quote:
I think no matter what both will be solid performer in DX9 and should outperform anything we have to date.


I'll start the process of building patience. "I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%."
July 19, 2006 11:18:52 PM

I would never promise such a thing, because if it's cracker-jack good, why wait if it's in your sweet spot?

Of course for me it'd have to be a mobile solution we're talking about.

I bought the R9600P the first week because it perfectly fit my needs at the time, same with pretty much everything I buy that doesn't require financing.
July 20, 2006 1:05:05 AM

Quote:
I would never promise such a thing, because if it's cracker-jack good, why wait if it's in your sweet spot?


My promise was made a bit as a joke to show my wife. I put together three AMD rigs in the last year, one of them with XFired 1900s, another with a single 1900, and the last with an 1800. Along the way, our large male Golden vaporized my laptop and I had to replace it. I like to stagger PC purchases, but the needs dictated the purchases. So I'm already queuing up to replace the home office PC with a Conroe, and part of the spousal negotiation process has involved commitments on my part regarding GPUs... And that's fine - I don't upgrade often and do not chase the bleeding edge. When I do upgrade, I try to get very good performance per dollar, and thus I'm planning to be patient on the Conroe rig because I hope to find a decent deal on mobo/CPU/RAM. I don't really expect that to be happening for something like 4 to 6 months - maybe more. I probably won't find the time to fully tweak the three newish AMD rigs by then anyway, as I am an ultra patient overclocker.

Quote:
I bought the R9600P the first week because it perfectly fit my needs at the time, same with pretty much everything I buy that doesn't require financing.


For sure. I've done the same thing. My current digital camera had barely settled onto the shelf when I nabbed it and it has turned out to be super. I'm at almost 10K images and it's barely broken in - not that the number of images is the key quality issue but it does show that I like it.
July 20, 2006 1:36:24 AM

Quote:
It boils down to this: when there are killer DX10 games available, we'll all buy DX10 hardware. Until then, these threads need to die...

kthx


That's all anyone really needs to know on this subject.

The mods should just delete all these threads until DX10 is even out!
July 20, 2006 4:46:01 AM

Quote:

My promise was made a bit as a joke to show my wife.


Yeah, and I did get that; mine was more of an 'oh no no no, can't promise such a thing for my precious'. But reading it now, it does take on a more serious tone without smileys. That's what happens when you write quickly at work. :oops: 

Quote:
For sure. I've done the same thing. My current digital camera had barely settled onto the shelf when I nabbed it and it has turned out to be super. I'm at almost 10K images and it's barely broken in - not that the number of images is the key quality issue but it does show that I like it.


Yep, I know the feeling. And who says taking a lot of pics isn't the key to quality? It's like the infinite number of monkeys using an infinite number of cameras eventually taking the best picture ever. :mrgreen:

OK, now that's funnier. Anywhoo, yeah, I know the feeling. I'm currently debating what to get next for my digital SLR, and I'm really having trouble because I have free access to a Kodak 14n, so my motivation level is low. But I have a feeling I'd take hundreds of pictures the first weekend I finally do buy, just to test all the new features and compare to what I'm now used to, etc. It was like OCing the R9600P just for the fun of 'I wonder what it can do?'.

Anywhoo, the main thing is that people find their comfort zone, because we can say an X1900XT/GF7900GT is the best buy, but perhaps for their level of commitment or budget they really should be getting an X1800GTO/GF7600GT; or maybe waiting for DX10 is pointless for them, since the mid-range might not fully ship until next spring (although there have been rumours from both of full lineups within 2 months of launch, I wouldn't put money on that).
July 20, 2006 5:29:37 AM

Quote:
And who says taking alot of pics isn't the key to quality, it's like the infinite number of moneys using an infinite number of cameras will eventually take the best picture ever? :mrgreen:


Yea, practice may not make perfect in my case but at least I get to eat a bunch of bananas!
July 20, 2006 5:43:16 AM

Dang, I didn't notice until just now I misspelled MONKEY! :oops: 

BTW, the way I look at it, if I take 3 pictures of the same thing, maybe one of them will be a keeper, and thanks to digital, erase the others. 8)

Still, when my girlfriend took 10 pictures of the same rooster on the first day of our vacation in France with a FILM camera, I did say WTF?!? when I got them back from the lab. 8O At first I thought we got double prints. :lol: 
July 23, 2006 11:51:17 PM

Quote:
However, the benefits in Crysis may be enough to make the DX10 bonuses worth it. Not required, but definitely enjoyed. I think Halo 2 may be the only DX10-only app (by design) until about 2008. Even UT2K7 will supposedly have a DX9 FP24 fallback, so not even just DX9.0c, but about the same level of fallback as Oblivion (although I suspect it might play worse on an R9600 or X700 than Oblivion does).

Use and need are different animals; like I mentioned, UT2K7 and Crysis will have a use for it, just not a 'NEED'.

I agree; if anything, they'd move to those wall-socket solutions we've seen from ASUS.

Well, I did actually pause and think about that word quite a bit; originally, I was going to put "need," but then I realized that, to date, 99.999% of games only "need" a DirectX 8.0 card; Oblivion is the only major title that I know of, to date, that even needs a DirectX 9 card!

I wasn't aware of Crysis, though; I'm not sure what use it might have for SM 4.0.

Oh, and I think those "wall socket" solutions are perhaps a stupid idea. Yes, it circumvents the need for a separate processor, but those things are notoriously unstable and fragile. I have enough cords running around my desk already...

Quote:
Won't DX10 cards come out before Vista?

I personally have no clue. I'm not even sure if R600 and G80 will be actual DirectX 10 hardware. Because ATi's still got R580+ (X1950) and RV575 (X1700) in the wings, I'll guess they're holding off on R600 for now, and that nVidia might be as well, what with all their new focus on "quad-SLi."

Quote:
That's why I asked. I haven't seen any of the new cards, or any benchmarks. What have you seen?

All I've seen are rumors (albeit mildly strong ones) and nothing more. As you might've heard, R600 supposedly does move to a unified shader architecture, and will have perhaps 64 shader units; each has 2 ALUs, so it can act as either a single pixel shader or as a pair of vertex shaders (two units processing a vertex in one clock cycle). This will provide perhaps only a modest gain over the X1900 series in terms of pixel power, but a potential bonanza when it comes to increasing vertex power. Given that the R580+ uses GDDR4, the R600 will likely use the same. I'm not even willing to guess at anything else on that chip.
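Just to show why a unified pool sounds attractive on paper, here's a toy comparison with made-up numbers (none of these are real R600 figures), contrasting a fixed pixel/vertex split with one shared pool:

Code:
# Toy model only: how many shader jobs can start in one cycle.
def fixed_split(pixel_jobs, vertex_jobs, pixel_units=48, vertex_units=16):
    # dedicated units can only take their own kind of work
    return min(pixel_jobs, pixel_units) + min(vertex_jobs, vertex_units)

def unified(pixel_jobs, vertex_jobs, units=64):
    # a unified pool takes whatever work is queued, pixel or vertex
    return min(pixel_jobs + vertex_jobs, units)

# a vertex-heavy frame: the fixed split leaves pixel units idle
print(fixed_split(10, 60))  # 26 jobs started
print(unified(10, 60))      # 64 jobs started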

As for the G80, the prevailing opinion seems to be that it will be "32-pipeline." I'm not sure if they're heading for a unified architecture as well, though.

I'm fairly certain you've already heard the above; frankly, I can't find much out about anything here.

Quote:
Here's a hint, and scavenger hunt.

Start @ HKEPC then look for the 3rd thing you see about DX10. :twisted:

Some people do indeed forget that there are companies other than ATi and nVidia...
July 24, 2006 12:39:08 AM

Ugh, a sad excuse for a DX10 part. :lol: 
July 24, 2006 2:09:58 AM

Quote:
OK, everybody is going on about DX10, but is it really such a big thing?

I'm looking at maybe updating my PC when Conroe is available, which I believe is very soon, and with it my GC.

I was looking at buying a 7900GT 256MB, but I'm not so sure now, the way everybody is going on about DX10.

1. When is DX10 supposed to be coming out?
2. Would a 7900GT card be able to play DX10 stuff with a BIOS/driver update? Surely it must.
3. When should I update my rig? I'm not bothered if it's 2, 6, or 9 months.


1. The DX10 API is set for some time next spring, but DX10 GPUs are coming next month. Intel next month is set for an integrated and in-a-slot card release. nVidia in about 2 months will release the G80 with SM4.0. ATI will release its 80nm GPUs in November and its 65nm GPUs in December.

2. Yes, but you're talking about emulation, which will cut its performance down quite a bit. The 7900GT will only be about half as fast as the top DX10 GPUs, so if you emulate up to DX10 you're going to see performance only a little higher than an entry-level DX10 GPU.

3. I would wait, as waiting never hurts and the options only get higher performance as time goes on, but that's a question of whether your current system will do until then.
July 24, 2006 4:05:08 AM

Quote:
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.


I have read plenty of articles about this and I agree with you. This situation will be like Intel's: Intel's Prescott processors required a large amount of power, but as technology developed, the Intel Core 2 Duo (Conroe) came out, which improved performance while greatly lowering power requirements. Why couldn't the same happen for graphics cards?
July 24, 2006 4:40:13 AM

Quote:

I wasn't aware of Crysis, though; I'm not sure what use it might have for SM 4.0.


Well, the true use I'm not sure of, other than some great performance benefits: memory paging to allow for much larger textures, pre-caching, and greater use of efficient geometry modeling (which should help foliage the way geometric instancing did, except likely better, thus lush greenery without the hit of Oblivion). There's talk of soft self-shadowing and soft particles with diffused lighting (making Jacob's-ladder light effects through foliage/clouds/water/smoke more realistic), and I would presume greater use of true parallax mapping, finally. The addition of Direct Physics will likely add to DX10, but that has nothing to do with SM4.0, of course; Crysis is the VPU-physics darling, UT2K7 the PPU's.

Quote:
Oh, and I think those "wall socket" solutions are perhaps a stupid idea. Yes, it circumvents the need for a separate processor, but those things are notoriously unstable and fragile. I have enough cords running around my desk already...


True; I just think that if people are anal about the whole PSU quality and amperage-across-the-rails issue, this ensures that the cards have the power that ATi and nVidia expect, although as you say they can be dodgy, and of course they get a bit hot.

Quote:
As you might've heard, R600 supposedly does move to a unified shader architecture, and will have perhaps 64 shader units; each has 2 ALUs, so it can act as either a single pixel shader or as a pair of vertex shaders (two units processing a vertex in one clock cycle). This will provide perhaps only a modest gain over the X1900 series in terms of pixel power, but a potential bonanza when it comes to increasing vertex power.


Yeah, and that's the thing, because it's still only 16 ROPs on top of the increased pixel shader units (even with the current number of vertex units also acting on geometry, you still have the potential for 8 extra pixel shader units), and at an increased clock speed as well, so there's a lot of potential there. In situations that heavily load either pixel or vertex work, we probably won't see a major increase in average or max framerates, but we'll likely see a jump in the minimum fps and fewer of those crushing dips.

Quote:
As for the G80, the prevailing opinion seems to be that it will be "32-pipeline." I'm not sure if they're heading for a unified architecture as well, though.


I would think not; a hybrid makes sense from their long-term perspective. (IMO, expect them to do what they did with the GF6600GT, following ATi's lead somewhat and introducing the changes in their mid-level product, so likely the first refresh after the launch of the mid-range will be a unified design, if not that card itself; unlikely and riskier, but it could happen.) The hybrid will allow for a fixed pixel layout but unified geometry and vertex, which would be a wise choice for a hybrid if they aren't going full-out unified.

Quote:
Here's a hint, and scavenger hunt.
Some people do indeed forget that there are companies other than ATi and nVidia...


LOL!

Exactly; heck, some people even forget Intel. The first one to market will probably also be the biggest seller, but it'll also likely perform worse than all the others for that generation. Still, I wouldn't be surprised if we see the GMA965 outperforming the early GF7300s (before the GT refresh) and X1300HMs.
July 24, 2006 4:42:42 AM

Yeah, but it's nice to see something, instead of thinking they were just going to fold up and go away.

BTW, they do video playback pretty well, so they're likely targeting the HTPC market, and I wouldn't be surprised if they do quite well once they learn how to market their wares.
July 24, 2006 4:53:40 AM

EDIT: Proofreading my own post led me to one conclusion: my written expression is poor right now, and I shall retire because some much-needed sleep is in order. I've been extremely busy for the past couple of days... and I missed you guys :lol:  :lol:  :lol: 