X1900XT: Wait for DX10 or buy?

July 10, 2006 7:47:40 AM

So I found an X1900XT for $280.


Should I get it, or wait for DX10?

I currently use an X800 Pro, which I'd keep using in the meantime.


Soon-to-be rig specs:
Conroe E6700 (overclocked)
DFI LANParty UT RD600
2x1GB DDR2-800


July 10, 2006 9:50:15 AM

The X800 is a good card; I'd hang on to it 'til DX10 comes, but only you can make that decision, really.

Also, I'd go for the E6600 Conroe rather than the E6700; it's only about 270MHz slower, a gap you can easily close by overclocking.
July 10, 2006 11:07:35 AM

It also depends on whether you plan to upgrade to Vista, and whether you'd be willing to pay for DX10 if it ends up requiring a subscription. I haven't heard the latest update on that whole fiasco, so if they aren't going to make you pay a subscription for DX10, please disregard.
July 10, 2006 1:27:34 PM

The X800 Pro is good enough to hold you over until the DX10 cards.
July 10, 2006 5:04:12 PM

I've thought of some other options:

Buy the X1900XT now, build a K8L rig in a year and a half, and upgrade to DX10 then.

Buy a good gaming LCD monitor, reuse the X800 Pro, and buy a DX10 card in the summer.

Or just wait for DX10.
July 11, 2006 5:44:32 PM

Actually, I would recommend upgrading a bit, but maybe not to an X1900XT; say, a 256MB X1800XT (around $234 on Newegg), because there are still a lot of high-end games due before DX10 and Vista arrive (such as Medieval II: Total War). Beyond that, you should also try to make your new rig Vista-proof by checking its future compatibility with Vista.
July 11, 2006 7:12:00 PM

What's the major performance difference between the two?
July 12, 2006 11:45:35 AM

With all the recommendations to wait for the DX10 cards, has anyone seen price quotes on them?
July 12, 2006 12:54:18 PM

Quote:
What's the major performance difference between the two?


What you should be asking is: "What is the performance difference between the current cards and the next gen?" It has absolutely nothing to do with DirectX 10, which is a software abstraction layer like OpenGL, not hardware.

Yes, the next-gen cards will perform better; no, this has nothing to do with DX10. This is painfully obvious because you will see the same performance gains through other graphics library interfaces like OpenGL.

Can we please stop this DX10 worshiping? If anything, DX10 may even run slower than DX9 on the same hardware. Remember: as far as NVIDIA/ATI are concerned, DX10 is simply a software driver interface change, and has nothing to do with the hardware.

Do you really think the hardware will be Windows-specific? I don't. It will run on Linux, BSD, etc. Hardware is OS-independent; only the driver changes. So don't expect much in the way of performance increases between DX9 and DX10 running on the same hardware.
July 12, 2006 12:57:51 PM

If he's already going to be spending $234, he might as well spend the extra $46 on the X1900XT.
July 12, 2006 1:44:57 PM

Quote:
Do you really think the hardware will be Windows-specific? I don't. It will run on Linux, BSD, etc. Hardware is OS-independent; only the driver changes. So don't expect much in the way of performance increases between DX9 and DX10 running on the same hardware.


You're half right. It's true that Linux and other open-source OSes that don't use DirectX natively (I emphasize natively) won't see a benefit from DX10 over DX9.

However...

The architecture of the card is designed around whatever renderer it will be using. So it is impossible to update the card's driver or flash its BIOS and expect it to fully support DX10, or, for the sake of argument, OpenGL 3.0 (when/if released), or whatever.

Plus, the newer cards, regardless of the chosen renderers, will be faster than the previous generation anyway - the point is moot.
July 12, 2006 1:54:41 PM

I agree with the guy above me: if you're going to get a card now, get the X1900XT, not the X1800XT.

Whether you want to wait for DX10 or not is completely up to you and how patient you are; there is no single correct choice.

Anyone have any idea whether the DX10 cards will be expensive?
July 12, 2006 2:28:34 PM

True. I think I'll grab a 1900 XT for the next year or two. It's a solid performer, and too much else is up in the air at the moment to sit around and wait. Prices are dropping nicely, so I'm hoping to pick one up around the $300 mark with sales or rebates in the coming months.
July 13, 2006 9:23:09 PM

I'm still using an ATI Radeon 7500 All-In-Wonder, and I will wait for the DX10 series cards (but I guess I'm not much of a hardcore gamer... lol, not with the PC I'm using now :wink: ). I'm personally waiting for the DX10 ATI cards; they will have GDDR4, which will be quite a bit faster than GDDR3 memory and will use less power, though I guess the GPU itself will be a power hog anyway.

BTW, I heard that DX10 cards won't run on Windows XP. Is that true? Will they only work on Vista, or are those just weird rumors?
July 13, 2006 10:01:36 PM

Of course the G80 and R600 will run on XP; without XP/Win2K support you'd lose about 99% of your potential market. Even after Vista launches, that'd still be about 95% of the potential market for the first year.

The thing is, some features will only be exposed under DX10/Vista; in the XP/DX9 environment the G80/R600 will basically act like their siblings on steroids.

With Vista and DX10 they'll gain added features, like clearing up the acne from those steroids! :twisted:
July 13, 2006 10:09:59 PM

Thank you for the information. I was worried that I would have to get Vista (considering I was planning to wait for the DX10 cards), and to be honest I planned to use an OEM version of Windows XP Home Edition to minimize my costs :wink: . So I guess I can build my new system as soon as DX10 cards are available, as I previously planned, considering that Conroe is launching tomorrow (...today at 12:01? OK, I guess that's tomorrow :wink: ). Thanks again for the good info; I was beginning to worry that Vista would have to be the first OS on my first build (almost all new parts, except the HDs, optical drives, case, and monitor, lol) :wink:
July 13, 2006 10:30:52 PM

Just on a side note, illuminatirex: you'll want XP Pro to take advantage of that delicious new dual core you'll be buying.
July 13, 2006 10:35:17 PM

IIRC, the new versions of Home and Media Center have dual-core support.

I never bothered to look into how, but heck, they're selling most of the Core Duo laptops with XP Home, so I would guess they've added support.

I ended up buying XP Pro for that very reason after upgrading from Win2K. But it looks like things may have changed, so it's worth looking into to be sure.
July 13, 2006 10:39:30 PM

Interesting, thanks.

Of course, now that they're including dual-core support in all of their Windows products, the EU will force them to remove it because it's making people think they need a dual-core CPU, thus creating a monopolistic market, which warrants $1+ million in fines every day.

Does anyone actually have a copy of Windows XP-N????
July 13, 2006 10:52:58 PM

Thanks for the info (and to the poster above). I actually didn't think of this, but I think Windows XP Home should support Core 2 Duos just fine, considering quite a few people are using XP Home with the Pentium D 900-series CPUs.
July 13, 2006 11:25:15 PM

OK, here's the skinny on why YOU can use XP Home for dual core, but I still couldn't if I still had my editing rig, which was dual CPU.

Home supports both HyperThreading and dual core on one CPU. You will see it in the performance graphs and get assignability as with HyperThreading, but not the same affinity functions as with multiple CPUs.
(Edit: strike that; supposedly you can set affinity in Home as well, so I'm not sure of the overall difference other than limiting the number of CPUs you can have/control. A small sketch of what setting affinity actually does follows at the end of this post.)

Pro supports HyperThreading, dual core, and multiple CPUs. (Although M$ wants you to get another license for multiple dual-core CPUs. :roll: )

http://download.microsoft.com/download/9/8/f/98f3fe47-d... System Support for Dual-Core

http://www.amd.com/us-en/Weblets/0,,7832_8366_7595~9536...
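To make the affinity talk concrete, here's a minimal sketch (my illustration, not anything from the links above) of what Task Manager's "Set Affinity" does under the hood, assuming Windows and Python with ctypes; GetCurrentProcess and SetProcessAffinityMask are real Win32 calls, the mask values are just examples:

Code:
# Illustration only: "setting CPU affinity" via the Win32 API.
import ctypes

kernel32 = ctypes.windll.kernel32  # Windows only

process = kernel32.GetCurrentProcess()  # pseudo-handle, no cleanup needed

# Each bit selects one logical CPU: 0b01 = CPU 0 only, 0b11 = CPUs 0 and 1.
mask = 0b01
if not kernel32.SetProcessAffinityMask(process, mask):
    raise ctypes.WinError()
print("Process pinned to logical CPU 0")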
July 14, 2006 12:23:20 AM

Thank you for the as-always helpful information. lol, I didn't know M$ wants to do that (the dual-core licensing)... that seems just weird. Thanks for the links.
July 14, 2006 1:09:28 AM

This is partly aimed at the ape, but there are many others who seem to forget this...
Moving to DX10 will be expensive, for many reasons. First, if you want to use DX10, you must upgrade to Vista. Putting a DX10 card in an XP machine will cause it to use DX9. I haven't heard diddly about being able to use "a sprinkling of features that DX10 offers." If you want DX10, you MUST have an OS that can use it.
Second, last I heard, "and it'll use less power" is VERY far from the truth. Read this: http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770 That's right, top-end cards will pull around 300W. To quote the article, "we won’t see an increase in power consumption, rather a decrease for the following generation." More reasons to wait for the second-gen DX10 cards (lower power draw, more DX10 games, and SP1 for Vista).
Third, I don't believe this is true either: "DX10 is simply a software driver interface change, and nothing to do with the hardware." If this were true, would they still be adding a geometry shader? http://www.beyond3d.com/forum/showthread.php?t=25760
Fourth: "the newer cards, regardless of the chosen renderers, will be faster than the previous generation anyway." I don't know how true this is; I would love a link that shows it. I haven't seen any benchies showing the DX10 cards in DX9 mode.
Frankly, it's my second point that seals it for me. 300W to power a high-end card, forcing me to buy a 1kW PSU or a second one, right there makes it too expensive (see the rough power-budget sketch at the end of this post). If you wait for the second-gen cards, you'll have a card that won't eat your wallet while gaming. Vista will be more stable by then, and you might even have a few DX10 games. Buying the first-gen DX10 cards simply doesn't make much sense to me.
(P.S. I know I may have insulted many of the "great" posters on this site; I await your petty name-calling... I know calling people a noob makes you feel so much better, and somehow right.)
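To put rough numbers on the PSU worry, here's a quick back-of-the-envelope budget; every figure below is an illustrative guess, not a vendor spec:

Code:
# Hypothetical system power budget for the "do I need a 1kW PSU?" question.
components_w = {
    "GPU (rumored top-end DX10)": 300,
    "CPU (overclocked Conroe)": 95,
    "Motherboard + RAM": 50,
    "HDDs + optical drives": 30,
    "Fans + misc": 25,
}
total = sum(components_w.values())
headroom = 1.3  # ~30% so the PSU isn't running flat out
print(f"Estimated draw: {total} W")                 # 500 W
print(f"Suggested PSU:  {total * headroom:.0f} W")  # 650 W, not 1 kW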
July 14, 2006 2:28:34 AM

I'm not sure, but isn't the R600 a "semi-second" generation? The Xbox 360 used the first generation.
July 14, 2006 2:48:23 AM

I'm not very familiar with the 360 specs. I believe the chip is based on the X1800 cards, or maybe the X800, but heavily modified. I'm talking about the first-gen DX10 cards, the ones that haven't been released yet and will carry the DX10 title. (Is the 360 DX10?)
July 14, 2006 3:15:07 AM

I read somewhere that the Xbox 360 has an R600 (Xbox version, or whatever it's called) in it, and that the R600 PC chips will be a "second" or "semi-second" generation. As for the 360 being DX10, I honestly doubt it; I'm not much into consoles or console info, so I can't say for sure, but I think it's DX9.
July 15, 2006 4:27:49 AM

Quote:
This is partly aimed at the ape, but there are many others who seem to forget this...
Moving to DX10 will be expensive, for many reasons. First, if you want to use DX10, you must upgrade to Vista. Putting a DX10 card in an XP machine will cause it to use DX9. I haven't heard diddly about being able to use "a sprinkling of features that DX10 offers." If you want DX10, you MUST have an OS that can use it.


Except you forgot that there are different versions of DX9, and what Vista will also have is that DX9 with a sprinkling of DX10 included (expect Direct Physics to be a possible candidate, not exposed before in DX9 but not limited to DX10). But Vista will not be limited to DX9.0c; it gets what has always been referred to as DX9.0L/V, or WGF 1.0.

Quote:
Second, last I heard, "and it'll use less power" is VERY far from the truth. Read this: http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770 That's right, top-end cards will pull around 300W. To quote the article, "we won’t see an increase in power consumption, rather a decrease for the following generation." More reasons to wait for the second-gen DX10 cards (lower power draw, more DX10 games, and SP1 for Vista).


First of all, that's not an unbiased view of the GPU future; that's basically a PR statement by OCZ to sell a specialized VPU PSU! :roll:

Now it can go both ways, for two reasons:
A) The same architecture 'could' use less power, but when pushed all out it could use more.
B) Developments like unified shaders and shared functional units (as recently mentioned about the G80) could reduce power consumption for the whole chip compared to before the changes, but considering ATI's and NVIDIA's penchant for pushing things to the limit, all those gains may mean it's 300W instead of 400W, or 200 instead of 300; it's unlikely they will reduce power consumption by much. At this point the only people who know are under NDA, which stands between that info and people like Anand.

Quote:
Third, I don't believe this is true either: "DX10 is simply a software driver interface change, and nothing to do with the hardware." If this were true, would they still be adding a geometry shader? http://www.beyond3d.com/forum/showthread.php?t=25760


I don't think you understand the context of that statement. DX10 is a software driver interface change (with additional features over the previous version); however, to take advantage of those changes/features you will need new hardware to get the most out of them.

Quote:
Fourth: "the newer cards, regardless of the chosen renderers, will be faster than the previous generation anyway." I don't know how true this is; I would love a link that shows it. I haven't seen any benchies showing the DX10 cards in DX9 mode.


Well, for now you'll just have to trust the IHVs and that they know their products. Both ATI and NVIDIA have stated that their next cards will be faster at DX9 games/benchmarks than current cards, and based on the supposed designs and clock speeds it makes sense. The R600 potentially adds twice as many vertex shaders (if you keep the pixel shaders fixed at 48, which they don't need to be), the G80 also doubles its vertex-engine potential (16 unified units for either geometry or vertex work, which in DX9 would be just vertex), and both will have faster cores and faster memory. So it's actually more of a "well, duh!" statement from the two of them than anything truly surprising.

Quote:
Frankly, it's my second point that seals it for me. 300W to power a high-end card, forcing me to buy a 1kW PSU or a second one, right there makes it too expensive. If you wait for the second-gen cards, you'll have a card that won't eat your wallet while gaming.


So? This person is talking about an X1900XT, and these are cards for enthusiasts, not penny-pinchers. And no one is forcing you to do anything; the Xbox 360 is always there for you. So don't complain about future technology; it's really annoying.

Quote:
Vista will be more stable by then, and you might even have a few DX10 games.


DX10 games are slated to arrive before the second-generation DX10 cards; heck, I wouldn't be surprised if a small number show up before the first generation's refresh.

Quote:
Buying the first-gen DX10 cards simply doesn't make much sense to me.


No one was worried about that. They aren't for you anyway; you weren't invited. :tongue:

Quote:
(P.S. I know I may have insulted many of the "great" posters on this site; I await your petty name-calling... I know calling people a noob makes you feel so much better, and somehow right.)


Actually, a n00b making such uninformed statements doesn't insult people; it's just annoying, because someone has to refute your silly statements and dispel all your myths, which simply confuse other posters. Big mistakes that don't rest on supporting evidence, not even in a theoretical way, just get other people messed up. Like the following:

Quote:
I believe the chip is based on the X1800 cards, or maybe the X800, but heavily modified.


Completely different. Other than the basic concept of VPUs, it's unlike anything before it. Here, edjumikate yourself:
http://www.beyond3d.com/articles/xenos/

See, that information is readily available, and yet you know nothing about it, but you still want to expound on DX10. :roll:
July 15, 2006 6:25:21 AM

"Vista will also have is that DX9 with a sprinkling of DX10 included (expect direct physics to be a possible candidate not exposed before in DX9" Reread what I said. "First, if you want to use DX10, you must upgrade to vista. Putting a DX10 card in an XP machine will cause it to use DX9." I didn't say anything about Vista using DX9L, I said put a DX10 card in XP, and your "stuck" with DX9c.
"First of all that's not an unbiased view of the GPU future, that's basically a PR statement by OCZ to sell a specialized VPU PSU!" Very possible, but the very first sentence in that article says "ATI and NVIDIA have been briefing power supply manufacturers in Taiwan recently about what to expect for next year’s R600 and G80 GPUs." OCZ isn't the only people beefing up their powersupplies. The info also claims to come from ATI and Nvidia, which makes sense. They wouldn't want to spring 300W video cards on the public, only to find out PCP&C's 1KW psu is the only one who could use it...
"in order to take advantage of those changes/features then you will need new hardware to get the most out of it. " Most out of it or any? I know ATI has been simulating DX10 on the x1XXX line of cards. Is it simulating or is it real?
"Well for now you'll just have to trust the IHVs" I admit this is most likely to be true. Few people, Intel being one of them, releases a new architecture thats slower then the older one. I'd still like to see benchies that confirm this, DX10 in an XP machine running DX9 code faster then DX9 in an XP machine, but I'll have to wait till the end of the year. (Q3 at least.)
"these are cards for the enthusiasts not penny pinchers." Granted, but that doesn't mean you need to act like an idiot. X1900XTX now, then upgrade to a second gen DX10 card. (or third, just not the space heating first gen...)
July 16, 2006 2:35:22 AM

Quote:
"to take advantage of those changes/features you will need new hardware to get the most out of them." The most out of them, or any of them? I know ATI has been simulating DX10 on the X1000-series cards. Is it simulating, or is it real?

"these are cards for enthusiasts, not penny-pinchers." Granted, but that doesn't mean you need to act like an idiot. X1900XTX now, then upgrade to a second-gen DX10 card (or third, just not the space-heating first gen...).


Reply to the first part: I think it's a simulation of DX10, so as I mentioned in a different thread, there might be a way to get maybe a little bit of DX10 effects in Win XP (that's what they are doing, are they not? :wink: )

Part 2:
Once again, isn't the Xbox 360 using a "first-gen" version of the architecture that ATI is planning to use (or something similar) in the R600? And regarding the "space-heating first gen": there were a couple of bits of info floating around the net that ATI is redesigning its heatsinks/fans to be more efficient and much quieter, and some rumors that ATI might use some Zalman parts (though I doubt that; it would increase the price of the cards, but on the other hand, we never know what to expect :wink: ). I partially agree with you: the faster the card, the more heat it might generate. Then again, maybe ATI learned something from Intel :wink: and might build something that consumes a bit less juice and gives back more... nah, doubtful (but that would be nice :wink: ).
July 17, 2006 5:27:18 PM

It would be better if you used the quote system here instead of quotation marks; as it is, your sentences are hard to follow.

Quote:
"First, if you want to use DX10, you must upgrade to vista. Putting a DX10 card in an XP machine will cause it to use DX9." I didn't say anything about Vista using DX9L, I said put a DX10 card in XP, and your "stuck" with DX9c.


Unless they release a DX9.0d for XP. We still don't know what other features are to be added, and we don't know whether or not Direct Physics will be Vista-only (which would kill some of the early support from that "pre-installed" base that ATI/Havok and NVIDIA seem to enjoy referring to).

Quote:
Very possible, but the very first sentence in that article says "ATI and NVIDIA have been briefing power supply manufacturers in Taiwan recently about what to expect for next year’s R600 and G80 GPUs."


Yeah, and no direct quote, and no definition of what the cards/combos are. The GF7950GX2 and X1900XTX already brush up against the level they're talking about.

Quote:
OCZ isn't the only one beefing up their power supplies.


No, but they are one of the few making a VPU-only power supply. And the only way to convince people they need one of those is to convince them that the next cards are going to be exponentially more power-hungry than today's, yet if you look at how the power is delivered, it's already maxing out single-rail constraints.

Quote:
The info also claims to come from ATI and NVIDIA, which makes sense: they wouldn't want to spring 300W video cards on the public only to find out PCP&C's 1kW PSU is the only one that could drive them...


PCPower&Cooling's 510 could handle any card out there now; I doubt the 1kW would have any trouble with anything, even a Quad SLI or triple CrossFire-for-physics rig. The reality is both ATI and NVIDIA launch products that are well beyond previous-generation PSUs; however, it's also true that enthusiasts soon find out what works and what doesn't.

Quote:
"in order to take advantage of those changes/features then you will need new hardware to get the most out of it. " Most out of it or any? I know ATI has been simulating DX10 on the x1XXX line of cards. Is it simulating or is it real?


If you know, then you'd know, wouldn't you? :roll:
You answer your own question, really.

As for everything that DX9.0L/WGF1.0 includes: no one knows for sure what will be in DX9.0L other than the M$ flunkies and people under NDA, but the few things that are known indicate it should be better at branch handling and offer greater parallelism. It would also stand to reason that...

Quote:
"Well for now you'll just have to trust the IHVs" I admit this is most likely to be true. Few people, Intel being one of them, releases a new architecture thats slower then the older one. I'd still like to see benchies that confirm this, DX10 in an XP machine running DX9 code faster then DX9 in an XP machine, but I'll have to wait till the end of the year. (Q3 at least.)


Yes, but more realistically: show me how they could be slower when they have more features and higher clock speeds. I'd say the baseline assumption is that they will be faster, and I'd believe otherwise only once shown different. And since we're talking theory, all evidence points to the R600/G80 being faster than the X1900/GF7900.

Quote:
"these are cards for the enthusiasts not penny pinchers." Granted, but that doesn't mean you need to act like an idiot.


You're the one acting like an idiot, complaining about prices. Jeez, seriously, perhaps you need to refresh yourself on the cost of PCs over the years. Complaining about the price of cutting-edge hardware means you really aren't an enthusiast. I might not buy an X1900/GF7950, but I'm not going to complain about their prices. :roll:

Quote:
X1900XTX now, then upgrade to a second-gen DX10 card (or third, just not the space-heating first gen...).


Once again, space heating doesn't really matter. If people cared about power/heat in the enthusiast market, they'd be running VIA chips and talking about how cool their systems run PCMark and 3DMark01, and little else. :roll:
July 17, 2006 5:41:23 PM

Higher speed doesn't necessarily mean higher temps.

Heck, even higher power draw doesn't mean higher temps (just more heat produced). The efficiency of the cooler can negate that, and if a chip produces less internal noise and requires less voltage to overcome that noise, then you can achieve higher speeds with less heat and lower temps.

So it depends a lot on the design, just as the 7900GT is faster (in Hz) than the GF7800GTX but uses less power and generates less heat.

It's a combination of things: transistor count, chip efficiency, and cooling (which includes surface-area dissipation). All of those add up to temperature. Heat is different, but even the total amount of heat added to the universe doesn't depend on speed so much as on the voltage/power required to achieve those speeds by overcoming the chip's internal inefficiencies (which themselves increase with heat and speed).
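To make that concrete: dynamic CMOS power scales roughly as P ≈ C·V²·f, so a higher clock at a lower voltage can still mean less power and heat. A quick sketch with made-up numbers (illustrative only, not actual GF7800/7900 specs):

Code:
def dynamic_power(cap_f, volts, freq_hz):
    """Approximate dynamic switching power in watts: P ~ C * V^2 * f."""
    return cap_f * volts**2 * freq_hz

C = 1.0e-7  # effective switched capacitance in farads, illustrative

old = dynamic_power(C, volts=1.40, freq_hz=430e6)  # "GF7800GTX-like" clocks
new = dynamic_power(C, volts=1.20, freq_hz=550e6)  # "GF7900GT-like" clocks

print(f"old: {old:.1f} W   new: {new:.1f} W")
# old: 84.3 W   new: 79.2 W -> faster clock, lower voltage, less power/heat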
July 17, 2006 6:12:18 PM

How about we all stop whining, keep what we have now, and then buy new things once Vista comes out! That works too!
July 17, 2006 6:23:27 PM

Quote:
It would be better if you used the quote system here instead of quotation marks; as it is, your sentences are hard to follow.

Thanks, I'll work on that.

Quote:
Unless they release a DX9.0d for XP. We still don't know what other features are to be added.


I was referring to things that exist now. MS might not release another version of DX for XP. They might also release a hacked version of DX10 for XP, emulating whatever they can't do due to the differences between XP and Vista. Rather than speculate about what might happen, I prefer to deal as much as possible in the here and now. I have seen this thread dozens of times before, and I still haven't seen anything that makes me think moving to DX10 WHEN IT IS FIRST RELEASED is a good idea. Show me a link where running a DX10 card on XP will be faster than a DX9 card. It's probably faster, but I'd like to know by how much before telling people they should get one.

Quote:
Yeah, and no direct quote, and no definition of what the cards/combos are. The GF7950GX2 and X1900XTX already brush up against the level they're talking about.

Have you read this article? Watts are supposed to be between 170+ and 300W. I would assume the 170W cards are the low end and the 300W cards the high end. I get the feeling I misunderstood the point you're trying to make.

Quote:
No, but they are one of the few making a VPU-only power supply. And the only way to convince people they need one of those is to convince them that the next cards are going to be exponentially more power-hungry than today's.

And why does OCZ think DX10 cards are going to be more power-hungry? Perhaps because ATI/NVIDIA told them?

Quote:
As for everything that DX9.0L/WGF1.0 includes: no one knows for sure what will be in DX9.0L other than the M$ flunkies and people under NDA

Again, who cares? DX9.0L is a Vista thing, not XP. If you want to use DX10, or DX9.0L (the version of DX9 Vista will use), then you have to use Vista. I don't know about you, but I never suggest people upgrade to a first-release MS OS. If you want to, be my guest, but I'm not. I like my OS small (Vista isn't) and stable (Vista: unknown, but I get the feeling it will be more stable than previous first-release OSes).

Quote:
You're the one acting like an idiot, complaining about prices

Where did I complain about prices? We probably have different definitions of the word enthusiast. There are other kinds of enthusiasts than performance enthusiasts.
July 17, 2006 6:25:42 PM

Quote:
How about we all stop whining, keep what we have now, and then buy new things once Vista comes out! That works too!

First, we aren't whining; we're debating/discussing, and there is a difference. Besides, if we did stop, think how boring the forum would become...
July 17, 2006 7:02:32 PM

Quote:
How about we all stop whining, keep what we have now, and then buy new things once Vista comes out! That works too!


The whole point is: why wait for Vista if you can get better performance before then? If the G80 or R600 outperform the X1900/GF7950 in current games, then why not get one then, especially if even the slightly slower versions beat them too?

The point is there's little point in waiting forever, so buy an X1900/GF7900 now, then buy the new gear when it comes out, if that's what you're interested in. But waiting 3-5 months for new hardware when you could be playing on a current rig makes little sense unless you are seriously dollar-limited, and even then, buy a GF7600GT/X1800GTO instead.
July 17, 2006 7:22:02 PM

Quote:
Thank you for the as-always helpful information. lol, I didn't know M$ wants to do that (the dual-core licensing)... that seems just weird. Thanks for the links.


Actually, a lot of companies do that. Some of them (I think it was Oracle) license based on the number of cores rather than the number of CPUs, so two dual-cores require more licenses than two single-cores, etc. Mostly it's for the more expensive corporate software, though...
July 17, 2006 8:31:02 PM

Quote:

I was referring to things that exist now. MS might not release another version of DX for XP. They might also release a hacked version of DX10 for XP,


A hacked version of DX10 that performs exactly like DX10 is impossible by design. Adding features to the current DX9 is possible, but they would also be minimal additions, such as Direct Physics or branch-prediction calls from the host.

Quote:
emulating whatever they can't do due to the differences between XP and Vista.


Possible to some extent, but the main reason DX10 will offer new things to the new cards is shader complexity and performance boosts from better integration, and neither of those can be emulated. The emulation so far has been about features (like geometry shaders), and at a level around 10% of what they should get from hardware. There'll be little or no reason for developers to go that route. ATI went that route in their demo simply to demonstrate some of the future potential, not to show that you can do it now; even they said it would be very restrictive in an in-game environment.

Quote:
Rather than speculate about what might happen, I prefer to deal as much as possible in the here and now.

Yet you talk about space heaters, power-consumption figures, and other things that aren't even based on facts presented by the industry, just a PR clip for a PSU company. C'mon, I'd buy that in another thread, but your previous posts don't back up your statements. You're simply a pessimist, not a realist.

Quote:
I have seen this thread dozens of times before, and I still haven't seen anything that makes me think moving to DX10 WHEN IT IS FIRST RELEASED is a good idea.


Nobody is saying move to DX10; what I'm saying is that moving to DX10 cards, which only offer the additional BONUS of future benefit, is wise if they outperform current cards, which is what has been officially stated and what all the current figures support.

Quote:
Show me a link where running a DX10 card on XP will be faster than a DX9 card.


Show me a link measuring these 300W cards, let alone the space-heater aspect. If anything, previous experience shows that these cards will go down in power consumption and heat relative to their performance. Both will rise, but performance will rise more in relation.

Quote:
It's probably faster, but I'd like to know by how much before telling people they should get one.


No one is forcing anyone to get one, but we are advising that INSTEAD of WAITING for the unknown, you buy a solid card now and then re-invest, because as history has shown, each new generation is faster than the last, even the terrible FX, regardless of DX implementation. (The GF6800 was faster in DX9.0a and then gained bonuses with 9.0c, which is likely exactly how the G80/R600 will act: strong early performance with a bonus later. How big the bonus is depends a lot on the benefits of DX10, but the upfront performance should be easy to determine.) And that's the point: you will know immediately whether it's better to buy a G80/R600 or get a second X1900/GF7900 for DX9 games. Then you revisit it once DX10 ships, but history and the companies' statements support the idea that the R600/G80 will be faster than any current card out there.

Quote:
Have you read this article? Watts are supposed to be between 170+ and 300W.


Yes I have, in fact the day it was released. And we commented on it then too (and on Xbit's similar piece parroting the same statements in their VPU PSU review). Notice that nothing has come from people who aren't trying to make a buck off that comment, which has no context.

Quote:
I would assume the 170W cards are the low end and the 300W cards the high end. I get the feeling I misunderstood the point you're trying to make.


I get the feeling you misunderstood the range. This is not a launch figure of 300W for the top end and 170W for entry level (there's no way the DX10 replacement for the X1300 or X1600 is going to draw 170W!); at best it's a range over the life of the cards. The transistor count, memory amount (with the new, more efficient GDDR4 no less), speed, and process shrink don't add up to more than a doubling of power consumption (a 250%+ increase, in fact, on those figures). So considering the process change and more efficient memory, what you're arguing essentially comes down to a feature increase so large that the performance boost would be spectacular. And then it's like the dollars-versus-pennies argument: if you build it, people will come. Make a 500W card that performs better than Quad SLI and you'd have many interested buyers.

But basing your statements on what at best looks like a misinterpretation of OCZ's PR is just ridiculous, especially since you are trying to refute someone else's statement based on it. So whatever reason you have for defending it, your claim that it "is VERY far from the truth..." is pure rubbish! :roll:

Quote:
And why does OCZ think DX10 cards are going to be more power-hungry? Perhaps because ATI/NVIDIA told them?


Or perhaps they saw another product line they need to promote, expected greater power consumption, and instead NVIDIA threw them a curveball by lowering transistor count and consumption. Also, if you read the reviews, those OCZ units barely handle the amperage required on one rail now with a 100-120W card, so they're hooped! Gotta get people like you worried about upgrading their PSUs: "hey, get this OCZ add-in instead (for only $10 less than that better new PSU you were 'worried' about)."

Next you'll tell me 4GB of RAM is necessary because OCZ said so too.

Quote:
Again, who cares? DX9.0L is a Vista thing, not XP.

However, we're talking about feature sets, and the parts that are DX9.0L ARE the ones that can be added to DX9.0x on XP; the primarily-DX10 features wouldn't be added to DX but emulated, so it is relevant to define which parts are translatable.

Quote:
If you want to use DX10, or DX9.0L (the version of DX9 Vista will use), then you have to use Vista. I don't know about you, but I never suggest people upgrade to a first-release MS OS.


I'm not saying either way, but I'll probably have it loaded because I'll get it free through work. Regardless, whether people should or shouldn't upgrade doesn't influence the hardware. You're getting distracted by the software, when the point is that the DX10 hardware doesn't require the software to work; it just gains the added bonus later, whether you buy on day one or at SP1. And that's all I've advocated from the start.

Quote:
If you want to, be my guest, but I'm not. I like my OS small (Vista isn't) and stable (Vista: unknown, but I get the feeling it will be more stable than previous first-release OSes).


Yadda, yadda! "I like my OS small" is like "I like my cards cheap, low-power, and magical." This discussion is for enthusiasts, and the chance to monkey around with even an unstable new OS is attraction enough for many, and the potential gain for graphics is greater than in any other aspect of this new OS. Office apps, and heck, even things like Premiere, aren't about to get as much of a boost as graphics and game implementations.

Quote:
Where did I complain about prices? We probably have different definitions of the word enthusiast. There are other kinds of enthusiasts than performance enthusiasts.


There is no other kind of enthusiast here; for graphics there are three areas: 3D performance, features, and 2D quality. If you were concerned with any of those, you wouldn't be talking about price and power consumption; you'd be talking about the potential improvements in those areas.

And let's quote you on the money issue;
"forcing me to buy a 1KW psu or a second one, right there makes it to expensive."

An enthusiast is someone who isn't interested in focusing solely on the negative; they are enthusiastic about the future and the new stuff, whether they buy it or just discuss it.

And buying alone doesn't make one an enthusiast either, because the guy who buys a Dell XPS system just because it's the most expensive on the list isn't any more of an enthusiast.

And if your definition of enthusiast differs from that, fine, but that's not the enthusiasts' problem, just the problem of the wannabe-enthusiast n00b who thinks he's in that reserved category.

It's funny, the arguments you make about not buying are the arguments I hear from old ladies about why not to do anything in life: it's too hot, it's too expensive, I'm cold, I'm afraid of the future, I'll wait until everyone else does it. Yeah, sounds like an enthusiast to me. :roll:
July 18, 2006 2:55:40 AM

To be honest, I agree with you, but on the other hand, in my situation I have an oldie Radeon 7500 All-In-Wonder, and I think that building a new system with an R600, which will come out in what, 3 months or so (I might be wrong; I'm not sure about the release date), is a much better deal than getting an X1900XT, staying with it for a year or two, and then getting a DX10 card. By waiting those few months more (and not being able to play some games, which I admit is just sad), I'd be able to play the 2007 games in high quality (or at least medium-high), and I think that's worth the wait. To be honest, I don't see any really interesting titles nowadays, but I know of some truly beautiful games coming in 2007 that are worth waiting for. Looking at today's titles, there are a few nice ones, but the next-gen/newer games will be awesome, so I think it's better to wait those 2-3 months for a new GPU. BTW, it will support SM 4.0, which I'm sure will be quite impressive, and the games with that shader model implemented will be among the first "photorealistic" ones. Games like Half-Life 2 and Far Cry are nice, but comparing Far Cry and Crysis, I think it's worth the wait to get those Crysis effects in-game; that will be the standard by 2008, if not mid-to-late 2007. Even looking at Haze, which is supposed to come out in Q1 '07, we can see how drastic the transition will look.

And back to the point :wink: : I planned to spend about $400 on the GPU, and I was aiming for the X1900XT, but if there will be a GPU (even a midrange one) that beats the X1900XT or even the XTX in gameplay, supports the future Shader Model 4.0, and lets me use the card much, much longer than the X1900XT (without upgrading, and still being happy with it :wink: ), I am willing to wait for it even if it takes 4 months to come out... hopefully it won't take that long. As usual, I think "bang for the buck" has great influence on many people's buying decisions, especially those who don't want to waste their $$ on a product that will soon be the bare minimum (I'm not claiming the X1900XT fits that category; it's one of the best single-GPU solutions to date... from ATI... not counting the XTX edition :wink: ). ...lol, is it me or are the last couple of posts getting longer? (Including mine.) :wink:
July 22, 2006 4:45:58 PM

Quote:
Yet you talk about space heaters, power-consumption figures, and other things that aren't even based on facts presented by the industry, just a PR clip for a PSU company.


Is this still just a PR clip from a PSU company? http://www.tomshardware.com/2006/07/21/the_graphics_sta... Seems to me DX10 might throw some users for a loop.

Quote:
Nobody is saying move to DX10


WHAT??? That was the question in the first post. GameReplays asked:
Quote:
Should I get it, or wait for DX10?

All I'm trying to do is get people to think before they start laying down cash. You could get an X1900 now and upgrade to DX10 after it cuts its teeth, or you can buy a midrange card now and deal with the possibly painful first-gen problems.

Quote:
But basing your statements on what at best looks like a misinterpretation of OCZ's PR is just ridiculous... Or perhaps they saw another product line they need to promote, expected greater power consumption, and instead NVIDIA threw them a curveball by lowering transistor count and consumption.


Actually, odds are good that NVIDIA's power draw will rise more than ATI's. But even after reading that article, I wonder if you'll change your tune. Each new generation has more transistors running faster, but creating less heat and drawing more power, right?

Quote:
However, we're talking about feature sets, and the parts that are DX9.0L ARE the ones that can be added to DX9.0x


I'm going to have to give you this one. Everything I read said Vista was going to use DX9.0L; no one ever said XP could too. It wasn't until I thought about it that I realized XP probably could.

Quote:
just the problem of the wannabe-enthusiast n00b who thinks he's in that reserved category.


I would argue this point with you, but seeing as you've jumped to name-calling, I doubt I could convince you of anything... (You are thinking of the performance enthusiast; I maintain there are others... Never heard of the overclocking enthusiast? He'd be running an 805 with an X1800GTO flashed into an X1800XL, overclocked to 650/650+.)
July 23, 2006 5:37:37 AM

If you don't have the X1900XT (which you should already own), then at least enjoy what you've got. I didn't buy the X800GTO, because I could buy the X1900XT instead. Frankly, I don't care; DirectX 10 doesn't matter right now.

This card is good for another three months?

Even better.

Future GPUs aren't going to be all that much better until dual-core video cards start coming out. ATI all the way. I wouldn't recommend this card to anyone who doesn't need it; the only people who should get it are today's enthusiasts who wish to see the difference. Yeah, you know who you are. Once DirectX 10 comes out and you wish to upgrade, this X1900XT will become a tool for running physics. The future of gaming is streaming multiple video cards into one monitor. For now, if you have an X800GTO, gaming might not be as bad as it was for me until I got the X1900XT.
July 23, 2006 4:29:49 PM

Perhaps I was a bit hasty to say that the future holds dual-core GPUs; I shouldn't believe everything I read. What would two GPU cores mean for us?
July 23, 2006 4:52:43 PM

On topic:

I'm in the exact same position as you. I'm waiting for the DX10 cards but wanted a card to last me until then.

I just upgraded my 7800GT to an X1900XT, and I'm very happy with the decision... particularly in Oblivion.
July 23, 2006 5:24:31 PM

Quote:

Is this still just a PR clip from a PSU company? http://www.tomshardware.com/2006/07/21/the_graphics_sta... Seems to me DX10 might throw some users for a loop.


That's an op-ed piece from the same people who were talking about PS2.0 being an FX feature rather than a 9800 one, and who equated the 9800 to the GF3/4 :roll: ; far from the facts, or the "here and now" you say you want to focus on.

BTW, what were the solid factual points, other than that they removed a craptacular Antec PSU (when even their previous True Blues had trouble with graphics cards back in the R9700/FX5800 era)?

Quote:

WHAT??? That was the question in the first post. GameReplays asked
"should i get it, or wait for DX10?"


And actually that was his question, but the general consensus is to look at the DX10 cards and not worry about DX10 itself right now.

Quote:
All I'm trying to do is get people to think before they start laying down cash. You could get an X1900 now and upgrade to DX10 after it cuts its teeth, or you can buy a midrange card now and deal with the possibly painful first-gen problems.


And the wait isn't contrary to anything I've said about upgrading. I say BUY NOW, then SELL and REBUY when it looks good. That strategy has everything to do with the next-gen DX10 cards, but only with DX10 itself once it arrives; that may add to the motivation to buy/resell, but they are not mutually exclusive events. And like I said, given the specs of the G80/R600, it looks like you get the best of both worlds: fast DX9 now, and then another boost when DX10 comes out. But I am not an advocate of waiting unless something is weeks away, not months!

Quote:
Actually, odds are good that NVIDIA's power draw will rise more than ATI's.


That's what I thought too, until this:
http://www.elitebastards.com/cms/index.php?option=com_c...

Now who knows; maybe they can cut down the transistor count in the G80 like they did in the G71.

Quote:
But even after reading that article, I wonder if you'll change your tune.


Why? It's nothing new to me, nor to anyone here who's followed the industry. Is it getting worse? Yes. Is it anywhere near the level you claimed? No. If we'd gone by the assumptions and reviewer predictions of the FX5800 era, we'd be using this setup by now:
http://img478.imageshack.us/img478/9241/atir9002aod2bv5...

Quote:
Each new generation has more transistors running faster, but creating less heat and drawing more power, right?


More transistors drawing more power? Not the G71, which had fewer than the G70 and drew less power, yet outperformed it:
http://www.xbitlabs.com/articles/video/display/evga-790...
All three (transistors, power, heat) were reduced from G70 to G71, and comparing the GF6600GT to the GF7600GT, it happened yet again:
http://www.xbitlabs.com/articles/video/display/powercol...
http://www.xbitlabs.com/articles/video/display/gpu-cons...

BTW, to be as anally technical as you are being: how do you presume to draw more power without creating more heat? Almost all power in a chip is converted to heat (plus a minor amount of radiation, harmonics, etc., less than 1%); what differs is the ability to dissipate that heat.

Now those are hard facts. How you interpret them for future benefits is what matters.

Quote:
I would argue this point with you, but seeing as you've jumped to name-calling, I doubt I could convince you of anything...


"JUMPED" to name-calling? :roll: SOooo quickly you forget your own post putting yourself in such a position and earning such a moniker:
"(P.S. I know I may have insulted many of the 'great' posters on this site; I await your petty name-calling... I know calling people a noob makes you feel so much better, and somehow right.)"

Perhaps consider that before posting such tripe next time.

Quote:
(You are thinking of the performance enthusiast;


Damn right. The poster is asking about an X1900XT; that's not a midrange gamer, and anyone asking about DX10 as the alternative is not talking about the replacement for the X1600/GF7600.

Quote:
I maintain there are others...


I know; I'm one of them. But in this discussion you're talking about two absolutes of the high-end variety, and your argument against them surely isn't focused on the mainstream being 100-300W.

Quote:
Never heard of the overclocking enthusiast?


Sure, I believe I've fit that bill myself, and I know people who fit it better now. But do you understand that overclocking can only take you so far, and that power consumption goes up with overclocking? Also, I don't make recommendations based on overclocking alone, especially not to anyone who needs to come here to ask basic questions. People who should flash and then overclock don't need to ask these questions.

Quote:
He'd be running an 805 with an X1800GTO flashed into an X1800XL, overclocked to 650/650+.)


Which doesn't change a single argument here.
July 23, 2006 5:39:58 PM

Quote:
Perhaps I was a bit hasty to say that the future holds dual-core GPUs; I shouldn't believe everything I read. What would two GPU cores mean for us?


People need to stop thinking of CPUs and VPUs in the same way.

It would likely help you understand why dual "core/die" VPUs, while "possible", are both improbable and functionally useless if you think of a pipeline as a core. VPUs are massively parallel, and as such you are better off making a 32-pipeline chip than putting two 16-pipeline dies/cores in one functional package.

The only time it starts making sense is when surface area becomes an issue: as you increase functional units, and thus transistor count, within the same process, your surface area grows rapidly, and at some point (likely not too far off) it becomes cheaper, more energy-efficient, and better for heat dissipation to have two 400M-transistor dies than one BIG 800M-transistor chip. That's when it becomes a decision of whether to put them on the same package (dual die/core) or to put two VPUs on a single PCB like a Gemini card. (A toy yield calculation below shows why.)
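Here's that toy yield calculation (my numbers, purely illustrative) of why two smaller dies can be cheaper than one big one: defect yield falls roughly exponentially with die area under a simple Poisson model.

Code:
import math

DEFECT_DENSITY = 0.5  # defects per cm^2, illustrative
WAFER_AREA = 706.9    # cm^2, roughly a 300mm wafer

def cost_per_good_die(die_area_cm2, wafer_cost=5000.0):
    """Wafer cost spread over the dies that pass (edge losses ignored)."""
    dies = WAFER_AREA / die_area_cm2
    yield_frac = math.exp(-DEFECT_DENSITY * die_area_cm2)  # Poisson yield
    return wafer_cost / (dies * yield_frac)

big = cost_per_good_die(4.0)        # one 4 cm^2 die
small = 2 * cost_per_good_die(2.0)  # two 2 cm^2 dies, same total logic

print(f"one big die:    ${big:.0f}")    # ~ $209
print(f"two small dies: ${small:.0f}")  # ~ $77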
July 23, 2006 5:43:22 PM

I bow to your greatness, sir. Obviously your mind is made up, and no amount of reading will change it. OCZ, other PSU makers, and Tom's Hardware are just making these DX10 power requirements up. I'm glad you said buy now and upgrade "when it looks good." That is good advice, and it doesn't even relate to DX10.

Quote:
"JUMPED" to name-calling? SOooo quickly you forget your own post putting yourself in such a position and earning such a moniker:
"(P.S. I know I may have insulted many of the 'great' posters on this site; I await your petty name-calling... I know calling people a noob makes you feel so much better, and somehow right.)"


I don't see where I called anyone a noob, but I can point out where you did with ease. I make it a point not to call people names; it makes one look immature and doesn't help anything. Frankly, I don't want to be dragged down to your level and then beaten with the amount of experience you have... (OK, one last parting shot; I'm done. This thread is obviously going nowhere.)
July 23, 2006 6:38:30 PM

Quote:
I bow to your greatness, sir. Obviously your mind is made up, and no amount of reading will change it.


MY mind gets made up by published facts and compelling arguments; I've seen neither so far, and history goes completely against the need for a 1kW PSU up front for the R600/G80 (for their dual-GPU refreshes in 2008, sure, maybe), but not for the cards we're talking about.

Quote:
OCZ, other PSU makers, and Tom's Hardware are just making these DX10 power requirements up.


Yes, it's called guessing, aka prognosticating, and I have nothing against it except when it's used as FUD and fearmongering, like you and OCZ are doing.

Quote:
I'm glad you said buy now and upgrade "when it looks good." That is good advice, and it doesn't even relate to DX10.


Well, actually it does, and here's why the discussion framed as "DX10" versus "DX10 cards" (instead of G80/R600) gets confusing for most people.
DX10 itself relates to the upgrade path, as do the DX10 cards. "When it looks good" may be when those new cards come out and the G80/R600 spank the X1950XTX/GF7950GX2 so badly that for the price you'd want to sell your old card and your grandmother to get your hands on one; or, initially, the G80/R600 aren't impressive, but once Vista/DX10 ship, all the new functionality (even just as DX9.0L, if not brought to DX9.0x) suddenly delivers that granny-selling performance. In both situations, voila, you have the "when it looks good", and both also involve either the DX10 cards or DX10 itself. So IMNSHO it DOES relate to DX10, even if at second blush.

Quote:
I don't see where I called anyone a noob, but I can point out where you did with ease. I make it a point not to call people names; it makes one look immature and doesn't help anything. Frankly, I don't want to be dragged down to your level and then beaten with the amount of experience you have... (OK, one last parting shot; I'm done. This thread is obviously going nowhere.)


Well, you brought it down the dead-end road. In reality the thread was pretty much dead and over before you came in with some 'tude and decided to go off-road with your wild FUD, which is based on a lot of misinformation.

And to call you a n00b is factual, more so than your arguments. I don't go around calling n00bs n00bs, but when you jump into the bull ring wearing red, running around like a knob, you draw attention to yourself, and when it's FUD you're spreading, you deserve what you get, for the betterment of people who don't know better and would start delaying their purchases and worrying about 1.21 gigawatts to power the flux capacitors in their PCs.
July 23, 2006 11:18:54 PM

Not sure if this has already been speculated on, but does anyone have a guesstimate of the performance advantage, in percent, of one of the DX10 cards (G80 or R600) over the flagship DX9 cards? I understand this might be difficult to come up with given the limited info, but I'm trying to budget for a new rig in Dec.-Jan. and am not sure whether I should just stay with my current setup and wait until the 2nd-gen DX10 cards come out.

If the performance advantage is, say, about the same as two 6800s in SLI vs. a 7800 GTX, I'd be better served holding off, right?

Note: I'm already taking into account the added features and compatibility with DX10 on the new cards...
July 24, 2006 3:41:10 AM

No one knows, other than "they will be better than the current top DX9 cards", and I'm not even sure whether that includes the GX2, considering some people's view of it as a "single card". If anything, I'd say the G80 is likely close to the GX2; the R600 is far too hard to predict because it is a radical departure from current designs, but if ATI says it's their fastest DX9 solution, I have a feeling it won't be close enough to be up for debate, and likely a jump at least as significant as X1800 to X1900, before DX10 even arrives to give it any additional help.

But the reality is the only people who know for sure have long NDAs preventing them from saying a peep.