
Small rant: DirectX 9 everywhere, oh wow...

March 7, 2003 10:02:46 PM

That was sarcasm btw... :wink:

Ok, I really do like nVidia's huge push for DX9 at $79. I admit this is a huge change from the previous MX disasters, and I have more respect for nVidia after seeing this surge in competition. It is indeed turning out more interesting.

But... I ask myself, just as I did during the DirectX 8 saga: what's the point?
We're still surrounded by monkey programmers who keep churning out CPU-intensive games!
Back when the GeForce3 Ti came out, the Ti200 got all the hype because it would bring DirectX 8 to the mainstream. Yeah, and um, where's the real deal? How effective is it? How do we know? We hear that GPUs and shaders take a huge load off the CPU, nearly negating its use. Then suddenly we don't see that; we see people incapable of programming properly. Aquanox was just about the only game I ever respected for making a Ti200 worth buying for DX8.
Part of what I am ranting about comes from the GeForce4 MX series, which did so well without DX8. A simply overclocked MX460 could match the Ti200 in every benchmark, and sometimes beat it.
So I have to ask myself: DirectX 9 comes to even the lowest of the low end... so what?
Yeah, we have the technology in the card, but how do we know it's effective? What exactly does it enable? Old DX8 cards can already render such effects as long as a pixel shader is present (water effects were not possible in Morrowind unless a DX8 card's pixel shader was there and enabled). I just don't see the point of a low-end card having DX9 unless DX9 brings optimizations powerful enough that a DX8 card clocked much higher than that low-end card still loses to it, simply because DX9 is that powerful. That includes a game's fancy DX9 effects still letting this low-performance card hold a constant 30+ FPS at high quality settings. Why do I say this? Because they tout DX9 as simply being there; now give it a purpose! I mean, why the heck would I equip a biplane with Boeing-class engine technology if, even though it can fly, the rest of the plane loses handling because it was never designed for such speeds?
That's what I see in DX9 on low-end cards. And aside from that, few game devs program their shaders properly so that performance stays dependent on the graphics card; and if a game must be CPU-dependent, let that dependence start above 2.53GHz, because the majority of users don't have such CPUs, usually sitting in the 1.4 to 2GHz range, and they definitely don't want to buy that shiny Radeon 9700 PRO only to find it won't go over 30FPS in a game because it's CPU-bound!
So:
-First, make sure your game is NOT CPU-dependent. Aquanox is the best and most obvious example god could ever ask of us. (A crude test for this is sketched below.)
-THEN, once that is done, properly code and optimize the DX9 path, using the shaders as heavily as possible, so that the boost is clearly visible against a DX8 card, and so that DX9 cards, even at $79 and signifying cheap performance, will be UP THERE thanks to the DX9 optimizations.
Otherwise, what's the point, if Doom III requires such a strong card and can enhance quality on DX9 ones, but that low-end FX5200 Ultra can't even output 25 frames WITH DX9?
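
A crude way to check that first point, by the way: run the same scene at two resolutions and compare frame rates. A minimal C++ sketch; RenderFrame is a hypothetical stand-in for an engine's per-frame work, not any real API:

#include <chrono>
#include <cstdio>

// Hypothetical stand-in for an engine's per-frame work (game logic +
// draw submission). Not a real API; it's here so the sketch compiles.
void RenderFrame(int width, int height) { (void)width; (void)height; }

// Average FPS over a fixed number of frames at one resolution.
double MeasureFps(int width, int height, int frames = 200) {
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < frames; ++i)
        RenderFrame(width, height);
    std::chrono::duration<double> elapsed =
        std::chrono::steady_clock::now() - start;
    return frames / elapsed.count();
}

int main() {
    double heavy = MeasureFps(1600, 1200); // ~6x the pixels of 640x480
    double light = MeasureFps(640, 480);
    // If FPS barely improves when the pixel load drops ~6x, the GPU was
    // never the bottleneck: the game is CPU-dependent.
    if (light < heavy * 1.2)
        std::printf("CPU-dependent: %.0f vs %.0f FPS\n", heavy, light);
    else
        std::printf("GPU-dependent: %.0f vs %.0f FPS\n", heavy, light);
}

If a six-fold drop in pixel count barely moves the FPS, the graphics card was never the limit; the game logic is.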

Thank you for bearing with me on this rant; I've long wanted to get it out. I've been going with the flow too long now, getting all excited that "DX x technology in card y is coming, oh man, we can now benefit!"

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:


March 7, 2003 10:31:20 PM

A little correction: I don't think of Aquanox as a game. It's only used for benchmarking; I haven't seen anybody actually play Aquanox. Somebody may play it for a few minutes, like the interactive game demo in 3DMark 2001.

Submit your opinion: Should Tom Fire Omid? (http://forumz.tomshardware.com/community/modules.php?na...)
March 7, 2003 11:53:09 PM

Quote:
A little correction: I don't think of Aquanox as a game. It's only used for benchmarking; I haven't seen anybody actually play Aquanox. Somebody may play it for a few minutes, like the interactive game demo in 3DMark 2001.

I think you got something wrong there: Aquanox 1 and 2 really are games, not benchmarks. Maybe you don't like them, but they still remain games... of course, there is also the popular benchmark based on the Aquanox engine, which is only available to the press...
March 8, 2003 12:32:23 AM

To really take full advantage of DX9, one needs a pretty powerful video card, with lots of pixel pipelines and whatnot, to make use of all the goodies DX9 has to offer. I don't see the point in putting DX9 on lower-end video cards if they lack the hardware to adequately run DX9 applications.
I see it like this: Joe Sixpack, knowing a little about computers, goes and buys a GF 5200 because it has DX9. He looks at the system requirements of Doom 3, sees DX9, and says, HA, my card will blow that game away. Takes it home, and can't understand why he's only getting 10fps.
To me, it seems like another marketing ploy for low-end cards, much like AGP 8x. DX9 is going to be great for gamers who have a Radeon 9500 or higher, but the average person who buys a GF 5200 probably isn't concerned about playing Doom 3.
March 8, 2003 1:36:29 AM

Actually, I thought DX 9.0 was supposed to make coding more efficient, so that graphics cards wouldn't have to spend as many cycles rendering the same effects. In fact, I thought that was 75% of the benefit of DX 9.0.


<-----Insert witty sig line here.
March 8, 2003 1:58:38 AM

Well, PS 1.4 was exactly that, I can tell you that much.
DX9 uses PS 1.4 as a subset, but how is that efficient if the DX9 label MUST mean you are using at least one PS 2.0 shader and not just PS 1.4? In other words, you're still on DX8.1 if you stick to that efficient shader version. (Which, I admit, was a very dramatic advancement for a small .1 version jump. Benchmarks have shown a large 40% per-clock boost in the Battle of Proxycon test on the R9700 PRO, which shows the advent of efficient multitexturing; correct me if I'm wrong on that last one, though.)

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
March 8, 2003 8:31:33 AM

to twitch: no. it doesn't make coding more efficient or give the gpu less work.


dx9 has tons of new features nobody really knows yet. pixel shader 2.0 (and i can't wait for 3.0 :D) is amazingly powerful; displacement mapping, vertex shader 2.0, tessellation, and much more is in this api, designed to run directly on hw. those features are powerful, and they are free to use. there's just one problem: if you use them now, nearly nobody can run them. see 3dmark03. you know why they put so much dx8.1 into it? because otherwise nobody would have been able to run it except owners of r300 chips.
it looks like both nvidia and ati are now trying to make dx9 cards cheap, so everyone can replace their old card with a dx9 card. that is very important, because if you don't get more than 1000 points in 3dmark03, you won't have a chance in dx9 games.
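
for what it's worth, this tiering is exactly what a d3d9 game has to do at startup: query the device caps and pick a shader path. a minimal sketch against the real d3d9 caps interface (error handling omitted; the hardware labels in the printfs are just this thread's examples, not anything the api reports):

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main() {
    // create the d3d9 interface and query the primary adapter's caps
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // pick the best pixel shader path the hardware exposes
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("dx9 path: ps_2_0 (r300 / nv30 class)\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        std::printf("dx8.1 path: ps_1_4 (radeon 8500 class)\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        std::printf("dx8 path: ps_1_1 (geforce3/4 ti class)\n");
    else
        std::printf("fixed-function fallback (geforce4 mx class)\n");

    d3d->Release();
    return 0;
}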

dx9 is very powerful, but it requires dedicated hardware. as long as this hardware is not standard in every home, there will not be games for it. here at my home, it is. and "touching" dx9 is really awesome. things never looked that beautiful before, never that detailed, never that organic...

ever wondered why there are only stupid FPS games, indoors with some stupid walls and boxes? without dx9, drawing natural-acting and natural-looking nature (grass, plants, ice, water, fire, fog, mutating huge alien blobs) was practically impossible.

i always thought: hm, radeons, ps1.4, great... but is it great? then i downloaded the comparison demo (treasure chest) from ati, where you can see pixel shaders 1.1, 1.2, 1.3, and 1.4, and _WOW_, 1.4 is really impressive. extremely detailed-looking surfaces. now we are at 2.0, and it blows 1.4 away :D

yes, cinematic gaming will come. not yet, but dx9 is the first big step towards it. it's not about how you program for it, or how fast it is. dx9 defines what features the hardware has to support, and the features it asks for are beautiful.

and then there will be opengl 2.0... and nobody will care about dx9 :D (yes, there already is opengl 2.0 hw, and yes, this summer we will touch gl2.0. it will have all the features of dx9 but none of dx9's restrictions. gl2 will define the standard for the next 10 years and change the developer world. remember, gl2 runs on windows, linux, mac, and everywhere else...)

"take a look around" - limp bizkit

www.google.com
March 8, 2003 11:41:10 AM

Quote:
ever wondered why there are only stupid FPS games, indoors with some stupid walls and boxes? without dx9, drawing natural-acting and natural-looking nature (grass, plants, ice, water, fire, fog, mutating huge alien blobs) was practically impossible.

Dave, I forget. Do you have graphics coding as sort of a hobby, or what?
If so, do you have any self-made demos to show us?

I'm starting to feel like a real computer consultant.
March 8, 2003 10:51:20 PM

can we actually wait till the cards have benchmarks before we start slamming them, for starters?

As far as putting DX9 on cards... I see games using PIECES of the DX9 hardware. A lot of games use pieces of the DX8 hardware, and I see people going around saying "GAMES DON'T EVEN USE DIRECTX 8!!" Well, who's to say the card won't run Doom III with a few extra DX9 effects and still keep a constant 30 fps?

I remember all the "The GeForce FX will crush ATi's card" crap last year; now look at it. I agree nVidia's first release is bad, but there is always a possibility they can make up for it -_-

"What kind of idiot are you?"
"I don't know, what kinds are there?"
March 9, 2003 8:09:24 PM

One thing I fear: the R9500, performing worse than a Ti4600 in normal modes, will automatically go down in Doom III. Remember, the Ti4600 was described by Carmack as performing "ok". If the R9500 PRO is below its performance, I am afraid it may not make many owners happy, no?

Now sure, Doom III will use DX9, but last I checked, BARELY any of it. So my question to you is: how DO YOU think it will fare?

I do take the PS1.4 inclusion and the DX9 tech into account.
I definitely won't buy the card if I learn it can't run Doom III like the R9700 PRO can, if that one is considered the minimum.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
March 9, 2003 11:13:06 PM

The 'minimum' will likely be along the lines of an 8500 or 9000 and a GF3 Ti. I think iD is going to ensure that it plays 'minimally' on the majority of cards out there, and considering that the majority own GF2/MX cards, I would think the 9000 and GF3 will sort of be the 'minimum', while 'recommended' may end up being a GF4. I doubt the GF4 would be preferable to a 9500 PRO or even a non-pro, but then again I don't know exactly how the optimization might run. I would think that if you want to turn on ANY features, the R9500+ will be the cards to go with, as I've read reviews saying D]|[ will use PS1.4 primarily for the advanced stuff.
Anywhooo, we gots'ta wait, but I'm satisfied that the 9600 I'll get will at least run the 'BASICS', even if a lot of the settings are at default or low. And WooHoo, no power connector.

- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN :tongue: GA to SK
March 10, 2003 3:03:15 AM

Freelancer requires DirectX 9. I dunno if it's for the new graphical features or not, though.
March 10, 2003 4:34:23 AM

Eden: the reason the Ti4600 does "ok" in Doom 3 is that its geometry engine is roughly half the speed of the R300's (9500 PRO, 9700). Doom no longer relies on fillrate (where the 9500 PRO is a little slower than the Ti4600); it relies on detailed lighting effects and heavy geometry. this is where the 9500 PRO is so much more powerful than the Ti4600... there are just no games that take advantage of it yet. i'm sure of it...


and Spitfire, Aquanox is a game. where do you get your facts, man?
March 10, 2003 4:38:53 AM

and in regards to DX9 being here: i totally agree


most games don't take full advantage of DX7 effects yet!! show me a game that extensively uses environment bump mapping, i mean where you can see it well and it really makes the game look good, because EBM looks totally incredible and most cards render it fast. my Radeon 7200 can, at 1024x768x32, at over 110fps in 3DMark2001


i think they need to get things down pat in DX8 before going on to DX9.

or maybe this fast progression is good? maybe we will see DX10 in 6 months, with optimizations that surpass DX9 in speed and visual quality... if you think about the difference between DX7 and 9, it's quite amazing.
March 10, 2003 10:52:12 AM

>>One thing I fear: the R9500, performing worse than a Ti4600 in normal modes, will automatically go down in Doom III.<<
definitely not. geforces suck in doom3 compared to what radeons can do. the slower radeon 8500 can just about catch up in doom3 because of ps1.4, and the r300-based cards rock in doom3, that's a fact. carmack uses every feature of every card, since he uses opengl with all extensions and codes for the best of each card. the gf4 will be able to run doom3 okay, r300 chips will run it acceptably, and the newest cards will play it at really high res. doom3 takes advantage of all new hw.

sure, you can't run it like you could on a 9700pro, as the card is slower. but you can run it better (meaning qualitatively and quantitatively better) than on a gf4.

"take a look around" - limp bizkit

www.google.com
March 10, 2003 10:57:11 AM

Freelancer doesn't require DX9 as a minimum; my roommate's been playing it for a few days (the demo, I think) on his 9000 NON-pro, and even after seeing it I wouldn't really make it a priority title. I'm sure some will find it entertaining.
I think Freelancer may be a DX9 'enhanced' or enabled game, but it's not required. You just may need to install the DX9 runtime so the game knows what to do with the DX9 code, even though it doesn't really use it.

- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN :tongue: GA to SK
March 10, 2003 11:02:22 AM

>>most games don't take full advantage of DX7 effects yet!! show me a game that extensively uses environment bump mapping.<<
problem: the gf1/gf2 are called dx7 cards but just can't do env bump mapping. and since they were the market-leading, best-selling cards, the effort of additionally supporting env bump in a game, for a bit of visual gain on hw nearly no one had, just wasn't worth it.

the main trick is: dx9 makes complex effects very easy, which was not the case for dx7, and only partially for dx8 (and people first had to learn how dx8 was designed, which was very new for just about every game dev). now game devs are starting to understand dx8, and they can move directly to dx9, which is very similar to use, just with fewer restrictions than dx8 had; all the features got more advanced, evolved, mature.
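
a big part of that "very easy" is HLSL, which dx9 introduced: you write C-like shader code and let the runtime compile it for a target profile, instead of hand-writing dx8 shader assembly. a rough sketch using the dx9 sdk's d3dx helper library; the shader itself is a made-up toy:

#include <d3dx9.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3dx9.lib")

// a toy per-pixel effect in HLSL. under dx8 this would have to be
// hand-written ps_1_x assembly, one variant per shader version.
static const char* kSource =
    "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    return float4(uv, 0.5f, 1.0f);\n"
    "}\n";

int main() {
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;
    // compile the source for the ps_2_0 profile; retargeting other
    // hardware is just a different profile string ("ps_1_4", "vs_2_0", ...)
    HRESULT hr = D3DXCompileShader(kSource, (UINT)strlen(kSource),
                                   NULL, NULL, "main", "ps_2_0",
                                   0, &code, &errors, NULL);
    if (FAILED(hr)) {
        if (errors)
            std::printf("compile failed: %s\n",
                        (const char*)errors->GetBufferPointer());
        return 1;
    }
    std::printf("compiled %lu bytes of ps_2_0 bytecode\n",
                (unsigned long)code->GetBufferSize());
    code->Release();
    return 0;
}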

and dx is never about speed, people. hw and drivers are about speed. dx is about features.

"take a look around" - limp bizkit

www.google.com
March 10, 2003 11:05:00 AM

I disagree. DX9 has been a long time coming, and part of that wait has kept the GPU makers from bringing out more DX9-compliant cards. I don't need games to use ALL the bells and whistles in DX version x before the next version comes out; 'Game X finally used the last of the features in DX7, time to bring out DX8'? No. It's a feature set, and some of the DX9 features will even benefit people with DX7-only compliant cards. I understand what you are saying, but think of DX as the road. I'd prefer they build roads that handle cars going 400 mph even if those cars aren't here yet, rather than wait for the roads once the hardware arrives. I don't want to wait much longer than I already do to take advantage of the features in the hardware I just spent a pretty shiny new penny on. IMO.
There used to be a DX release about every 6-9 months. It's been what, 2 years since the release of DX8?

- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN :tongue: GA to SK
March 10, 2003 11:05:45 AM

Quote:
the reason the Ti4600 does "ok" in Doom 3 is that its geometry engine is roughly half the speed of the R300's (9500 PRO, 9700). Doom no longer relies on fillrate (where the 9500 PRO is a little slower than the Ti4600); it relies on detailed lighting effects and heavy geometry. this is where the 9500 PRO is so much more powerful than the Ti4600... there are just no games that take advantage of it yet. i'm sure of it...


no. doom3 does heavily rely on fillrate, and only fillrate. but the power of the 9500's pixel shader capabilities makes it possible for carmack to do the same lighting effects in far fewer passes, and each pass costs a full screen of pixels => tons of the limited fillrate. that's why a 9500 will win over the gf4, the same as in 3dmark03. ps1.4 hw is much more powerful than ps1.1-1.3 hw, and ps2.0 hw is much more powerful still. the gf4 is definitely outdated in these modern areas.
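
putting rough numbers on that: every pass re-touches every pixel on screen, so the pass count divides straight into the fillrate ceiling. a back-of-the-envelope sketch (the fillrate, overdraw, and pass counts are illustrative assumptions, not measured doom3 figures):

#include <cstdio>

int main() {
    const double pixels_per_pass = 1024.0 * 768.0; // one full screen
    const double fillrate = 1.2e9; // ~gf4 ti class, pixels/sec (assumed)
    const double overdraw = 2.0;   // average overdraw per pass (assumed)

    // ps_1_1 hw needs many passes per lighting effect; ps_1_4/2.0 can
    // collapse the same math into far fewer passes.
    const int pass_counts[] = {12, 6, 3};
    for (int passes : pass_counts) {
        double fps_ceiling = fillrate / (pixels_per_pass * overdraw * passes);
        std::printf("%2d passes -> ~%.0f fps fillrate ceiling\n",
                    passes, fps_ceiling);
    }
    return 0;
}

halve the passes and the fillrate ceiling doubles, which is the whole ps1.4 advantage in one line.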

"take a look around" - limp bizkit

www.google.com
March 10, 2003 7:11:20 PM

Lemme ask you: in those scores from nVidia showing the Ti4600 at what, 30FPS, and the R9700 PRO at 45FPS, in nvDemos which stress cards, where do you see the R9500 PRO slotting in?
Also, seeing as it's a stress test, what average frame rate do you think the R9500 PRO would actually get, after GUESSING its performance delta from the R9700 PRO? I realize only the Doom III alpha is out there, and you seem to barely hit over 30FPS on your system (a Celeron-based P4, maybe?), so I wonder what we can expect from these DX9 cards in the actual release (which will likely perform up to 50% better in the final version). I think this is where it matters, in fact: whether these cards were equipped with this technology for a purpose, or just to draw nice graphics and absolutely suck when it comes to performance, because the DX9 saga is in fact much more demanding than the current cards can handle.

Also, speaking of which, seeing that a Quadro's extended OpenGL capabilities can lead to very advanced OpenGL performance leads me to believe that, as you said, cards are barely using their potential at all. A 12GB/sec-bandwidth Quadro FX can show a significant performance boost over the late 980XGL, and if we compare it to its FX5800 Ultra sibling, we can see that it's all about untapped potential. I don't know if the Quadro or any professional OpenGL card performs the same in OpenGL games, so that games like Doom III or Quake III would see astronomical frame rates. If that is the case, it proves it's all a matter of drivers being tweaked for the programming platform. I wish that were the case for Direct3D as well; it would yield over twice the FPS in games simply because it's now a "professional" card.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
March 10, 2003 7:34:48 PM

hm, i get more than 30fps... depends on the scene... but my bottleneck is always the ram (you can see my hd light flashing like a stroboscope in a trance disco with hard trance playing :D). i only have 256mb of ddr ram, and the demo uses up to a gig of ram just for the music (uncompressed wav, 6 at a time for dolby surround... that's huge :D)

at the moment my hd is not blinking, and it's very very fast... too bad that's nearly never the case :D

and actually, the demo was not yet coded for r300 chips; it essentially runs in 8500 mode... so it can only get (much) better.

and the better image quality is reason number one anyway (you never get specular highlights on a gf4 as impressive as the ones on my card...)

"take a look around" - limp bizkit

www.google.com
March 10, 2003 7:54:45 PM

Ouch man, 256 is damn low. Under WinXP I even feel 512 being limiting in new games like Unreal II, given how bloated WinXP is with memory-eating services and programs.

What about your CPU? You said you run a cheap Celeron P4, right?
And hey, if you could buy an R300, you can spring for an extremely cheap 256MB stick now!

Also, where did you get music? I tried the alpha and had no music whatsoever, only sounds.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
March 11, 2003 7:42:57 AM

Quote:
Ouch man, 256 is damn low. Under WinXP I even feel 512 being limiting in new games like Unreal II, given how bloated WinXP is with memory-eating services and programs.

running win2000, but xp works as well; it uses only 64mb of ram by default, the rest is for me, so it's okay to work with. unreal2 runs very well too...

Quote:
What about your CPU? You said you run a cheap Celeron P4, right?

yes, and i want a p4 3.06gig with ht :D problem is, i don't seem to have any 2nd-level cache, according to all the different benchmarkers/system checkers... that's boring :D
Quote:
And hey, if you could buy an R300, you can spring for an extremely cheap 256MB stick now!

they are not extremely cheap here, and i waited 3 years for the r300, so for me it was not just another card but quite a big thing. the next thing for me is an xpc from shuttle... i've had to take the pc over to my friend's too often these last days so we had music and videos there... i wanna have an xpc now...

Quote:
Also, where did you get music? I tried the alpha and had no music whatsoever, only sounds

well, yes... i meant sounds :D but so what? they eat up tons of ram... when i delete them everything runs much much better :D

"take a look around" - limp bizkit

www.google.com
March 11, 2003 6:46:17 PM

The Celeron Pentium 4 has 128KB of L2, a small FYI.

Hmm, I didn't know you were actually holding out before buying a new card. I assume you saved up a lot for it, and are not exactly the kind that gets a lot of side income for computer parts every month!

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
March 12, 2003 8:27:41 AM

i know it has 128k, it just doesn't... show up... anywhere (dunno if it even uses it :D)

nope, definitely not one to buy a new pc each month. i lived with a p3 500mhz and a gf2mx for about 4 or 5 years, don't remember exactly, and i actually only had that because the "media markt" had an anniversary and sold some p3 500 systems for nearly nothing, with everything included: usb, a 17" screen, an ati rage 128 (and i still have it, and [-peep-] it works and rocks :D never had problems with it :D), onboard sound, etc. the pc itself was actually crap though; the shitty mobo etc. messed everything up by quite a bit...

now i've upgraded, together with the radeon 9700 pro, which finally came out (the card i've waited for since the announcement of the gf1, because the stuff nvidia talked about back then was the stuff i wanted, but only today can the radeon 9700 really deliver it for the first time).
256mb of ddr ram, a cheap sis mainboard, a celeron 2gig (hey, it's a knocked-down p4... and i can overclock it! hehe, it runs at 2.34gig), a dvd-rom from an old friend, the cheapest cd burner in the shop, and, of course, the radeon. around the same time my hd [-peep-] up, so i bought a new one too... this time 60gb.

no, dude, i'm definitely not the each-month-a-new-pc type...

my next dream is an xpc with an opengl 2.0 compatible gpu in it (that would be the r400, i think...) and a p4 with hyperthreading...

"take a look around" - limp bizkit

www.google.com
March 12, 2003 8:00:31 PM

Geeezz, wtf is up with you guys? What do you expect, no new technology ever released for any vid cards? How are programmers going to code for DX9 if DX9 is not on any card? The way you guys bitch about new technology such as DX9 and how it's not useful makes you sound like a bunch of crybabies. If it were up to you guys, we'd still be programming in DOS!

If you don't like how it performs, then DON'T BUY IT! It's that simple. Sitting here all day whining about how no game uses DX9 is stupid. I look forward to new technology and don't look back; you should do the same.


Jeff


March 13, 2003 12:17:21 AM

your L2 cache isn't showing up? for one, make sure it's enabled in the BIOS (you probably already have).
for another, go to this windows xp tweaking guide 2: http://www.tweaktown.com/document.php?dType=guide&dId=1...
i know you're running win2k, but the L2 cache fix might work for you as well, depending on how similar the registries are. it's something to check out
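
(the fix that guide describes is the old SecondLevelDataCache value under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management, which exists on win2k too. here's a read-only C++/Win32 sketch to see what's currently set; whether changing it actually helps is another story:)

#include <windows.h>
#include <cstdio>

int main() {
    HKEY key;
    const char* path = "SYSTEM\\CurrentControlSet\\Control\\"
                       "Session Manager\\Memory Management";
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, path, 0, KEY_READ, &key)
            != ERROR_SUCCESS)
        return 1;

    DWORD value = 0, type = 0, size = sizeof(value);
    // 0 means "let the OS figure the cache size out itself"
    if (RegQueryValueExA(key, "SecondLevelDataCache", NULL, &type,
                         (LPBYTE)&value, &size) == ERROR_SUCCESS)
        std::printf("SecondLevelDataCache = %lu KB\n", (unsigned long)value);
    RegCloseKey(key);
    return 0;
}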

--------------
I LOVE DANGER DEN WATERCOOLING, they went out of their way to both personalize my kit and change my order when i needed to; i had to change my sig to give them props
March 13, 2003 12:33:33 AM

You seem to have really jumped ahead and read between the lines, because the point of my main post was definitely not that DX9 isn't available.

Damn, it's like whatever I say never gets through to anyone who protests like this. Please, READ through it once more.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
March 13, 2003 12:36:24 AM

One more question, and a thing I have noticed: the R300 truly is not optimized or running efficiently when it comes to vertex shader performance. In reality, games that use vertex shaders should see the card stand at least 50% above the Ti4600.
Think about it, it has 4 vertex shader units! One proof I have is the 3DMark01 vertex shader test: you can see a huge gap between it and the Ti4600. IIRC, SharkMark by Matrox also showed such a huge lead. Then I look at the vertex shader test in 3DMark03 and feel like the card runs like crap in it. Do you think it is not using its 4 vertex shader units properly, or not at full capacity?
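
(To put a number on what those 4 units should mean, here's a back-of-the-envelope sketch. The clock is the R9700 PRO's; the cycles-per-vertex figure is purely an assumption, since real throughput depends on shader length and setup limits.)

#include <cstdio>

int main() {
    const double core_clock_hz = 325e6;  // R9700 PRO core clock
    const int vs_units = 4;              // vertex shader units on the R300
    const double cycles_per_vertex = 10; // assumed average shader cost

    double verts_per_sec = core_clock_hz * vs_units / cycles_per_vertex;
    std::printf("~%.0f million vertices/sec theoretical peak\n",
                verts_per_sec / 1e6);
    // if a vertex shader benchmark lands far below this, either the units
    // sit idle or the test is bottlenecked elsewhere (CPU, triangle setup,
    // memory bandwidth).
    return 0;
}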

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
March 13, 2003 8:45:07 AM

hm, great, i'll try it when i'm at home; let's see if it helps... wish me luck

thanks

"take a look around" - limp bizkit

www.google.com