ATI 2900XT 1GB and Half-Life 2 not DX10?

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
The good news, I'm getting unbelievable frame rates in Half Life 2 Episode 2 with EVERYTHING maxed out at 1920 x 1080 with 4XAA and 8XAF -- texture detail is amazing.

However, the bad news: the DX drop-down shows DX9+ and it is grayed out?? Is the game only available in DX9, not DX10, for ATI 2900XT 1GB cards? I'm confused; I was pretty sure Half Life 2 Episode 2 was DX10. It does look amazing (better than the Crysis MP Beta) and the frame rate is dead solid at 60 fps (Vsync locked). I'm getting high-quality shadows, the flashlight is working fine (contrary to review reports), haze, light glow from the gun, reflections, etc., but NO motion blur (hmmm...this is OK since I don't like motion blur, but it might indicate DX9). It feels like a DX10 title, so I'm hoping the graphics options showing DX9+ is just a mistake?? Then again, I have a hard time distinguishing between DX9c and DX10 titles anyway.

Finally, a title that seems to work better on an ATI card than an nVidia one -- a first for me. I was expecting the card to start chunking on the open scenes, but it didn't miss a beat: rock solid 60 fps. More importantly, it proves that performance can be had with these 2900XT 1GB cards. Then again, if this is all just DX9 stuff, this is just another DX9 title working well on a DX10 card.

Anyone else notice this? Is this due to running widescreen 16:9 ??

(running Vista x64 Cat 7.10)
 

rayzor

Distinguished
Apr 24, 2004
353
0
18,790
I am pretty sure the Source engine is still only DX9-capable. And I find it incredibly hard to believe that the Source engine is superior in image quality to CryENGINE 2. Also, ATI has always had a significant edge in Source games, so that could explain something. Of course, my 8800 GTX can run it with everything completely maxed at ultra-high res too...
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Half-Life 2 is still strictly DirectX 9, and doesn't even require Shader Model 3.0 for its effects.
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
Unfortunately for all of us, HL2 and its add-ons are "only" DX9.0c at best.

But if history proves anything, this is a good thing. Valve wasn't first to market with a DX9 engine, but when they arrived, their engine was more efficient than anything else available at the time, and that's almost still the case. Look at those graphics and the fps at high resolution and you'll understand ;).

I think Valve is most probably developing a DX10 engine right now, one that's efficient as well as good-looking. They might pull it out for a future Half-Life 3 (who knows) that will take the game to a new level from where it is now. You know, one big release followed by numerous episodes.

That might just be wishful thinking, but dreaming costs nothing... does it???

Anyway, for now, the engine is DX9.0c and nothing more. Porting it to DX10 would require too many modifications. They'd be better off starting anew if they want something as efficient without being too big a download.

My 2 cents.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
Bizarre, my font size changed at Tom's and only Tom's web site -- switched to micro point...grrrr -- can't get the font size back...

Anyway, I'll post some screenshots of both Crysis and Half-Life -- on ATI, Half-Life 2 is considerably "prettier" and faster, while nVidia looks better and performs better in the Crysis MP Beta.

But after some reading I see it "supports some features of DX10, but isn't actually DX10" -- interesting.

http://enthusiast.hardocp.com/article.html?art=MTQwNCwxLCxoZW50aHVzaWFzdA==

So yeah, this is not a DX10 title, which would explain why the 2900XT 1GB is working so well and outperforming my nVidia 8800GTX card. Dang, it looks very impressive for a DX9c title, certainly better than any DX10 title I've tried.

One good title running better on ATI does NOT make for a mass exodus, but it is proof of concept that the card does have muscle. I'm beginning to think the real struggle in the video card industry is how well nVidia or ATI does at getting in bed with the game developers to showcase their technology. This is the first new game I've seen recently that displays the "Best played on ATI" logo.

Anyway, chalk one up for ATI.

 

stemnin

Distinguished
Dec 28, 2006
1,450
0
19,280
I can't make any sense of what you're saying versus all the benchies I've seen comparing the XT to the GTX. Motion blur is not a DX10-only thing... Xbox 360 titles have it, for ---- sakes. I sure hope that a 1GB video card that costs double my 320MB card plays at least 100 fps (the minimum I get) in the 3-year-old Source engine.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
Xbox 360 -- not the same, similar but not really the same. Many of the new DX10 titles are introducing motion blur (over-introducing it) because the hardware and new DX10 functions lend themselves well to accelerating that process/FX. Motion blur can be done in DX9c (actually there is no limit to what you can do if you want a slide show) but isn't found in many PC titles because there is limited accelerated support for it.
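(For the curious: the classic pre-DX10 approach is an accumulation buffer -- blend the last few frames together, weighted toward the newest. Here's a rough sketch of that idea in plain Python; the frames, weights, and decay value are made-up illustration numbers, not anything Valve or Crytek actually uses.)

```python
def accumulate_motion_blur(frames, decay=0.6):
    """Blend a history of frames (newest first) into one blurred frame.

    Each older frame contributes decay**age of its weight -- the same
    per-pixel blend a DX9-style accumulation buffer does on the GPU,
    done here on plain lists purely to show the idea.
    """
    weights = [decay ** age for age in range(len(frames))]
    total = sum(weights)
    blurred = [0.0] * len(frames[0])
    for frame, weight in zip(frames, weights):
        for i, pixel in enumerate(frame):
            blurred[i] += (weight / total) * pixel
    return blurred

# Three one-row grayscale "frames", newest first: a bright object
# that was darker in earlier frames smears toward its old value.
print(accumulate_motion_blur([[1.0, 1.0], [0.5, 0.5], [0.0, 0.0]]))
# each pixel ends up around 0.66 -- a blend of new and old
```

The cost is exactly why it's rare on the PC side: every stored frame is another full-screen read and blend, which is cheap dedicated work on a console but eats fill rate without hardware help.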

Not sure what you are saying either; where did cost come into play?? This isn't a thread about cost and bang for the buck (why do so many folks always divert a thread down this road??) -- it was never mentioned at all. So please, no need for you to justify your purchase/video card.

It's one title (and as far as I can tell, the only title) that is running faster on my 2900XT 1GB card than on my nVidia 8800GTX 768MB card. But the original comment was about HL2 Ep 2 still being DX9c and doing a VERY good job with DX9c -- on a visual level I think it is far better than the Crysis MP Beta -- but that is still beta.

What has surprised me is that this is proof the 2900XT can indeed show some amazing muscle with pristine visual quality -- but it does indeed appear that ATI must have worked very closely with Valve to make it that way, just as nVidia works very closely with other game companies. It almost seems like a battle behind the scenes -- as in, who can convince the game developers to code in such a way that it leverages XYZ's (XYZ being nVidia or ATI) hardware.

I just find it interesting that it can be done. Seeing how nVidia's next gen graphics card is going to be closer in design and specifications to ATI current gen, there might be more to this story as time progresses.

Now, what I would like to see in the future is graphics boxes external to the PC with a high-speed link (which I believe has been prototyped already) -- switching vendors would be as easy as selecting the external graphics box as primary for 3D games/apps (via a Control Panel). I believe preliminary DX11 specs already have the necessary support for this. That way we could use whichever "solution" (ATI or nVidia) works best with the game title one wants to play... now that would be cool.
 

hannibal

Distinguished


So we would only have to have two high-end graphics cards... nice...
It would be like having two separate game consoles: Half-Life 5 is only playable on ATI and Doom 5 only on nVidia, if it goes to the extreme. But you have a point! The GPU maker that has more support from the game engine makers is the winner! And because each engine supports one or the other, the loser is the user...

Seems that UT3 is again a more ATI-favored game... alongside many nVidia titles. So nVidia has an advantage because of its earlier DX10 card release (or a better PR division toward the game houses!)

http://www.anandtech.com/video/showdoc.aspx?i=3128&p=4

 

stemnin

Distinguished
Dec 28, 2006
1,450
0
19,280
The Source engine prefers ATI cards. I wasn't trying to compare on bang for the buck; I was trying to say there's no muscle to flex on a 3-year-old engine (even with the Ep2 engine modifications). It only requires a DX8 card, lol.
 
Yeah, I have a Sapphire 2900XT 512MB and it played Episode 2 well..... twice. As for DX10, no. Although for this installment Source dropped DX8.1 support, so now only DX9.0 and higher is supported. This game was fantastic; I loved it, and the battle at the end is an awesome heart-pounding affair. Portal is cool too.

Anyone want to bet that you get the Portal gun from Portal in HL2 Episode 3 when you reach the Borealis? My reasoning (and maybe many of you have figured this out as well): in Portal you are in the Aperture Science labs using the Portal gun. Who did Magnusson work for? (Aperture Labs) And the Borealis was one of Aperture Labs' projects. Seems to me the two will come together next episode. So we can expect not only physics puzzles but portal puzzles as well, or maybe even the two combined :bounce: . Of course this is a little off topic, sorry, just got a little excited.
 


Not without work it didn't; initially it favoured nV:
http://www.xbitlabs.com/articles/video/display/radeon-hd-2900-games_7.html#sect0

then after driver updates, the story changed;
http://www.xbitlabs.com/articles/video/display/diamond-viper-hd2900xt-1024mb_12.html#sect0

I wasn't trying to compare on a bang for the buck, I was trying to say there's no muscle to flex on a 3 year old engine (even with ep2 engine modifications). It only requires a DX8 card lol.

Doesn't matter what it requires; that it's still as good-looking as it is using SM2.0, without even really needing SM3.0, shows it had more than enough muscle at launch. People love to talk about how it's not SM3.0 with OpenEXR HDR, but really it still holds its own against many games that require SM3.0 as a minimum. That it lets you play on a Radeon 8500 / GF4 Ti is pretty impressive, not something to be laughed at (as if any of those 'SM3.0 only' games couldn't do it with SM2.0 :heink: ).
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
Hannibal,

Unless Microsoft makes some serious changes to the DX qualification rules -- but they seem unwilling to do that beyond a superficial level, and ATI/nVidia don't seem to want it either -- only the end consumer wants it, and we all know what happens to the end consumer... they're, well... at the end. ;)

Of course, two PCs could be another option... or maybe even Matrox... so three PCs (joking). But seriously, make the graphics hardware a separate plug-and-play unit -- considering the cooling and power requirements of current video cards it would seem like a better solution; the challenge is just the physics of making a high-speed bus over the specified length.

Stemnin,

DX9, no DX8 support. But does that matter? It's the best game so far (including all DX10 titles) in terms of graphics FX and beauty, and most importantly it has very good frame rates (like I said, solid at 60 fps for me with everything cranked up at 1920 x 1080).
 
Yep, I'd love to see a lasso-type external PCIe solution, but like you I'm curious about the latency issues over the cable length.

It'd also be awesome for laptop owners though, so I could buy a slim Penryn-based quad and have an external GF100/R700.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
Multiple fiber channels perhaps?

I know this is catching on with high-end DAWs, where audio processing is being offloaded to dedicated hardware in real time using a "lightpipe" -- latency is a big, big issue with audio, especially when you have so many stages in the audio processing chain.

Just think of the overclocking possibilities for an external GPU...
 
HL2: E2 may not have DX10 support but it really doesn't need it. DX9.0c still has some great uses and Valve has shown that.

The Source Engine is very impressive for it being 3 years old and with the upgrades you can really see the difference. If you compare the Vortagauns in Episode 2 to the ones in Episode 1 or HL2 they are much better looking.

Not to even mention the physics effects are on par with what I have seen from Crysis. Especially when you get the lvl 2 gravity gun. I love just ripping the consoles off the walls and hurling them into Combine.

Cryengine2 may have DX10 and SM3.0 but there are still some things that Source has that beats Crysis in. For one the facial details are so far unmatched in HL2. I swear that some characters look so real and the facial expressions such as smiling, frowning or shock look great.

I love that I was able to run HL2 even up to Episode 2 on a 4 year old computer. I'm talking P4 3.4EE, 2GB DDR, AGP 8x X850XT. And running it at high resolutions with everything maxed out. Crysis can't do that. Even the minimum are past most of that.

Either way I don't see a need for Source engine to run on DX10. I wouldn't mind it as it would add even more graphic wise to an already great looking game. It could even make the G-Man creepier than he already is.
 


Yep, but I think that's a ways off. I'm not sure the audio latency demand is even near the requirements we're talking about here (ns vs ms). I haven't kept up with my photonics reading enough to know the current fastest signal processors, but I'd think the double conversion would likely kill a lot of the benefits, since we're worried about 2 meters of travel, not 200+ m of travel, where the benefit of fibre channel would start to outweigh copper latency, if that. That's the biggest barrier now: all the conversion points. I'd have to think about it more, but I'd guess you'd have to increase burst width to make up for it, and on its own remove some latency issues on the card itself by doing more locally -- which is sort of the opposite direction from desktops, IMO, which try to unify resources. Doesn't seem practical yet.
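(Quick back-of-envelope on the cable part: signals in copper or fibre both travel at roughly two-thirds of c, so the wire itself is a non-issue at desktop distances -- it really is the conversion points that dominate. The velocity factor below is a typical assumed value, not a measured one.)

```python
# One-way propagation delay over a cable, ignoring all conversion,
# serialization, and protocol overhead (which is the real cost).
C = 3.0e8               # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.67  # typical for both copper and optical fibre

def propagation_delay_ns(distance_m):
    """Nanoseconds for a signal to traverse distance_m of cable."""
    return distance_m / (C * VELOCITY_FACTOR) * 1e9

for d in (2, 200):
    print(f"{d:>4} m of cable -> ~{propagation_delay_ns(d):.0f} ns one way")
# 2 m is ~10 ns of flight time; 200 m is about a microsecond
```

So at 2 m the flight time is negligible next to the electrical-optical conversion at each end, which is exactly why the double conversion would eat the benefit at desktop lengths.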

Just think of the overclocking possibilities for an external GPU...

Yeah, especially for those of us up north here. I could just pop my VPU out the window in -50 temps, have the display cable pop back into my monitor, and voila: 50% core & VRAM boost. :sol:
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Half-Life 2's lighting engine is complete **** in comparison to Crysis' dynamic lighting with volumetric soft shadows. Hell, even the new shadow effect they added with the flashlight doesn't come close, because there is still always a shadow on the ground under characters (wtf?) and the flashlight is the only light that can make objects cast dynamic shadows. That being said, Half-Life 2: Episode 2 does run very well, and the graphics are still pretty good for the performance it provides in comparison to many newer titles.
 

cpburns

Distinguished
Aug 28, 2006
239
0
18,680
I forget where the links are, but I read that Valve has refused to go to DX10 for the foreseeable future; they're happy on DX9. They also bashed the PS3 in the same article. So no DX10 engine in the works for a little while more.

Also, as far as I know, Valve's monthly Steam hardware survey is still the most effective and widespread effort of its kind. Through the data collected, Valve has determined that the majority of their players still don't have DX10-capable systems -- I mean, we're talking something like 2.5% that do. The most common card was something like the GeForce 6600 GT. So they're designing their engine with relevant data in mind; they're really designing it for the people who are going to be playing it.
 

hannibal

Distinguished


Yep... the cooling effect would be one thing to consider. I myself have an external sound card (for electrical reasons...). The bad thing is that it's not so easy to move the computer, and I think external graphics boxes are not going to be very popular, so the price will be high...
But you are definitely right about the advantages of modularity and cooling. Only the future will tell.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
HeyYou,

You lost me there: because one adds another light source (the flashlight), why would another light source (the sun) go away? That doesn't happen in the real world. In the real world you have multiple shadows from multiple sources, and they interact with each other.

Hopefully the Crysis final version (Nov 16th) will be better than it currently is -- because right now I'd say graphically the Crysis MP Beta is a 7 out of 10 compared to HL2 at a 9 out of 10. Factor in frame rates and the Crysis MP Beta is about a 4 out of 10 compared to HL2 at 10 out of 10.

As for the external GPU -- agreed, I think width "might" be the solution. But on the DAW front, especially when working with 5.1 or 7.1 surround, you do need ns latency. For basic audio work, sure, 7-10 ms is OK, but when you add in multiple buses each with 4-7 FX processors, 32+ tracks, and then virtual instruments on top, you soon find a need for ns latency, hence why "lightpipe" external audio processors exist. Sure, this is very high end, but so would a graphics solution like this be.
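(For a sense of scale on why chains hurt: every buffered processing stage adds buffer_frames / sample_rate of delay, and those delays stack across the signal path. The buffer size and stage count below are made-up example numbers, not from any particular DAW.)

```python
def stage_latency_ms(buffer_frames, sample_rate_hz):
    """Delay one buffered processing stage adds, in milliseconds."""
    return buffer_frames / sample_rate_hz * 1000

# One 256-frame buffer at 44.1 kHz is already several milliseconds...
per_stage = stage_latency_ms(256, 44_100)

# ...so a path through input, a few FX buses, a virtual instrument,
# and output piles those delays on top of each other.
stages = 6
print(f"per stage: {per_stage:.1f} ms, "
      f"chain of {stages}: {stages * per_stage:.1f} ms")
# roughly 5.8 ms per stage, ~35 ms across the chain
```

That stacking is the reason dedicated offload hardware exists: every buffered hop you remove from the chain buys the whole budget back.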

Converters would be a big latency introducer, unless of course we get to a point where conversion is no longer needed -- now that is probably way out in the future, 10-20 years, maybe more.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
No matter the scene, characters will still have the same shadow directly under them; it's as if they each have a light right above them. The method Valve uses is vastly inferior to Crytek's, and most developers wouldn't use it with their "Next-Gen" games.

Half-Life 2: Episode 2
http://img248.imageshack.us/img248/3439/hl22007101913304870sz0.jpg
http://img91.imageshack.us/img91/8871/hl22007101913352659il3.jpg
http://img337.imageshack.us/img337/9428/hl22007101913342887gt7.jpg
http://img256.imageshack.us/img256/7861/hl22007101913320382zb4.jpg

Crysis MP Beta:
http://img86.imageshack.us/img86/2299/crysis2007101913490990iq6.jpg
http://img153.imageshack.us/img153/3134/crysis2007101913492071sq2.jpg
http://img504.imageshack.us/img504/1098/crysis2007101913503771hj8.jpg
http://img148.imageshack.us/img148/6109/crysis2007101913512590cs9.jpg

Even though Half-Life 2: Episode 2 isn't groundbreaking on a graphical level, I still love the game.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
I see what you're saying, and I have to agree with that. But on a texture level, the Crysis MP Beta appears washed out to me. And the chickens and frogs don't add much to the experience other than reducing frame rates even more.

But if you really wanna pick on details, Crytek's "wind" isn't really wind -- or at least I don't see any real-world emulation there, just stuff moving back and forth -- no real wind direction or speed.

 
If you want wind effects or any sort of weather effects, then Oblivion is great for those. I remember when it had a major thunderstorm and the trees were swaying together and the rain was being thrown in various directions. It was crazy, really.

I for one would hope that Crysis in DX10 would have better shadows, since DX10 enables higher-resolution shadows. But even without DX10, HL2 has a lot of things Crysis does not -- the way people react when you shine a flashlight in their faces, for one. Oh, and Crysis will never have the G-Man. No one will.
 
You miss V8V's point: you can make stuff look kinda windy, but even in Oblivion the wind is still not wind. I love Oblivion, but if either game were able to do true wind effects -- increased resistance against your walking, changed arcs for arrows or bullets, objects blown over, debris falling with the wind -- that would be awesome. It would also require the power of a PPU or VPU physics, IMO.

Anywhoo, I think that level of realism is still a long way off.
 

Rabidpeanut

Distinguished
Dec 14, 2005
922
0
18,980
The fact that they still leave wind out of games makes me laugh. Total Annihilation had wind, and that was in 1997 -- and it had a physics engine, and it was true 3D on a bmp sneakily disguised as a texture. Old games were so much better. And along came Source and said "first ever physics, w007 li3z".

I have not had any problems with HL2 Ep2, which looks much better than Oblivion (badly made, badly textured, and just generally bad if you actually get closer than a meter to a wall). The snow is not even volume-dependent, it is just a bad overlay, and so is the rain.

Crysis will definitely be a lot better looking; the beta is just the beta. Crysis is going to feel real, while Half-Life 2 Episode 2 feels like a smoother Half-Life 2, and thus a less real Half-Life 2. The floors are not nearly as well textured as those in Crysis, but overall it is very well polished, and for the low requirements it looks pretty damn good.

As far as I can see, Crysis will take a LOT more power to run, because in HL2 Ep2 you get a LOT, and I mean a LOT, of object fade -- like the leafless dead trees disappearing at 50 meters or so, whereas Crysis keeps your trees (or turns them into crappy flat textures) at a much larger distance. They were pretty sneaky with Half-Life: they let you go outside, but it is always in a valley and you never get to see very far. At least you don't need to buy a new PC for it, but it's also only about 3.5 hours long, so I think Crysis will wipe the floor with it.

Also, if I EVER get denied a Combine sniper rifle EVER again, I will personally march down to Steam and kick all of the developers in the nuts. HL2DM is getting very old very fast, and Q3 is dead, so I need a new multiplayer game; I am hoping Crysis will fill this multiplayer void.

Oh, I think you are having problems with your card because it is the 1GB version; it gets even less attention than the 512, which is quite a lack of attention already. And if you think you are missing out on the motion blur, you might just not be seeing it, because it is very slight (and thus not intrusive, and thus hard to see unless you flick your mouse). I hope they tweak that in Crysis multi.