ATI 2900XT 1GB and half life 2 not DX10?

Last response: in Graphics & Displays
October 18, 2007 3:52:05 PM

The good news, I'm getting unbelievable frame rates in Half Life 2 Episode 2 with EVERYTHING maxed out at 1920 x 1080 with 4XAA and 8XAF -- texture detail is amazing.

However, the bad news: the DX drop-down shows DX9+ and it is grayed out?? Is the game only available in DX9, not DX10, for ATI 2900XT 1GB cards? I'm confused; I was pretty sure Half Life 2 Episode 2 was DX10. It does look amazing (better than Crysis MP Beta) and the frame rate is dead solid at 60 fps (Vsync locked) -- I'm getting high-quality shadows, the flashlight is working fine (contrary to review reports), haze, light glow from the gun, reflections, etc., but NO motion blur (hmmm...this is OK since I don't like motion blur, but that might indicate DX9) -- it feels like a DX10 title, so I'm hoping the graphics options showing DX9+ is just a mistake?? But then again, I have a hard time distinguishing between DX9c and DX10 titles anyway.

Finally, a title that seems to work better on an ATI card than nVidia -- it's a first. I was expecting the card to start chunking on the open scenes, but it didn't miss a beat, rock solid 60 fps. But more importantly, it proves that performance can be had with these 2900XT 1GB cards. But again, if this is all just DX9 stuff then this is just another DX9 title working well on a DX10 card.

Anyone else notice this? Is this due to running widescreen 16:9 ??

(running Vista x64 Cat 7.10)
October 18, 2007 4:10:49 PM

I am pretty sure that the Source engine is still only DX9-capable. And I find it incredibly hard to believe that the Source engine is superior in image quality to CryEngine2. Also, ATI has always had a significant edge in Source games, so that could explain something. Of course, my 8800 GTX can run it with everything completely maxed at ultra-high res too...
October 18, 2007 4:19:10 PM

V8VENOM said:
The good news, I'm getting unbelievable frame rates in Half Life 2 Episode 2 with EVERYTHING maxed out at 1920 x 1080 with 4XAA and 8XAF -- texture detail is amazing.

However, the bad news: the DX drop-down shows DX9+ and it is grayed out?? Is the game only available in DX9, not DX10, for ATI 2900XT 1GB cards? I'm confused; I was pretty sure Half Life 2 Episode 2 was DX10. It does look amazing (better than Crysis MP Beta) and the frame rate is dead solid at 60 fps (Vsync locked) -- I'm getting high-quality shadows, the flashlight is working fine (contrary to review reports), haze, light glow from the gun, reflections, etc., but NO motion blur (hmmm...this is OK since I don't like motion blur, but that might indicate DX9) -- it feels like a DX10 title, so I'm hoping the graphics options showing DX9+ is just a mistake?? But then again, I have a hard time distinguishing between DX9c and DX10 titles anyway.

Finally, a title that seems to work better on an ATI card than nVidia -- it's a first. I was expecting the card to start chunking on the open scenes, but it didn't miss a beat, rock solid 60 fps. But more importantly, it proves that performance can be had with these 2900XT 1GB cards. But again, if this is all just DX9 stuff then this is just another DX9 title working well on a DX10 card.

Anyone else notice this? Is this due to running widescreen 16:9 ??

(running Vista x64 Cat 7.10)
Half-Life 2 is still strictly DirectX 9, and doesn't even require Shader Model 3.0 for its effects.
October 18, 2007 4:31:44 PM

Unfortunately for all of us, HL2 and its add-ons are "only" DX9.0c at best.

But if history proves right, this is a good thing. Valve weren't the first to market with a DX9 engine, but when they arrived, their engine was more efficient than anything else available at the time, and that is almost still the case. Look at those graphics and the frame rates at high resolution and you'll understand ;) .

I think Valve is most probably developing a DX10 engine right now, one that is efficient as well as good-looking. They might pull it out for a future Half-Life 3 (who knows) that will take the game to a whole new level. You know, one big release followed by numerous episodes.

That might just be wishful thinking, but dreaming costs nothing... does it???

Anyway, for now, the engine is DX9.0c and nothing more. Bringing it to DX10 would require too many modifications. They'd be better off starting anew if they want something as efficient without it being too big a download.

My 2 cents.
October 18, 2007 4:34:39 PM

Bizarre, my font size changed at Tom's and only Tom's web site -- switched to micro point...grrrr -- can't get the font size back...

Anyway, I'll post some screenshots of both Crysis and Half Life -- under ATI, Half Life 2 is considerably "prettier" and faster -- I know nVidia looks better and performs better under Crysis MP Beta.

But after some reading I see it "supports some features of DX10, but isn't actually DX10" -- interesting.

http://enthusiast.hardocp.com/article.html?art=MTQwNCwx...

So yeah, this is not a DX10 title, which would explain why the 2900XT 1GB is working so well and outperforming my nVidia 8800GTX card. Dang, it looks very impressive for a DX9c title, certainly better than any DX10 title I've tried.

One good title running better on ATI does NOT make for a mass exodus, but it is proof of concept that the card does have muscle -- I'm beginning to think the real struggle in the video card industry is how well nVidia or ATI do at getting in bed with the game developers to showcase their technology. This is the first new game I've seen recently that displays the "Best played on ATI" logo.

Anyway, chalk one up for ATI.

October 18, 2007 4:49:49 PM

I can't make any sense of what you're saying versus all the benchies I've seen comparing the XT to the GTX. Motion blur is not a DX10-only thing.. XBOX360 titles have it, for ---- sakes. I sure hope to whoever that a 1GB video card that costs double my 320MB card plays at least 100 fps (the minimum I get) in the 3-year-old Source engine.
October 18, 2007 6:19:31 PM

XBOX360 -- not the same; similar, but not really the same. Many of the new DX10 titles are introducing motion blur (over-introducing it) because the hardware and new DX10 functions lend themselves well to accelerating this process/FX. Motion blur can be done in DX9c (actually there is no limit to what you can do if you want a slide show) but isn't found in many PC titles because there is limited accelerated support for it.
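
To make the "you can do it in DX9c if you accept a slide show" point concrete, here is a minimal sketch of the classic accumulation-style blur (plain Python with made-up pixel values, purely for illustration): every frame kept in the history is one more full-screen blend per pixel, which is exactly the cost that hurt without dedicated acceleration.

```python
def accumulation_blur(frames, decay=0.5):
    """Blend a history of per-pixel values, newest weighted heaviest.

    Each extra frame in the history means one more full-screen
    read/blend per pixel -- the cost that made unaccelerated
    DX9-era motion blur so expensive.
    """
    weights = [decay ** i for i in range(len(frames))]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Weighted sum across the frame history, pixel by pixel.
    return [
        sum(w * frame[i] for w, frame in zip(weights, frames))
        for i in range(len(frames[0]))
    ]

# Toy single-pixel "frames": newest is white (255), previous is black (0).
blurred = accumulation_blur([[255.0], [0.0]])  # newest dominates: ~170
```

Double the history length for a longer blur trail and you double the blending work, which is why hardware help matters.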

Not sure what you are saying either -- where did cost come into play?? This isn't a thread about cost and bang for the buck (why do so many folks always divert a thread down this road??) -- it was never mentioned at all. So please, no need to justify your purchase/video card.

It's one title (and as far as I can tell) the only title that is running faster on my 2900XT 1GB card over my nVidia 8800GTX 768MB card. But the original comment was about HL2 ep 2 still being DX9c and doing a VERY good job with DX9c -- on a visual level I think it is far better than Crysis MP Beta -- but that is still beta.

What has surprised me is that this is proof that the 2900XT can indeed show some amazing muscle with pristine visual quality -- but it does indeed appear that ATI must have worked very closely with Valve to make it that way, just as nVidia works very closely with other gaming companies. It almost seems like this is a battle behind the scenes -- as in, who can convince the game developers to code in such a way that it leverages XYZ's (XYZ being nVidia or ATI) hardware.

I just find it interesting that it can be done. Seeing how nVidia's next gen graphics card is going to be closer in design and specifications to ATI current gen, there might be more to this story as time progresses.

Now, what I would like to see in the future is graphics boxes external to the PC with a high-speed link (which I believe has been prototyped already) -- switching vendors would be as easy as selecting the external graphics box as primary for 3D games/apps (via a control panel). I believe preliminary DX11 specs already have the necessary support for this. That way we could use whichever "solution" (ATI or nVidia) works best with the game title one wants to play...now that would be cool.
October 18, 2007 7:09:25 PM

V8VENOM said:
Now, what I would like to see in the future is graphics boxes external to the PC with a high-speed link (which I believe has been prototyped already) -- switching vendors would be as easy as selecting the external graphics box as primary for 3D games/apps (via a control panel). I believe preliminary DX11 specs already have the necessary support for this. That way we could use whichever "solution" (ATI or nVidia) works best with the game title one wants to play...now that would be cool.


So we would only have to have two high-end graphics cards... nice...
It would be like having two separate game consoles: Half-Life 5 is only playable on ATI and Doom 5 only on nVidia, if it goes to the extreme. But you have a point! The GPU maker that has more support from the game engine makers is the winner! And because each game supports one or the other, the loser is the user...

Seems that UT3 is again a more ATI-favored game... alongside many nVidia titles. So nVidia has an advantage because of their earlier DX10 card release (or a better PR division toward the game houses!)

http://www.anandtech.com/video/showdoc.aspx?i=3128&p=4

October 18, 2007 7:22:59 PM

The Source engine prefers ATI cards. I wasn't trying to compare on bang for the buck; I was trying to say there's no muscle to flex on a 3-year-old engine (even with the EP2 engine modifications). It only requires a DX8 card lol.
October 18, 2007 7:59:24 PM

Yeah I have a Sapphire 2900XT 512MB and it played Episode 2 well..... twice. As for DX10, no. Although for this installment Source dropped DX8.1 support, so now only DX9.0 and higher is supported. This game was fantastic, loved it and the battle at the end is an awesome heart-pounding affair. Portal is cool too.

Anyone want to make a bet that you get the Portal gun from Portal in HL2 Episode 3 when you get to the Borealis? My reasoning (and maybe many of you have figured this out as well) is: in Portal you are in Aperture Science Labs using the Portal Gun. Who did Magnusson work for? (Aperture Labs) And the Borealis was one of Aperture Labs' projects. Seems to me that the two will come together next episode. So we can expect not only physics puzzles but portal puzzles as well, or maybe even the two combined :bounce:  . Of course this is a little off topic, sorry, just got a little excited.
October 18, 2007 8:10:10 PM

stemnin said:
The Source engine prefers ATI cards.


Not without work it didn't, initially it favoured nV;
http://www.xbitlabs.com/articles/video/display/radeon-hd-2900-games_7.html#sect0

then after driver updates, the story changed;
http://www.xbitlabs.com/articles/video/display/diamond-viper-hd2900xt-1024mb_12.html#sect0

Quote:
I wasn't trying to compare on bang for the buck; I was trying to say there's no muscle to flex on a 3-year-old engine (even with the EP2 engine modifications). It only requires a DX8 card lol.


Doesn't matter what it requires; that it's still as good-looking as it is using SM2.0, without even really needing SM3.0, shows it had more than enough muscle at launch. People love to talk about how it's not SM3.0 with OpenEXR HDR, but really it still holds its own against many games that require SM3.0 minimum. That it lets you play it on a Radeon 8500 / GF4 Ti is pretty impressive, not something to be laughed at (as if any of those 'SM3.0 only' games couldn't do it with SM2.0 :heink:  ) .
October 18, 2007 8:18:49 PM

Hannibal,

Unless Microsoft makes some serious changes to the DX qualification rules -- but they seem unwilling to do that beyond a superficial level, and ATI/nVidia don't seem to want it either -- only the end consumer wants it, and we all know what happens to the end consumer...they're, well...at the End. ;) 

Of course two PCs could be another option...or maybe even Matrox...so three PCs (joking). But seriously, make the graphics hardware a separate plug-and-play unit -- considering the cooling and power requirements of current video cards it would seem like a better solution; the challenge is just the physics of making a high-speed bus over the required length.

Stemnin,

DX9, no DX8 support. But does that matter? It's the best game so far (including all DX10 titles) in terms of graphics FX and beauty and most importantly has very good frame rates (like I said solid at 60 fps for me with everything cranked up at 1920 x 1080).
October 18, 2007 10:10:10 PM

Yep, I'd love to see a lasso-type external PCIe solution, but like you I'm just curious about the latency issues for cable length.

It'd also be awesome for laptop owners, though: I could buy a slim Penryn-based quad and have an external GF100/R700.
October 18, 2007 11:08:12 PM

Multiple fiber channels perhaps?

I know this is catching on with high-end DAWs, where audio processing is being offloaded to dedicated hardware in real time using a "lightpipe" -- latency is a big, big issue with audio, especially when you have so many stages in the audio processing chain.

Just think of the overclocking possibilities for an external GPU...
October 19, 2007 12:04:32 AM

HL2: E2 may not have DX10 support but it really doesn't need it. DX9.0c still has some great uses and Valve has shown that.

The Source Engine is very impressive for being 3 years old, and with the upgrades you can really see the difference. If you compare the Vortigaunts in Episode 2 to the ones in Episode 1 or HL2, they are much better looking.

Not to even mention the physics effects are on par with what I have seen from Crysis. Especially when you get the lvl 2 gravity gun. I love just ripping the consoles off the walls and hurling them into Combine.

CryEngine2 may have DX10 and SM3.0, but there are still some areas where Source beats Crysis. For one, the facial details are so far unmatched in HL2. I swear that some characters look so real, and the facial expressions such as smiling, frowning or shock look great.

I love that I was able to run HL2 even up to Episode 2 on a 4-year-old computer. I'm talking P4 3.4EE, 2GB DDR, AGP 8x X850XT -- and running it at high resolutions with everything maxed out. Crysis can't do that. Even the minimum requirements are past most of that.

Either way, I don't see a need for the Source engine to run on DX10. I wouldn't mind it, as it would add even more graphics-wise to an already great-looking game. It could even make the G-Man creepier than he already is.
October 19, 2007 1:36:44 AM

V8VENOM said:
Multiple fiber channels perhaps?

I know this is catching on with high end DAWs where audio processing is being off loaded to dedicate hardware in real time using a "lightpipe" -- latency is a big big issue with audio especially when you have so many stages in the audio processing chain.


Yep, but I think that's a ways off. I'm not sure even the audio latency demand is near the requirements we're talking about here (ns vs ms). I haven't kept up with my photonics reading enough to know the current fastest signal processors, but I'd think the double conversion would likely kill a lot of the benefit, since we're worried about 2 meters of travel, not 200+ m, which is where the benefit of fibre channel would start to outweigh copper latency, if at all. That's the biggest barrier now: all the conversion points. I'd have to think about it more, but I suspect you'd have to increase burst width to make up for it, and also remove some latency issues on the card itself by doing more locally -- which is sort of the opposite direction of desktops IMO, which try to unify resources. Doesn't seem practical yet.
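
The conversion-point argument is easy to put rough numbers on. Everything below is an assumption for illustration (propagation at roughly 2/3 the speed of light in both media, and a hypothetical 10 ns per electrical/optical conversion), not a measured value:

```python
C = 3.0e8                  # speed of light in vacuum, m/s
V = (2.0 / 3.0) * C        # assumed signal speed in copper or fibre

CONV_NS = 10.0             # hypothetical cost of one E/O conversion, ns

def one_way_latency_ns(distance_m, conversions):
    """Travel time plus conversion overhead for one direction, in ns."""
    travel_ns = distance_m / V * 1e9
    return travel_ns + conversions * CONV_NS

copper_2m  = one_way_latency_ns(2.0, 0)    # ~10 ns, pure travel
fibre_2m   = one_way_latency_ns(2.0, 2)    # ~30 ns, conversions dominate
fibre_200m = one_way_latency_ns(200.0, 2)  # ~1020 ns, conversions are noise
```

With these made-up figures, at 2 m the conversions triple the one-way latency, while at 200 m they disappear into the travel time -- which is the crossover being described above.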

Quote:
Just think of the overclocking possibilities for an external GPU...


Yeah, especially for those of us up north here. I could just pop my VPU out the window in -50 temps, have the display cable pop back in to my monitor, and voila, a 50% core & VRAM boost. :sol: 
October 19, 2007 2:14:58 AM

V8VENOM said:
XBOX360 -- not the same; similar, but not really the same. Many of the new DX10 titles are introducing motion blur (over-introducing it) because the hardware and new DX10 functions lend themselves well to accelerating this process/FX. Motion blur can be done in DX9c (actually there is no limit to what you can do if you want a slide show) but isn't found in many PC titles because there is limited accelerated support for it.

Not sure what you are saying either -- where did cost come into play?? This isn't a thread about cost and bang for the buck (why do so many folks always divert a thread down this road??) -- it was never mentioned at all. So please, no need to justify your purchase/video card.

It's one title (and as far as I can tell) the only title that is running faster on my 2900XT 1GB card over my nVidia 8800GTX 768MB card. But the original comment was about HL2 ep 2 still being DX9c and doing a VERY good job with DX9c -- on a visual level I think it is far better than Crysis MP Beta -- but that is still beta.

What has surprised me is that this is proof that the 2900XT can indeed show some amazing muscle with pristine visual quality -- but it does indeed appear that ATI must have worked very closely with Valve to make it that way, just as nVidia works very closely with other gaming companies. It almost seems like this is a battle behind the scenes -- as in, who can convince the game developers to code in such a way that it leverages XYZ's (XYZ being nVidia or ATI) hardware.

I just find it interesting that it can be done. Seeing how nVidia's next gen graphics card is going to be closer in design and specifications to ATI current gen, there might be more to this story as time progresses.

Now, what I would like to see in the future is graphics boxes external to the PC with a high-speed link (which I believe has been prototyped already) -- switching vendors would be as easy as selecting the external graphics box as primary for 3D games/apps (via a control panel). I believe preliminary DX11 specs already have the necessary support for this. That way we could use whichever "solution" (ATI or nVidia) works best with the game title one wants to play...now that would be cool.
Half-Life 2's lighting engine is complete **** in comparison to Crysis' dynamic lighting with volumetric soft shadows. Hell, even the new shadow effect they added with the flashlight doesn't come close, because there is still always a shadow on the ground under characters (wtf?), and the flashlight is the only light that can make objects cast dynamic shadows. That being said, Half-Life 2: Episode 2 does run very well, and the graphics are still pretty good for the performance it provides in comparison to many newer titles.
October 19, 2007 2:56:25 AM

I forget where the links are, but I read that Valve has even refused to go to DX10 for the foreseeable future; they're happy on DX9. They also bashed the PS3 in the same article. So no DX10 engine in the works for a little while more.

Also, as far as I know, Valve's monthly Steam hardware survey is still the most effective and widespread effort of its kind. Through the data collected, Valve has determined that the majority of their players still don't have DX10-capable systems -- I mean, we're talking something like 2.5% that do. The most common card was something like the GeForce 6600GT? So they're designing their engine with relevant data in mind. They're really designing it for the people who are going to be playing it.
October 19, 2007 11:24:35 AM

V8VENOM said:
But seriously, make the graphics hardware a separate plug and play unit -- considering the cooling and power requirements of current video cards it would seem like a better solution - just the physics of making a high speed bus over specified length.


Yep... The cooling side would be a thing to consider. I myself have an external sound card (for electrical-interference reasons...). The bad thing is that it's not so easy to move the computer, and I think external graphics boxes are not going to be very popular, so the price will be high...
But you are definitely right about the advantages of modularity and cooling. Only the future will tell.
October 19, 2007 5:04:49 PM

HeyYou,

You lost me there: because one adds another light source (the flashlight), why would another light source (the sun) go away? That doesn't happen in the real world. In the real world you have multiple shadows from multiple sources, and they interact with each other.

Hopefully the Crysis final version (Nov 16th) will be better than it currently is -- because right now I'd say graphically Crysis MP Beta is a 7 out of 10 compared to HL2 at a 9 out of 10. Factor in frame rates and Crysis MP Beta is about a 4 out of 10 compared to HL2 at 10 out of 10.

As for the external GPU -- agreed, I think width "might" be the solution. But on the DAW front, especially when working with 5.1 or 7.1 surround, you do need ns latency. For basic audio work, sure, 7-10 ms is OK, but when you add in multiple buses each with 4-7 FX processors, 32+ tracks, and then virtual instruments, you soon find a need for ns latency -- hence why "lightpipe" external audio processors exist. Sure, this is very high end, but so would a graphics solution be.

Converters would be a big latency introducer -- unless of course we get to a point where conversion is no longer needed, but that is probably way out in the future, 10-20 years, maybe more.
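
The way per-stage buffering stacks up in an audio chain can be sketched with simple arithmetic (the buffer size and stage count below are made-up illustration values, not figures from any particular DAW):

```python
def buffer_latency_ms(buffer_samples, sample_rate_hz):
    """Delay contributed by one buffered stage, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

# One 64-sample buffer at 48 kHz is only ~1.33 ms on its own...
single_stage = buffer_latency_ms(64, 48000)

# ...but with input, output, buses, and a few buffered FX stages in
# series, those small delays add up across the whole chain.
stages = 8  # hypothetical chain depth
chain_total = stages * single_stage  # ~10.7 ms end to end
```

That compounding is why offloading stages to dedicated hardware, rather than queuing them all through the host, pays off.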
October 19, 2007 6:11:16 PM

V8VENOM said:
HeyYou,

You lost me there: because one adds another light source (the flashlight), why would another light source (the sun) go away? That doesn't happen in the real world. In the real world you have multiple shadows from multiple sources, and they interact with each other.

Hopefully the Crysis final version (Nov 16th) will be better than it currently is -- because right now I'd say graphically Crysis MP Beta is a 7 out of 10 compared to HL2 at a 9 out of 10. Factor in frame rates and Crysis MP Beta is about a 4 out of 10 compared to HL2 at 10 out of 10.
No matter the scene, characters will still have the same shadow directly under them; it's as if they each have a light right above them. The method Valve uses is vastly inferior to Crytek's, and most developers wouldn't use it with their "Next-Gen" games.

Half-Life 2: Episode 2
http://img248.imageshack.us/img248/3439/hl2200710191330...
http://img91.imageshack.us/img91/8871/hl220071019133526...
http://img337.imageshack.us/img337/9428/hl2200710191334...
http://img256.imageshack.us/img256/7861/hl2200710191332...

Crysis MP Beta:
http://img86.imageshack.us/img86/2299/crysis20071019134...
http://img153.imageshack.us/img153/3134/crysis200710191...
http://img504.imageshack.us/img504/1098/crysis200710191...
http://img148.imageshack.us/img148/6109/crysis200710191...

Even though Half-Life 2: Episode 2 isn't groundbreaking on a graphical level, I still love the game.
October 19, 2007 7:52:05 PM

I see what you're saying, and I have to agree with that. But on a texture level, Crysis MP Beta appears washed out to me. And the chickens and frogs don't add much to the experience other than reducing frame rates even more.

But if you really wanna pick on details, Crytek's "wind" isn't really wind -- or at least I don't see any real-world emulation there, just stuff moving back and forth -- no real wind direction or speed.

October 19, 2007 11:07:21 PM

If you want wind effects or any sort of weather effects, then Oblivion is great for those. I remember when it had a major thunderstorm and the trees were swaying together and the rain was being thrown in various directions. It was crazy, really.

I for one would hope that Crysis DX10 would have better shadows, since DX10 enables higher-resolution shadows. But still, even without DX10, HL2 has a lot of things Crysis does not -- the way people react to you when you shine a flashlight in their faces, for one. Oh, and Crysis will never have the G-Man. No one will.
October 19, 2007 11:25:46 PM

You miss V8V's point: you can make stuff look kinda windy, but even in Oblivion the wind is still not wind. I love Oblivion, but if either game were able to do true wind effects -- increase resistance against your walking, change the arc of arrows or bullets, blow over objects or make debris fall with the wind -- that would be awesome, but it would also require the power of a PPU or VPU physics IMO.

Anywhoo, I think that level of realism is still a long way off.
October 20, 2007 1:01:56 AM

The fact that they have still left wind out of games makes me laugh. Total Annihilation had wind and that was in 1997, and it had a physics engine and it was true 3d on a bmp sneakily disguised as a texture. Old games were so much better. and along came source and said "first ever physics, w007 li3z''

I have not had any problems with HL2 EP2, which looks much better than Oblivion (badly made, badly textured, and just generally bad if you actually get closer than 1 meter to a wall). The snow is not even volume dependent, it is just a bad overlay, and so is the rain.

Crysis will definitely be a lot better looking; the beta is just the beta. Crysis is going to feel real, while Half Life 2 Episode 2 feels like a smoother Half Life 2, and thus a less real Half Life 2. The floors are not nearly as well textured as those in Crysis, but overall it is very well polished, and for the low requirements it looks pretty damn good. As far as I can see, Crysis will take a LOT more power to run, because in HL2 EP2 you get a LOT, and I mean a LOT, of object fade -- the leafless dead trees disappear at 50 meters or so, whereas Crysis keeps your trees (or turns them into crappy flat textures) at a much larger distance. They were pretty sneaky with Half Life: they let you go outside, but it is always in a valley and you never get to see very far. At least you don't need to buy a new PC for it, but it is also only about 3.5 hours long, so I think Crysis will wipe the floor with it. Also, if I EVER get denied a Combine sniper rifle EVER again, I will personally march down to Steam and kick all of the developers in the nuts. HL2DM is getting very old very fast, and Q3 is dead, so I need a new multiplayer game; I am hoping Crysis will fill this multiplayer void.

Oh, I think you are having problems with your card because it is the 1GB version; it gets even less attention than the 512, which is quite a lack of attention already. And if you think you are missing out on the motion blur, you might just not be seeing it because it is very slight (and thus not intrusive, and thus hard to see unless you flick your mouse). Hope they tweak that in Crysis multi.
October 20, 2007 1:17:10 AM

hannibal said:
So we would only have to have two high-end graphics cards... nice...
It would be like having two separate game consoles: Half-Life 5 is only playable on ATI and Doom 5 only on nVidia, if it goes to the extreme. But you have a point! The GPU maker that has more support from the game engine makers is the winner! And because each game supports one or the other, the loser is the user...

Seems that UT3 is again a more ATI-favored game... alongside many nVidia titles. So nVidia has an advantage because of their earlier DX10 card release (or a better PR division toward the game houses!)

http://www.anandtech.com/video/showdoc.aspx?i=3128&p=4


I believe they axed support for DX8, DX9a, and DX9b for HL EP2.
October 20, 2007 3:35:46 AM

Graphics aren't everything.
October 20, 2007 6:15:08 AM

lol, I'd take the graphics in Crysis any day.
October 20, 2007 8:12:26 AM

I wouldn't.. It isn't nearly as impressive as everyone claims. Reminds me of how everyone boosted Oblivion till I got it and was so disappointed I downloaded almost 2 gigs of graphical enhancements to make it bearable.
October 20, 2007 9:53:56 AM

Rabidpeanut said:
HL2DM is getting very old very fast, and Q3 is dead, so I need a new multiplayer game; I am hoping Crysis will fill this multiplayer void.


TF2 not floating your boat?
October 20, 2007 9:58:53 AM

TheGreatGrapeApe said:
Why would you believe that?

I believe you've got that wrong, reqs still say DX8 for HL2ep2 on Valve's site;

http://www.steampowered.com/v/index.php?area=game&AppId...


Source still does support DX8. Forcing DX8 in TF2's launch options had been a common workaround for a lot of the beta crashing people were suffering. Unfortunately, the game does take a pretty serious graphical hit when using DX8 as well.
October 20, 2007 4:32:16 PM

Oblivion didn't do it for me either; I was like "that's it?" -- there are a few nice shots (if you happen to find the right place) that come together well. But overall, I wasn't that impressed with Oblivion either.

I don't know how Crysis will ultimately look -- I hope a lot better than the MP Beta. But if one has to dial down the Crysis graphics to get the same frame rates as HL2 EP2, then that sorta defeats the purpose, no?

Yes, wind as in real directional wind with variable velocity, not just stuff moving back and forth. Physics engine, or here is a thought: use one of those 4 CPUs in a quad -- hell, use two of them -- to do wind. Should be more than enough processing power.
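
For what it's worth, the per-projectile cost of real directional wind is tiny -- a few multiply-adds per tick. A toy sketch (all constants are made-up illustration values, not from any engine):

```python
def step(pos, vel, wind, dt=1.0 / 60.0, gravity=-9.8, drag=0.1):
    """One physics tick: drag pulls the projectile's velocity toward
    the wind's velocity; gravity acts on the vertical axis."""
    ax = drag * (wind[0] - vel[0])
    ay = gravity + drag * (wind[1] - vel[1])
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

def downrange(wind, v0=(30.0, 30.0)):
    """Launch from the origin; return x where the arc lands."""
    pos, vel = (0.0, 0.0), v0
    while True:
        pos, vel = step(pos, vel, wind)
        if pos[1] < 0.0:
            return pos[0]

calm_x = downrange(wind=(0.0, 0.0))
tail_x = downrange(wind=(15.0, 0.0))  # a tailwind carries the shot farther
```

Run both cases and the tailwind shot lands farther downrange; scale that across every arrow and piece of debris in a scene and a spare core or two could easily absorb it.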

I think the hardware is here today to do all this and more, just gotta wait for the games/software to catch up.
October 20, 2007 6:29:20 PM

I doubt we've seen Crysis max settings yet. After all, it is just the demo at the moment; there probably aren't HQ/UQ textures or anything yet.

And it isn't DX10-capable atm either.
November 4, 2007 3:32:22 PM

Well, I for one enjoyed Oblivion, as it gave you a lot of open world, and I'm not sure what you ran it on at the time, but it looked really good to me -- especially if you downloaded the user-made mods for the characters (for some reason they all look like 40+) and textures. It had a lot of other things besides the graphics that I liked, such as the story, the different abilities, and a lot more.

HL/HL2 so far has been a captivating series. It has a great story, and they end it each time leaving you asking questions about what will happen next. Not to mention that Valve was the first to include some way of grabbing an opponent or object and hurling them through the air at another person. A lot of games have tried to emulate the gravity gun, but it's still not the same.

I downloaded the Crysis SP demo and played it at 1280x1024 on very high settings and was not impressed. I mean, yeah, the graphics are amazing -- they use DX10 motion blur and god rays. It looked nice, but the story was somewhat confusing. I guess I will have to wait till it comes out and play the whole thing.

As for MP with Crysis, that's not going to be as popular as HL2 MP or TF2. With the requirements of Crysis there will not be that many people who can run it, especially in MP with 16 people running around. Now, TF2 is easy to run on old systems, and plus it's different than most MP games: it's cartoony, fun, and each character is different with different skills.

I just think that Crysis has a lot of hype. Yeah, it has nice graphics and can use DX10, but in comparison HL2E2 on DX9 can compete in a lot of places. And I am sure if Valve ever decided to go DX10 in a patch/update, or even for Episode 3, they could easily ravage Crysis. But they keep in mind that the bulk of PC gamers are not the "I have an uber-high-end dual-socket quad SLI/Crossfire machine" crowd. Most PC gamers have low- to mid- to high-end systems, so they have to keep the game playable with nice graphics for everyone.

I wonder how Crysis will look at 1024x768 on low settings....
November 5, 2007 3:46:14 PM

Depends: if Intel delivers on the 2X performance increase with their soon-to-be-released CPUs, and nVidia delivers on their 2X promise with their soon-to-be-released GPU, then we can at least get double-digit frame rates rather than single-digit at higher resolutions (1920 x 1080 or higher) with max eye candy turned ON.

But so far I'm NOT impressed with DX10 at all. I installed Microsoft's much-hyped "FSX Acceleration" and set it to DX10 mode, and frame rates were no better while the texture detail was terrible -- it made the scenery "blurries" (low textures) even worse -- definitely not a showcase for DX10.

So far, the best looking game I've played is still HL2 EP2 and that is all DX9.

November 5, 2007 4:37:07 PM

V8VENOM said:
So far, the best looking game I've played is still HL2 EP2 and that is all DX9.
So you haven't played Gears of War I imagine?
November 5, 2007 4:50:08 PM

Considering it is NOT supposed to hit retail till tomorrow (Nov 6th), and it is an XBOX 360 port -- I'm gonna wait and see on it.

I have seen some screenshots, but the close-up texture quality doesn't appear to be as good as that found in HL2 EP2.
November 5, 2007 5:06:19 PM

V8VENOM said:
Considering it is NOT supposed to hit retail till tomorrow (Nov 6th), and it is an XBOX 360 port -- I'm gonna wait and see on it.

I have seen some screenshots, but the close-up texture quality doesn't appear to be as good as that found in HL2 EP2.
Half-Life 2: Episode 2's textures are horrible; you're crazy if you think this game looks better than Gears of War on either platform. :lol: 
http://img248.imageshack.us/img248/3439/hl2200710191330...
http://img91.imageshack.us/img91/8871/hl220071019133526...
http://img337.imageshack.us/img337/9428/hl2200710191334...
http://img256.imageshack.us/img256/7861/hl2200710191332...
November 5, 2007 5:11:26 PM

My HL2 EP2 textures don't look like that. Are you sure you have them turned up and the ATI Control Center settings set up correctly? Do your screenshot .jpgs have high compression?

Crazy, definitely. I'd need to see the game on my PC before I comment; screenshots don't always do it justice.
November 6, 2007 8:59:50 PM

I have to agree with V8Venom. There has to be some setting turned down or something wrong with your setup, because even my textures look better than that, and I can only go to 1280x1024. I notice that in some shots they are very blurry where they blend the rock with the dirt. Something is wrong.

And V8Venom, do you always have it set to 1920x1080, since you have a Crossfire config?

All I can say, though, is that Gears of War will not be the normal port. It's going to be like Halo: a direct port with more weapons and everything. Or so they say.

But in real terms, I doubt GoW will have the same quality as HL2 E2. The physics and people are the best of anything I have seen.

Also, DX10 changes a lot, V8. I wish there was a game where you could set it to DX9, take a screenshot, and then switch to DX10 to take a screenshot and see the difference. But then again, I love god rays.