First, I must say I was amazed, positively amazed to discover how well the Source Engine runs and looks on a mediocre configuration such as mine (Radeon 9600SE, 512MB RAM, P4 2.8GHz). I was able to play at a very acceptable framerate with models, textures, shaders and shadows on high, at 1024x768 with 4x AA (looks awesome) and 16x AF. As a matter of fact, I still settled for medium textures, low shaders and shadows, trilinear filtering and 2x AA, since the only effect I could perceive was a performance improvement.
Yes, that was a refreshing experience, because after having tried The Elder Scrolls IV: Oblivion on my system, I thought I would be eternally stuck with older games: it just wouldn't run decently. It was the graphics or the framerate, but never both. Source delivers everything even on a low-end system, and that blows me away.
I was also amazed to find out that, unlike Oblivion, HL2EO used an HDR technique that my video card could handle. Finally, I could see with my own eyes, in real time, what it looked like!
I must say I wasn't impressed at all. In fact, I was happy to go back to normal lighting, regain a few precious fps, and also get much more natural, realistic lighting.
From what I've seen in screenshots, HDR seems to do a fantastic job in Oblivion, whereas normal lighting looks pale, static, and unconvincing. (I could only try normal and bloom, and definitely settled for bloom.)
In HL2EO, it is a different matter. Valve had already designed the original HL2 with only normal lighting and tried their best to make it look realistic: HL2 actually looks very good even though it doesn't have HDR. Now they have added HDR on top of everything. And is it good? No.
My biggest concern with HL2EO HDR is that every light source becomes a 2000W projector that can set you on fire from a respectable distance. For example, as we (Gordon and Alyx) were crawling through a basement, we discovered a little computer room with a small vertical neon light on the wall. And of course, there was a patch of blinding, intense, pure white light on the opposite brick wall, and you had to stare at it for about 6 seconds before you could see any detail; even then, it had that radioactive-retina-burn green shade. How ridiculous is that? Switching to normal lighting, the light was much more diffuse, soft and realistic.
If you want to see for yourself, watch the "Half-Life 2 High-Dynamic Range Rendering Demonstration" at GameSpot. The light is way too pure, too intense, and the contrasts between light and darkness are ridiculously exaggerated. You never see that in real life unless you stare at the sun. (Actually, the scene where you are staring at the sun looks really good in HDR.)
And that's really the only thing HDR does in this game, apart from lowering your framerate. Right now, this technique is just show-off; it lacks realism, moderation and precision. It makes the world look like you're watching it through night-vision goggles, getting blinded at every single decent light source.
While the normal screenshot looks believable, though a little too dark in the foreground, the HDR shot, while it does a good job of lighting up the foreground, is a total disaster in the background. It might look funny, but it's not realistic.
The first screenshot is a total lighting mess; and that's just because I stared at the (slightly) darker sky for 5 seconds before taking it. The blinding effect fades out in a strange way, but 3 seconds later it looks good.
I didn't take more screenshots because switching from HDR to normal is a 2-minute torture for my system.
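For what it's worth, the delayed blinding-then-fading behavior described above is usually an auto-exposure ("eye adaptation") loop: the renderer's exposure chases the scene's average brightness with an exponential lag. A rough sketch with toy numbers of my own choosing (not Valve's actual code):

```python
import math

def adapt(current_exposure, target_exposure, dt, tau=2.0):
    """Move exposure toward the target with time constant tau (in seconds)."""
    return target_exposure + (current_exposure - target_exposure) * math.exp(-dt / tau)

exposure = 1.0   # adapted to a dark basement
target = 0.05    # correct exposure for a bright neon-lit wall
for second in range(6):
    exposure = adapt(exposure, target, dt=1.0)
# After ~6 seconds the exposure has mostly converged on the target, which
# is why the wall stays blindingly overexposed for several seconds before
# any detail appears.
```

If the time constant is tuned too long, or the target exposure too aggressive, every decent light source blinds you for seconds at a time, which matches the complaint above.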
Valve saturated their engine with countless bloom effects and tone mapping to make it appear as if it's HDR, when really it's not true FP16 HDR, which is why your machine appears to run it so well. Don't get me wrong, the Source engine is quite efficient; however, the lighting effects that you're seeing are over-the-top wannabe showplay BS.
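For illustration, here's roughly what "bloom plus tone mapping" means. This is a toy sketch of my own (not Valve's shader code), using the well-known Reinhard operator:

```python
def reinhard_tonemap(luminance):
    """Reinhard operator: compresses [0, infinity) into [0, 1)."""
    return luminance / (1.0 + luminance)

def fake_hdr_pixel(scene_luminance, bloom_strength=0.3, bloom_threshold=1.0):
    # Bright-pass: anything over the threshold spills over as bloom.
    bloom = max(0.0, scene_luminance - bloom_threshold) * bloom_strength
    # Tone-map the combined value back into displayable [0, 1) range.
    return reinhard_tonemap(scene_luminance + bloom)

# An overbright source (luminance 50) lands near pure white -- the "blinding
# neon" effect -- while a dim surface (0.5) stays mid-grey.
print(fake_hdr_pixel(50.0))  # ~0.985
print(fake_hdr_pixel(0.5))   # ~0.333
```

The point is that this whole pipeline can run as a post-process on ordinary integer buffers, which is part of why it's so cheap compared to a true floating-point framebuffer.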
Glad to hear you're liking the game though.
As you mentioned the "blinding neon light": that's exactly what I'm talking about. True HDR isn't like that; it's more along the lines of your second experience: subtle, soft and more natural looking.
Valve saturated their engine with countless bloom effects and tone mapping to make it appear as if it's HDR, when really it's not true FP16 HDR, which is why your machine appears to run it so well.
OpenEXR FP16 HDR is NOT the only version of HDR; the idea is much older than their limited description (which is strict for marketing reasons, IMO). In fact, the introduction of HDR on VPU for SIGGRAPH 1998 was integer based, and Valve uses a more advanced adaptation of the same technique. It isn't quite as advanced as ILM's, but then again, no one fully utilizes ILM's methods either; heck, FP32 per channel is in their specs, and FP64 per channel is a version/implementation of HDR for photography. So really, who can say what HDR truly IS, other than the originators back in the late 80s?
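At the pixel level, the practical difference between these formats is dynamic range per channel. A quick demonstration using Python's built-in half-precision packing (my own illustration; the behavior shown follows from the IEEE 754 half-float format that OpenEXR's FP16 pixel type uses):

```python
import struct

def to_fp16(x):
    """Round-trip a value through IEEE half precision (OpenEXR's FP16 pixel type)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_int8(x):
    """Round-trip through a conventional 8-bit integer channel: clamp to [0, 1]."""
    return round(max(0.0, min(1.0, x)) * 255) / 255

sun = 10000.0  # an example luminance far beyond display white
print(to_int8(sun))  # 1.0 -- clipped; the overbright information is gone
print(to_fp16(sun))  # 10000.0 -- preserved (FP16 tops out around 65504)
```

With the integer channel, everything brighter than display white collapses to the same value before tone mapping ever runs; the FP16 buffer keeps the actual luminance, which is what lets true HDR do adaptation and bloom from real scene data.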
Since when can a 9600 run HDR and I'm stuck with Bloom?
A 9600 can run full HDR in Half-Life 2: Episode One with the latest ATI drivers. At least, that's my experience. It cannot and will never do the Oblivion HDR however.
In most video games, there is something wrong about HDR. Theoretically, it should look more realistic, right? Well, it doesn't. Here's a classic example, from AOE3 I think: WITH/WITHOUT HDR
The HDR technique does a good job, once again, of lighting up the foreground correctly: the facade looks too dark with normal lighting. However, there are some problems: the sky is completely messed up, the clouds have disappeared from view, the horizon has also disappeared, and the lighting looks generally blurred.
The Bethesda team has done a good job of implementing HDR correctly, I think. However, even Oblivion's HDR has its flaws. Look how the clouds in this screenshot are saturated. This never happens in reality.
My biggest concern with HL2EO HDR is that every light source becomes a 2000W projector that can set you on fire from a respectable distance.
That's odd, because I thought HL2EO had the best HDR I've seen so far: it was only really noticeable when you looked for it, just like real life. Most games, and 'Lost Coast', seriously overdid it just so people would see it was there and go 'oooh'.
Maybe it's an SM3.0 vs SM2.0 thing, and the 9600 can't render it properly.
I personally am impressed with HL-2: EO's implementation of HDR. I think the only problem is that it may be a little over-used and the bloom effects are a bit too intense. But, nevertheless, I believe that HDR truly improves the realistic look of the game.
You ever walk outside, especially during a sunset, and the lighting feels so much more "golden" and "vibrant" than any game you've seen before? I think that using HDR achieves that, so I believe it looks way better with it enabled on full.
Since when can a 9600 run HDR and I'm stuck with Bloom?
I'm thinking the same thing. I have to run 1024x768 to run HDR with decent framerates (instead of 1280x1024) yet you can run full HDR effects with the 9600 without a horrible performance hit?
P.S. I must agree, the Source engine is very efficient and smooth. I believe that HL-2's graphics quality is way better than BF2's, yet look how much better Source runs as opposed to BF2.
Ya, I have some of the mods. They do improve it, but that is not saying much. I think it is unacceptable that I have to dl a mod to get the game to look passable after paying $50 for a AAA title. They should have worked the craptacular long-distance textures out before release.
Even w/ the mods it is still texture work circa 2001, IMO. It's not just my system either; you can see it in shots from the Xbox 360 and other PCs when you look at far distant mountains. Morrowind at least looked uniform. Oblivion looks so good close up that the distant crap sticks out like a sore thumb to me.
Fair enough, I just thought that the "yellowish" tint was b/c of the brown Mexican city you were in, not HDR. I think that if you compare it to any video of Mexico City or most Middle Eastern cities on the news, you get the same yellow-brown tint to everything. I felt that what you noticed about HDR in that game was how much of a difference the sun at your back vs. in your eyes makes, and that moving from outside into a bunker was tough w/o caution, as a dude could be right in the door w/o you knowing it.
Still, if you don't like the color scheme then you just don't like it. I would suggest not moving to the Middle East, though, if that is the case.
This is definitely opinion, but GRAW imho looked like total ass with HDR.
It gave the environment a yellowish/red flavor that made me want to puke.
I agree; the HDR in GRAW is ass. It's not the yellowish tint that bothers me (I think it's realistic) but it's the HDR that just isn't very good.
Not to mention I get ~20FPS in that game at 1280x960 with most settings on medium. It's the same story w/ Rainbow Six: Lockdown. Such a shame, porting the Rainbow Six series from a console ruined it.
It's the same story w/ Rainbow Six: Lockdown. Such a shame, porting the Rainbow Six series from a console ruined it.
wait a minute... porting Rainbow Six from a console ruined it?! You are backwards there methinks
Rainbow Six was originally a PC game, as was Ghost Recon. RSix on the PC created the tactical shooter genre, and was incredible. The first GR was also killer. Both were destroyed when ported TO the console. To even think that the console versions were good (read: better than the PC versions) is laughable.
GR was ported to the console and did well commercially (I honestly never played it), but the sequel was console-only and bombed. This third one, GRAW, is just a very good game (360 or PC) and, like the original GR at the time, requires a massive system to run it. I am sorry that your machine can't do well, but that was a similar situation w/ me on the first GR: I tried to run it w/ a TNT2 32MB card and failed, then upgraded to a GeForce 3 just for that game.
Anyway, if you don't like the lighting that is fine; you can have a differing opinion. My position is that I like the HDR in both HL2 and GRAW better than in Oblivion (not that HDR in Oblivion is bad, mind you). You may find that the HDR in GRAW looks much better if you run it on a system that can enable everything, which first requires 512 megs on your video card. (FiringSquad had an article that showed the game uses ~430 megs on the card if everything is enabled.)