Graphics Cards Over the Edge: Playing Oblivion

robwright

Distinguished
Feb 16, 2006
1,129
7
19,285
Oblivion, the fourth in the Elder Scrolls RPG series, is a 3D role-playing game that ups the ante for gaming graphics quality. It features lighting effects with HDR rendering, unbelievable visibility, weather and daylight changes, hyper-realistic landscapes, and detailed game characters. All of these make outrageous demands on PC hardware. Is it worth it? Our screenshots show you exactly what your investment will buy.
 

GM

Distinguished
Apr 14, 2004
34
0
18,530
Rob,

Great article. Being Canadian, I like to see ATI taking top honours in the frame rate and quality categories... :)

One thing I noticed however. I was looking at the screen shots on Page 4, specifically the water effect (oblivion-geforce-vs-radeon-2big.jpg) and something didn't quite look right to me... putting my nagging internal voice away, I continued on in the article. Then, on Page 5 when looking at (oblivion-wasserbig.jpg) it hit me... the bridge and dock are not reflected in the water. When looking at the water reflection just below the bridge, instead of seeing the reflected bridge in the water, you see the sloping hill... as if the bridge wasn't there at all... Listening to that inner voice, I went back to (oblivion-geforce-vs-radeon-2big.jpg) and noticed the same effect there.. the columns and statue go into the water, but are not reflected back the same way that the spire on top of the hill is.

Is this an effect of the graphics cards' water effects, or due to a difference in the way the objects are defined in the game? I'm assuming that the properly reflected objects are non-interactive game elements, like movie backdrop paintings, and that the incorrectly reflected objects, like the bridge, can be used by the character. If this is the case, then the gamer can get hints during gameplay by examining the way things are rendered to see whether they can, or must, interact with them (for example, a door in a hallway that can be opened might be rendered differently than a door that is just part of the backdrop).

Thoughts?

GM.
 

Flakes

Distinguished
Dec 30, 2005
1,868
0
19,790
Before I start, I'm not an nVidia fanboy, but I didn't actually think the ATI pictures looked right. There were visible stripes on the pic with the horse, and the other screenshots had purple tints to them; the GeForce looked more realistic. Just personal opinion, I guess. I've seen some lighting effects look better in other games using an ATI card.
 

ivoryjohn

Distinguished
Sep 20, 2001
174
0
18,680
I'm pretty sure this was addressed in one of the many tweaks for the game. The engine actually supports numerous visual effects that were toned down in the shipping version to promote compatibility with all machines.

An enthusiast, with tweak guide in hand, can vastly improve the visual effects and the framerate (especially with a powerful dual-core CPU and a top-notch GPU).

Some things didn't work. The article mentioned the shadows on grass and the shadows on self. Most players turn them off for improved visual appearance and FPS.
 

Agraza

Distinguished
Apr 27, 2006
12
0
18,510
Why is the 6800 GT being compared to the X1900 XTX? That is last-generation nVidia against newest-generation ATI.

The only notable difference between a 7900 GTX and an X1900 XTX is that the ATI cards can use HDR and AA simultaneously in Oblivion, whereas a 7900 GTX has to pick between the two due to a conflict of resources.

Regardless of whether it improves performance or not, shadows on self looks horrible when turned on. Whatever effect it's supposed to produce doesn't work in half of the lighting conditions, regardless of your system.
 

enewmen

Distinguished
Mar 6, 2005
2,247
3
19,815
Was this message intended for enewmen?

Agraza said:
Why is the 6800 GT being compared to the X1900 XTX? That is last-generation nVidia against newest-generation ATI.

The only notable difference between a 7900 GTX and an X1900 XTX is that the ATI cards can use HDR and AA simultaneously in Oblivion, whereas a 7900 GTX has to pick between the two due to a conflict of resources.

Regardless of whether it improves performance or not, shadows on self looks horrible when turned on. Whatever effect it's supposed to produce doesn't work in half of the lighting conditions, regardless of your system.
 

twile

Distinguished
Apr 28, 2006
177
0
18,680
I give the graphics of Oblivion mixed reviews. Some of the effects are really spectacular, though it seems Bethesda has fallen prey to the same thing as everyone else: using mapping techniques instead of good ol' polygons.

In my humble opinion, the only time it's completely justified (as of now, this should change in the future) to use different types of mapping to generate the illusion of 3D depth on flat objects is with the non-edge portions of flat or concave items, and in cases where there won't be any loss of realism. Furthermore, in many cases the bumpmapping should only be used for depression and not raised surfaces.

The reasons for this are simple. While having an intricate raised pattern on the handle of a sword, ridges on armor where two pieces were attached, or raised bolts on a metal device all look nice from many angles, when you look at the profile of such an item there is a loss of detail in Oblivion in cases such as these. The sword pattern, armor ridges, and raised bolts are all flat when viewed from extreme angles. Where things should be blocking your view, there is nothing. This first annoyed me when I was talking to someone wearing leather armor and I saw the nice lighting on the raised bumps of his armor where two pieces of hardened leather were attached--however, the illusion was destroyed as my gaze followed it up over his shoulder. The top of his shoulder plate was flat, even though the texture and lighting tried to make it look like the ridge continued. To keep something like this from happening, model geometry "should" only be simplified in regions where you will never see a profile of the affected surface, such as a piece of glass in a window or boards/stones in a room with only concave corners. Also, the simplifications should be such that the physics of other objects interacting won't give away the flat nature of the model.

Looking on the bright side, at least they didn't abuse bump mapping like Doom III did. I remember when I first saw a traffic cone which had the holes in it bumpmapped in. You couldn't see through the traffic cone.

Textures too... I have them turned up as high as they'll go, but still they're not so kind to the eyes (and I'm not talking about textures on a mountain a few thousand feet away, which are disgustingly simplified. Tom's own screenshots show that off). I understand that they have a huge number of objects that need to have textures, and they can't be excessively detailed. Still, I remember playing Alien vs Predator 2, and even when I was up to my nose against a wall and zoomed in several times, the texture was not pixelated... they used some sort of graphics trick to add more details when you looked closer at something. And that was cool.
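
(If I had to guess, that trick is what's usually called detail texturing: a small, grayscale, high-frequency texture tiled many times over the base texture and modulated in, so surfaces keep visible grain even with the camera pressed right against them. A rough sketch of the idea, not that engine's actual code:)

    def sample(texture, u, v):
        # nearest-neighbour lookup into a tiny 2D list standing in for a real texture
        h, w = len(texture), len(texture[0])
        return texture[int(v * h) % h][int(u * w) % w]

    def shade(base_tex, detail_tex, u, v, detail_scale=8.0):
        base = sample(base_tex, u, v)
        # the detail texture repeats detail_scale times more often, centred on 1.0
        detail = sample(detail_tex, u * detail_scale, v * detail_scale) / 128.0
        return min(255, int(base * detail))

    base_tex   = [[180, 180], [180, 180]]   # flat mid-grey wall colour
    detail_tex = [[100, 150], [160, 110]]   # stand-in high-frequency noise pattern
    print([shade(base_tex, detail_tex, u / 10, 0.0) for u in range(10)])

The base color stays the same no matter how close you get; only the cheap, tiled detail layer keeps adding variation.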

Overall the graphics for Oblivion are rather pleasing, it's just that the "Hey! Let's decrease model geometry and use bump/normal/parallax mapping instead!" approach is starting to bother me. They're *supplements* and not *replacements.*
 

ToddMcF2002

Distinguished
Dec 29, 2003
2
0
18,510
I'm not trying to troll, but this article is losing me.

First off - those screen shots look absolutely horrible. I can't believe no one else is questioning them. What is up with that? Bleached out to the point there are no shadows? That pic of the horse labeled "1900XTX"? What happened there? Are you running brightness and gamma so high that you are seeing all these horrible rings? What did you do to Oblivion???

What exactly was this article about anyhow? It seemed like merely random observations of the graphics engine. You make a comment about the game shaders and how ATI's implementation is superior - but you have no benchmarks? Are you assuming we are cross referencing to the Firingsquad and Anandtech articles?

You have no conclusion to the article. You assume the graphics card is the "key" while the game is also severely CPU-bound. You mention God Mode for some unknown reason - with the astute observation that... it makes the game easier???

Is this Tomshardware???
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
robwright said:
Oblivion, the fourth in the Elder Scrolls RPG series, is a 3D role-playing game that ups the ante for gaming graphics quality. It features lighting effects with HDR rendering, unbelievable visibility, weather and daylight changes, hyper-realistic landscapes, and detailed game characters. All of these make outrageous demands on PC hardware. Is it worth it? Our screenshots show you exactly what your investment will buy.

Well, I have mixed feelings about this article. The first thing I noted was the following line on the second page: "The American version of the game has been available in stores since March 23, 2006." Quite surprising to me; I picked up my copy (without a reserve, mind you) on the 21st, the day it was actually available in stores (the game shipped, to ALL destinations in America and Europe, on the 20th). I personally think that's a bit of an embarrassing error on the part of TwitchGuru.

I'm surprised that so little mention was given to using the .INI file to tweak the visuals. A lot of the complaints people have about the visuals can usually be resolved very easily there. Those settings were hidden away because, often enough, it appears they were things the Xbox 360 couldn't handle for some reason or other, and hence, to maintain that the game "runs on 100% settings," they were removed from the game's options but remain adjustable in the file. At least, that's how it came across to me.

Also, when it comes to performance, I'm scratching my head over the means of testing. Like many benchmarks, I suspect that the testers may have largely been recording during times when the game was loading a new cell, which will effectively bring a system with any video card to its knees; at those points the video card isn't the bottleneck, the CPU is, along with the RAM and even the hard drive. I normally pause during loading and resume when it's finished a few seconds later; as a plus, when the game doesn't have to process AI or physics, the loading goes by about 3-4 times as fast.

At any rate, the performance numbers they gave were a bit grimmer than actual performance seems to me. On my X800 XT I average 20-25 fps outside (25-30 fps in clearer mountainous areas, 15-20 fps in the deep parts of the forest), which is good enough since I'm not playing Counter-Strike. I'm also running at 1024x768, with 6x AA, 8x AF, and all of the settings ranging from 100% to well beyond that, using the .INI file. First off, I fixed the water so that it reflects everything. Then I adjusted the scaling distances, and then the blood decal stay length. As for the AF, I note that raising it to 16x would mean the game almost always uses significantly more than 256MB, which would cause an additional, undue performance drain on my system. Granted, I can then see the mip-mapping, but the few extra FPS were well worth it.

GM said:
Rob,

Great article. Being Canadian, I like to see ATI taking top honours in the frame rate and quality categories... :)

One thing I noticed however. I was looking at the screen shots on Page 4, specifically the water effect (oblivion-geforce-vs-radeon-2big.jpg) and something didn't quite look right to me... putting my nagging internal voice away, I continued on in the article. Then, on Page 5 when looking at (oblivion-wasserbig.jpg) it hit me... the bridge and dock are not reflected in the water. When looking at the water reflection just below the bridge, instead of seeing the reflected bridge in the water, you see the sloping hill... as if the bridge wasn't there at all... Listening to that inner voice, I went back to (oblivion-geforce-vs-radeon-2big.jpg) and noticed the same effect there.. the columns and statue go into the water, but are not reflected back the same way that the spire on top of the hill is.

Is this an effect of the graphics cards' water effects, or due to a difference in the way the objects are defined in the game? I'm assuming that the properly reflected objects are non-interactive game elements, like movie backdrop paintings, and that the incorrectly reflected objects, like the bridge, can be used by the character. If this is the case, then the gamer can get hints during gameplay by examining the way things are rendered to see whether they can, or must, interact with them (for example, a door in a hallway that can be opened might be rendered differently than a door that is just part of the backdrop).

Thoughts?

GM.

Simple solution to your long post's question: those reflections are deliberately disabled by default. They can be re-enabled through the Oblivion.INI file, under the "water" heading, as a bunch of lines that look like "wReflectetc=0". Simply go through them all and switch the 0s to 1s, and those objects will reflect. By default, only the ground and city walls/towers are reflected.
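
From memory, the block in question looks roughly like this once everything is switched on (a sketch only; the exact key names may differ slightly in your copy of Oblivion.ini, so check against your own file and back it up first):

    [Water]
    bUseWaterReflections=1
    bUseWaterReflectionsStatics=1
    bUseWaterReflectionsTrees=1
    bUseWaterReflectionsActors=1
    bUseWaterReflectionsMisc=1

If I remember right, the Statics line is the one that covers things like the bridge and dock GM noticed, while Actors adds characters and creatures; turning more of them on costs a few fps, since the water has to render the scene a second time for the reflection.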

As for the performance of the game on ATi's hardware, I personally put it down to nVidia's attitude, which I am growing a bit more displeased with each day. It goes along the lines of "oh, we don't need more pixel shaders. It's TEXTURING POWER that games need most."

That attitude cost them the performance crown, BIG TIME, back in 2002, when the GeForce 4 Ti was handed an astounding defeat by the Radeon 9700 Pro. This was covered on this site in the article "ATi Radeon 9700 PRO - Pretender to the Throne." Yes, as hard as it may sound to believe, they were of the exact same generation of cards; the previous generation had been the GeForce 3 and 3 Ti vs. the Radeon 8500.

At any rate, perhaps nVidia might wake up again, and push their architecture in a similar manner to how they did to reach the GeForce 6; I was pleased with that, to see them FINALLY ditch the 4 pipe/8 TMU arrangement that had been seen in every flagship card of theirs since the GeForce 2 GTS. It’s a shame that they’re not moving much since then. I’m actually an unbiased sort of person when it comes to video cards, but I currently can’t say I have a bright outlook for the G80 against the R620, unless they really change course.

Agraza said:
Why is the 6800 GT being compared to the X1900 XTX? That is last-generation nVidia against newest-generation ATI.

The only notable difference between a 7900 GTX and an X1900 XTX is that the ATI cards can use HDR and AA simultaneously in Oblivion, whereas a 7900 GTX has to pick between the two due to a conflict of resources.

Regardless of whether it improves performance or not, shadows on self looks horrible when turned on. Whatever effect it's supposed to produce doesn't work in half of the lighting conditions, regardless of your system.

Well, according to effectively every benchmark, the X1900 XTX handily out-performs the 7900 GTX in this game. As I've predicted since the X1900's release, it likely has a lot to do with the fact that the X1900 has 48 pixel shaders and 16 texture units, as opposed to 24 and 24, respectively, in the 7800/7900. Oblivion generally uses only one color texture per surface on models (though at times up to four on the ground), yet applies at least three, if not more, shader maps to each surface: every surface has at least a normal map for angular detail, a specular map for shininess, and a diffuse map to change how the material reacts to different lighting (some objects, for instance, can seem to change color in ways not directly related to the lighting's color). Further shaders would include parallax mapping to provide more detailed features, namely pock-marks in rock, as well as the active environment-mapping shader for the water. This stacks up to a shader load that dwarfs the texture usage.
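
To put rough numbers on that, here's a toy back-of-envelope model (the per-pixel workload figures are assumptions for illustration, not measurements from the game, and real GPUs are far more complicated than this):

    # Assume each pixel needs T texture fetches and A shader (ALU) instructions,
    # and that a GPU can issue `tmus` fetches and `alus` shader ops per clock.
    # Whichever unit is more oversubscribed sets the per-pixel cost.
    def clocks_per_pixel(tex_fetches, alu_ops, tmus, alus):
        return max(tex_fetches / tmus, alu_ops / alus)

    # Guessed "shader-heavy" Oblivion surface: few textures, lots of shader math.
    tex_fetches, alu_ops = 4, 24

    for name, tmus, alus in [("GeForce 7900 GTX", 24, 24), ("Radeon X1900 XTX", 16, 48)]:
        cost = clocks_per_pixel(tex_fetches, alu_ops, tmus, alus)
        print(f"{name}: ~{cost:.2f} clocks per pixel in this toy model")

Under those made-up numbers the X1900 comes out at roughly half the per-pixel cost, purely because the shader work, not the texturing, is the limiting factor; flip the workload to be texture-heavy and the 24/24 layout wins instead.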

As for the self-shadows, it appears to be a problem of shadow angle; for some ridiculous reason, Oblivion's shadowing system is set up so that, no matter what, characters cast shadows DOWN. This means that standing next to a small campfire in a cave, you cast shadows only on the floor, and not on the ceiling like you should. This seems to cause problems for the self-shadowing, resulting in those "crackling" artifacts with fragments of shadows covering people.

twile said:
I give the graphics of Oblivion mixed reviews. Some of the effects are really spectacular, though it seems Bethesda has fallen prey to the same thing as everyone else: using mapping techniques instead of good ol' polygons.

In my humble opinion, the only time it's completely justified (as of now, this should change in the future) to use different types of mapping to generate the illusion of 3D depth on flat objects is with the non-edge portions of flat or concave items, and in cases where there won't be any loss of realism. Furthermore, in many cases the bumpmapping should only be used for depression and not raised surfaces.

The reasons for this are simple. While having an intricate raised pattern on the handle of a sword, ridges on armor where two pieces were attached, or raised bolts on a metal device all look nice from many angles, when you look at the profile of such an item there is a loss of detail in Oblivion in cases such as these. The sword pattern, armor ridges, and raised bolts are all flat when viewed from extreme angles. Where things should be blocking your view, there is nothing. This first annoyed me when I was talking to someone wearing leather armor and I saw the nice lighting on the raised bumps of his armor where two pieces of hardened leather were attached--however, the illusion was destroyed as my gaze followed it up over his shoulder. The top of his shoulder plate was flat, even though the texture and lighting tried to make it look like the ridge continued. To keep something like this from happening, model geometry "should" only be simplified in regions where you will never see a profile of the affected surface, such as a piece of glass in a window or boards/stones in a room with only concave corners. Also, the simplifications should be such that the physics of other objects interacting won't give away the flat nature of the model.

Looking on the bright side, at least they didn't abuse bump mapping like Doom III did. I remember when I first saw a traffic cone which had the holes in it bumpmapped in. You couldn't see through the traffic cone.

Textures too... I have them turned up as high as they'll go, but still they're not so kind to the eyes (and I'm not talking about textures on a mountain a few thousand feet away, which are disgustingly simplified. Tom's own screenshots show that off). I understand that they have a huge number of objects that need to have textures, and they can't be excessively detailed. Still, I remember playing Alien vs Predator 2, and even when I was up to my nose against a wall and zoomed in several times, the texture was not pixelated... they used some sort of graphics trick to add more details when you looked closer at something. And that was cool.

Overall the graphics for Oblivion are rather pleasing, it's just that the "Hey! Let's decrease model geometry and use bump/normal/parallax mapping instead!" approach is starting to bother me. They're *supplements* and not *replacements.*

Well, part of this has to do with the perspective of the viewer when it comes to the use of detail-related shaders.

First, note that, as the article states, there is NO bump-mapping in use; bump-mapping technically isn't even a shader in the sense used today, as it has been an option for graphics cards since DirectX 6 and 7, while programmable shaders weren't available until DirectX 8.0. Instead, the game uses normal-mapping; while you are correct that bump-mapping is only any good for shading depressed surfaces, normal-mapping improves on it by adjusting the "normal," or surface angle, stored for each texel, so that each one appears lit not just according to whether it's lit, but by how much, since surfaces angled away from a light are lit less than those facing it directly. This has the added advantage that, because of the reduced contrast between two areas, you don't get that "plastic wrap" appearance of bump-mapping.
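
If it helps to see it concretely, here's a tiny sketch of the idea (pure illustration in Python; a real engine does this per-pixel on the GPU, in tangent space, but the math is the same N-dot-L term):

    # A normal map stores a direction per texel; lighting is the clamped dot
    # product of that direction with the light direction, so bumps "shade"
    # without any extra geometry.
    def normalize(v):
        length = sum(c * c for c in v) ** 0.5
        return tuple(c / length for c in v)

    def decode_normal(rgb):
        # normal maps pack XYZ into 0..255; remap back to -1..1
        return normalize(tuple(c / 127.5 - 1.0 for c in rgb))

    def diffuse(texel_rgb, light_dir):
        n = decode_normal(texel_rgb)
        l = normalize(light_dir)
        return max(0.0, sum(a * b for a, b in zip(n, l)))

    flat_texel   = (128, 128, 255)   # points straight out of the surface
    angled_texel = (200, 128, 180)   # a fake "bump" edge leaning away from the light
    light = (0.0, 0.0, 1.0)
    print(diffuse(flat_texel, light), diffuse(angled_texel, light))  # ~1.0 vs ~0.59

Both texels sit on the same flat triangle; only the stored direction differs, which is exactly why the silhouette never changes, as twile complained about.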

As for the actual proper use of normal-mapping: it's used because it's not feasible to increase the polygon count further than it already is, which is quite high, almost always a six-digit number for each outdoor scene in Oblivion. It has to do with the way graphics cards are made; the current generation of "flagship" cards has a lot of pixel shaders (even the 7900 cards have 24, while the X1900s have a whopping 48), yet each has only 8 vertex shaders, a disparity that has largely remained since the introduction of these shaders. To think on it, that's only about 15 or so times as much vertex processing power as the best video cards had around the time of the original Unreal Tournament! So in a sense, these shaders are a NECESSITY to gain "acceptable" visual quality.

Fortunately, a big change is coming in the future, in the form of the "unified shader architecture." Starting with ATi's next-generation chip, and followed at an unknown date by nVidia (I'm crossing my fingers that it might actually be G80), we'll finally see an end to the limited number of vertex shaders; while the top cards of today can only handle 8 vertices per clock cycle, the Radeon X2800s will be able to handle 128. I think then we'll see games start to shift toward having a lot more polygons, and fewer instances of features being described by normal maps. But for now, they're here to stay; how else are you going to make, say, a suit of chain mail look realistic? The polygon count otherwise required for decent-looking rings would be obscene, easily totaling into the millions.
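
A quick sanity check on that "millions" figure, using made-up but plausible numbers (real mail shirts run to tens of thousands of rings; the tessellation here is deliberately coarse):

    rings_per_hauberk = 20_000                  # assumed ring count for one mail shirt
    segments, sides   = 8, 6                    # very coarse torus: 8 segments, 6-sided tube
    tris_per_ring     = segments * sides * 2    # each quad splits into two triangles
    print(f"{rings_per_hauberk * tris_per_ring:,} triangles")   # -> 1,920,000

Nearly two million triangles for a single piece of armor, versus the roughly six-digit polygon budget for an entire outdoor scene; a normal map fakes the same rings for the cost of a texture.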

Lastly, on the topic of the LOD textures, those appear to be due largely to the memory limitations of most hardware, particularly the Xbox 360's. Keep in mind that even without anisotropic filtering, Oblivion uses up a whopping total of some 200MB of video RAM; using AF pushes that further, as I described at the top of this post. It's possible to adjust this through the .INI: open it up, and under the "general" section, look for the line "uGridsToLoad." It will start at 5, indicating that the game loads a 5x5 grid of cells at full detail; the rest will be "LOD cells," which contain only trees (typically as 2-polygon "billboards") and a scaled-down surface and texture, with a normal map made to help offset the low polygon count of each cell. At any rate, set this number to either 7 or 9, but be warned: if you have less than 1024MB of system RAM, this will likely crash your system when you try playing Oblivion, and even where it doesn't, it will still likely have a negative impact on performance, as it increases video RAM usage, possibly well over the recommended limit for a 256MB video card. Of course, users of 512MB video cards will have no problem.
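
For reference, the tweak amounts to a single changed line (back up the .ini first; the warning about system RAM above applies):

    [General]
    uGridsToLoad=7

A value of 7 loads a 7x7 grid of full-detail cells instead of the default 5x5, so roughly twice as many cells are held in memory at once; 9 pushes that to more than three times.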

ToddMcF2002 said:
I'm not trying to troll, but this article is losing me.

First off - those screen shots look absolutely horrible. I can't believe no one else is questioning them. What is up with that? Bleached out to the point there are no shadows? That pic of the horse labeled "1900XTX"? What happened there? Are you running brightness and gamma so high that you are seeing all these horrible rings? What did you do to Oblivion???

What exactly was this article about anyhow? It seemed like merely random observations of the graphics engine. You make a comment about the game shaders and how ATI's implementation is superior - but you have no benchmarks? Are you assuming we are cross referencing to the Firingsquad and Anandtech articles?

You have no conclusion to the article. You assume the graphics card is the "key" while the game is also severely CPU-bound. You mention God Mode for some unknown reason - with the astute observation that... it makes the game easier???

Is this Tomshardware???

I do agree; the article leaves much to be desired. The settings they've used are pretty questionable as well. Thankfully, at least part of it, it seems, is down to the fact that Oblivion looks far better in-game than in screenshots.

As for the performance of the game, I’ve found that at different times, the game can be bound to different components; for many people’s experiences, it can even be the speed of their hard drives as it loads more cells. In the great forest at a high resolution, the game will pretty much ALWAYS be GPU-bound, not CPU-bound. Of course, what component is the bottleneck will change if you, say, have an extremely weak component.

This is actually something that amazes me; many people, and even much of the gaming media, fail to grasp the mechanics of a game’s performance. Comments such as “a faster video card will be all you need to improve gaming speeds,” and the like are far more prevalent than they should be.

As for the benchmarks, it's a real shame they didn't provide any (just like when THG opted not to make benchmarks from 3DMark 2006), as most of the ones I've seen, even at major sites, often seem flawed to me.
 

godlyatheist

Distinguished
Nov 13, 2005
439
0
18,780
nottheking said:
At any rate, the performance numbers they gave were a bit grimmer than actual performance seems to me

Yep yep. My 6800 GS with a 3200+ does 20 fps average at 1280x1024 with 2x aniso, HDR, all effects, and the Coolbits tweak. At 1024x768 I get 30-35 average. This was when I was using the 84.43 beta driver. That driver messes up my icons every time I reboot, so I got rid of it and installed 84.21. Boy, was I surprised when using the older driver!! At 1024x768 I only get 15 fps, and 20s at 800x600. Oh, and the regular driver still messes up my icons. What is going on with these nVidia drivers? I guess beta drivers are not that good of an idea...............
 

SteveO

Distinguished
Dec 21, 2001
75
0
18,630
I have Oblivion on the Xbox 360, and from what I've seen in the article my Xbox 360 has much better graphics and no noticeable performance issues. I own a Sempron 3100+, 512MB RAM, and a BFG 7800 GS OC, with RAID 0. I would like to get my hands on Oblivion for PC so I could compare it to the Xbox 360 version. I'm waiting for an A64 3700+ Socket 754 to arrive.
 

Agraza

Distinguished
Apr 27, 2006
12
0
18,510
Trust me, the 7800/7900 series and X1900 series produce superior graphics to the Xbox 360's, and at higher resolutions. The X1900 XTX, or whatever it's called, is an insane piece of hardware that makes the game shine. Furthermore, there are mods that further improve the appearance of the game, as well as many more graphics customizations present in the .ini and the in-game menu. From what I've heard, the Xbox version only lets you adjust the brightness of the game.
 

SteveO

Distinguished
Dec 21, 2001
75
0
18,630
I'm sure they won't be equal. I spent much more on the PC. The Xbox 360 has undoubtedly better graphics for around 300 dollars. I don't have the money to buy every new PC technology that comes out. My system runs Neverwinter Nights at full blast with very little performance drop on my 7800 GS. I'm comparing that to my previous card, an ATI X1600, which really sucked. The Xbox 360 is the better buy for me: price/performance and great graphics, courtesy of Microsoft and ATI. I just want to compare the two Oblivions to see the actual differences in performance and graphics. There will never be an equal comparison, even with the same graphics cards. Everything is relative.
 

SteveO

Distinguished
Dec 21, 2001
75
0
18,630
Agraza said:
Trust me, the 7800/7900 series and X1900 series produce superior graphics to the Xbox 360's, and at higher resolutions. The X1900 XTX, or whatever it's called, is an insane piece of hardware that makes the game shine. Furthermore, there are mods that further improve the appearance of the game, as well as many more graphics customizations present in the .ini and the in-game menu. From what I've heard, the Xbox version only lets you adjust the brightness of the game.

I'm sure you're right, and I would like one for myself, but then I would have to get another motherboard and CPU. In about a year there will most likely be a way better card. I think I'll wait to plan what my next build will be.
 

sincraft

Distinguished
Apr 18, 2006
131
0
18,680
BLAH BLAH BLAH...
Useless information.

Who wrote that article? A 20-year-old with a 'flair' for writing who obviously misses the whole idea of an article?

No, seriously, not trying to sound overly critical. But uh, was this a review... a test... or a benchmark... or.......... a kiddies' l33t d00dz g0d m0dz discussion?

If it were a game review, it failed HORRIBLY at telling us a THING about the game other than showing some aspects of the graphics engine, BUT it certainly wasn't about the graf(x) engine either, because... it didn't even cover 10% of the graphics engine. It wasn't a discussion of how hardware-intensive the game is, because uh... it only mentioned grafx cards... barely.
Hmm, and it definitely wasn't a benchmark, because there are no neat charts or even spouted numbers to support anything.

All I want to see is CPUs and GPUs and other techie hardware garbage compared and contrasted by FPS - you know... a freaking benchmark test telling some of us:
1. whether what we have is capable of running the game
2. what we need to buy to run the game
3. how well the game will run if we buy new stuff to run it, or to run it better.

Seems pretty easy, yet somehow this article over-complicated NOTHING!

Think I'll go delete a favorite or two - more interesting than that article...er...whatever it was.

S
 

twile

Distinguished
Apr 28, 2006
177
0
18,680
SteveO said:
I'm sure they won't be equal. I spent much more on the PC. The Xbox 360 has undoubtedly better graphics for around 300 dollars. I don't have the money to buy every new PC technology that comes out. My system runs Neverwinter Nights at full blast with very little performance drop on my 7800 GS. I'm comparing that to my previous card, an ATI X1600, which really sucked. The Xbox 360 is the better buy for me: price/performance and great graphics, courtesy of Microsoft and ATI. I just want to compare the two Oblivions to see the actual differences in performance and graphics. There will never be an equal comparison, even with the same graphics cards. Everything is relative.

Do not forget that, just as the non-GPU parts of a PC add a fair bit of expense on top of a $300-400 graphics card, the HD-TV and theater surround sound system of an Xbox 360 setup add just as much on top of the $300-400 console. If you want to do price vs. performance comparisons, do it with either the true price or the gaming-only price. All non-GPU components of a computer have non-gaming uses and thus shouldn't be considered part of the price, just like all non-Xbox components of a home entertainment center have non-gaming uses and thus shouldn't be considered part of the price. The computer is an appliance, the home entertainment center is an appliance--the GPU and Xbox are just graphics add-ins, which conveniently both cost between $300 and $500 depending on the quality of gaming you want.

Don't turn this into a console vs PC war. Different approaches from different people with different needs.
 

twile

Distinguished
Apr 28, 2006
177
0
18,680
Hey, I'm not really a graphics expert; I just know what I see and try to reason my way through things I don't fully understand. All I know is that people are relying too heavily on various techniques to improve appearances without higher model geometry, and that doesn't really do it for me. I understand there are some cases where it's not feasible to do accurate wireframes of objects; I'd just like it if a more stringent process were used in deciding where to stop with the polygons and start with the (fill in the blank)-mapping and shader XYZ. Or maybe they could include a set of higher-polygon models to kick in when you get within a certain range of the object(s) in question. Sure, you won't notice that the subtle armor details are just flat tack-ons when you're 20 feet away from them, but if you're holding an item or someone is in your face attacking you with it, you probably will.

As far as performance goes, things can always be added as options to be enabled or disabled as the user desires. One thing I love about PCs is that if you get a game and graphics card in 2003 and purchase a new graphics card in 2006, you can expect several times the performance you had before. Higher resolutions, higher graphics settings, higher framerates, all at the same time. And hey, what with dual- and quad-GPU setups that are emerging, you might not even have to wait. You can increase your performance several times by adding new cards.

Or perhaps think of it this way. What would things be like if, years ago, game developers had said "Well, we don't really have to make 3D models, we can just stick with sprites for now. Performance with real 3D would be much lower, and nobody will ever notice the difference." Let's be glad at some point someone decided to go for quality over performance.
 

SteveO

Distinguished
Dec 21, 2001
75
0
18,630
Rob Wright asked the question: is it worth it? My opinion is no, and it's just that, an opinion. I explained my reasoning behind it. You can't plug a GPU (no CPU, motherboard, controllers) into a TV and start playing; you can with a console (GPU, CPU, motherboard, controllers). They come out with new graphics cards constantly. Don't get me wrong, I prefer PC graphics over console graphics any day. Oblivion is a great game; I play it, my wife plays it, and my kids play it. Am I going to go out and buy yet another graphics card to improve the quality of Oblivion? Is it worth it? To me, no; I have bought two graphics cards this year already and it's not June yet. What game will be next to demand an even more powerful GPU, and how soon, we don't know. If I had a huge budget for my gaming habit I'd have the baddest of bad graphics cards to run Oblivion on. Unfortunately I'm a blue-collar worker. Most people are. I say again, to Rob's question, is it worth it? No.
 

Agraza

Distinguished
Apr 27, 2006
12
0
18,510
Still wondering why the review involved two different generations of graphics cards. The first thing I thought of was some ATI favoritism.

This wouldn't have occurred to me before, but we had someone write about WoW on Tom's Hardware recently, which was also disgusting. Nobody thinks McDonald's is the finest dining in the world because it serves billions of burgers, yet unfortunately some people who have access to write articles at places I normally read do think WoW is the pinnacle of quality because of that very thing.

What's the deal with the 6800 being put up against the X1900? How can that be an honest mistake? Did you not have any 7800s or 7900s around? I recognize that ATI has a better card out right now, by quite a bit in fact, but despite my previous loyalty to nVidia I would not compare the last generation of ATI against a 7900 GTX.
 

Agraza

Distinguished
Apr 27, 2006
12
0
18,510
I'm specifically curious about the comparison shots, which I recall were always the 6800 GT versus the X1900 XTX.

From the article:
The Catalyst 6.3 and the Radeon X1900 XTX show nicer-looking water effects than does the GeForce-based competition. HDR effects are also good looking, but individual figures remain subject to errors in their shadows and lighting.

http://images.tomshardware.com/2006/04/26/graphics_cards_over_the_edge/oblivion-geforce-vs-radeon-1big.jpg

http://images.tomshardware.com/2006/04/26/graphics_cards_over_the_edge/oblivion-geforce-vs-radeon-2big.jpg

Except the 6800 GT isn't the competition for the X1900 XTX. So what is the point of these shots? I would be about as surprised by them as I would be to find out that a 7900 shows better video quality than a 3dfx Voodoo.
 

Agraza

Distinguished
Apr 27, 2006
12
0
18,510
Ah no. I am exactly on the mark. They're comparing ATI and nVidia, regardless of generational differences. I've read the parts of the article discussing different generations. This isn't one of those parts. This is a comparison of two different companies.

I quoted the sentence that makes this the point.

The Catalyst 6.3 and the Radeon X1900 XTX show nicer-looking water effects than does the GeForce-based competition.

The 6800 GT is NOT nVidia's response to the X1900 XTX. It is not the geforce-based competition. That sentence is a lie, and those comparison shots are worthless.