
"Far superior" image quality with 8800GTX?

Last response: in Graphics & Displays
November 9, 2006 9:55:01 PM

Could somebody explain to me how the ATI X1950XTX's image is any better or worse than the nVidia GeForce 8800GTX's image on this page? The article suggests the 8800GTX's image quality is "far superior" - I absolutely do not see that!

[EDIT: Keep in mind I'm talking about the Oblivion images specifically - the ones at the bottom of the page.]
November 10, 2006 3:19:46 AM

I saw a comparison on HardOCP between the ATI X1950XTX and the 8800s in F.E.A.R. with 16x AF, and I did see differences, but I can't find the article again, sorry.
November 10, 2006 3:57:04 AM

I found it but I don't know if I'm allowed to link to it - it's right on their front page at the moment. They have pictures comparing anisotropic filtering methods. The 8800GTX is a massive improvement here, and I could hardly stand to play a game with it looking like it does on the Radeon. Seriously, when I upgraded to an nVidia GeForce 6600GT from a GeForce4 Ti4200 I was taken aback by the poor quality texture filtering. Why should games look worse with a new card? I almost wanted to put my old card back in, but I'd already given it to my dad...

So yeah, I'm glad nVidia has found it in their hearts to get their crap (back) together with the filtering, as they've had their issues with it as well. I think the reason is because nVidia and ATI were in such competition that they thought they had to make painful optimizations in order to stay ahead, but with this new hardware it's just not necessary - it's fast, and they've got ATI beat for a while!

I shouldn't hold my breath on that one, though.
November 10, 2006 4:26:11 AM

One produces a brighter picture... thus the textures are more visible and it looks cleaner... you have to look at the hill in the background. I still wouldn't call it a "far superior" difference... but it is slightly noticeable.
November 10, 2006 4:30:11 AM

I don't see how that can't simply be indicative of differences in the (in-game) time of day when the screenshots were taken. That's what I thought it was when I saw it, especially because the vegetation/sky didn't look the same either.

I think it would be helpful if the screenshots were taken at a more stable time of day. Now, if you compare the tree at the very right of the pictures, you'll see that more light is shining on it for the 8800GTX. Why would this be? Shouldn't the same amount of light be shining on it regardless of which card is used? Or else how can game developers depend on things turning out as they intend? Admittedly having no professional experience in these areas, I still think this points to a different time of day. How could there be that much variation otherwise?

And if not, I'd like to know what makes it look different. HDR? It doesn't seem HDR would affect how lighting is applied to that tree, or would it? Is nVidia using some more "accurate" form of lighting..? Couldn't that interfere with some games, if so, as they depend on lighting working a certain way to be experienced as intended?

I just don't think we have enough information to determine if the 8800GTX's image quality is "far superior" or not from that example in the article. What's supposed to be superior? Because if it's the lighting, I just don't buy it.

Also, and this goes along with my post in the feature requests forum, I think the textures for the 8800GTX look "dirtier"! JPEG compression artifacts seem to plague them more, which has nothing to do with the card at all but hinders our ability to judge image quality.
November 10, 2006 5:42:40 AM

Actually, you can't really compare without seeing the cards render in real time. It's not just about the quality of a screenshot, but FPS and how many textures the card can push. Current games and software aren't really developed for DX10 yet. If you look at the card's specs, you'll see it has the capability to far outperform current DX9 cards.
November 10, 2006 5:43:59 AM

Yes, yes, I know. But that doesn't change the fact that "far superior" image quality is touted by THG without it being sufficiently proven, at least in that one example :p 
November 10, 2006 5:45:17 AM

Hell yeah, I see a difference - not a big enough one to justify buying a new GFX card, but I see a difference. On the 8800GTX the leaves on the trees look much sharper and smoother, as do the weeds - no more jaggies! :D 
November 10, 2006 5:50:25 AM

I see no jaggies in the X1950XTX image. Are we looking at the same pictures? In fact, the leaves look sharper and cleaner to me in the X1950XTX image due to the artifacts in the compression of the 8800GTX image.
November 10, 2006 5:53:35 AM

They say the human eye can't see over 90fps anyway - your eye just isn't that quick. If it were, old TVs would be rendered USELESS, as would film reel movies. When your optic nerves and synapses are fast enough to pick out individual pictures, you need more pictures - more FPS - to fill in the gaps so it appears as motion. Like a flip-book being flipped really slowly.

Anyway, my point is that current apps running on a 7900GTX aren't going to look much different than current apps running on an 8800, because the frame rate is already pretty high. The new cards will allow massive physics calculations and more realistic multi-object action. If you move a box across the screen on a 7900GTX or an 8800GTX, they're both going to have high frame rates and look the same. But if you add gravity and about 300,000 boxes in many, many colors, the 8800 will be a big, big winner - if the 7900 is even capable of rendering that.

From my understanding, the new cards are about 3x more powerful than today's cards. That's not to say you're going to notice a sudden 3x improvement. But combined with fast quad-core systems, plenty of RAM, and better hardware utilization... it's really all coming together, and I'm sure that within the next few years games like FEAR and Oblivion will look quite dated. Scary.
November 10, 2006 5:58:39 AM

Sure, but of course in a screenshot FPS really makes no difference, and that's what Image Quality is about (or at least part of it). Also, the 8800GTX is approximately 2x better-performing than the X1950XTX at its best, not 3x. And I could be wrong, but I think 300,000 boxes could be rendered with no problem whatsoever on cards generations old. But more to the point - can anyone prove that there is a significant quality difference between the two images produced by the cards? Because given those two images, I certainly can't.
November 10, 2006 6:11:30 AM

Actually, the human eye can pick out a single frame difference out of thousands played. Tests were done on fighter pilots, and they successfully identified a single frame shown in a blur of 2000fps or so.

FPS is a crazy subject. A lot of people claim the more the merrier. But in reality, when you have a really good monitor and run vsync, your frame rate is significantly capped. But does the gameplay seem slow? No.

How many FPS can the human eye see?
November 10, 2006 6:24:58 AM

Who cares about some SHARPER TREES and brighter buildings when you're driving at 320 km/h in a Mercedes McLaren SLR in Need for Speed Carbon? Or when you're dodging bullets in F.E.A.R., Quake, or Battlefield 2142? And who cares about a brighter field when you have an army of 100 units in Dawn of War: Dark Crusade?
Come on, guys... I see nothing in the image quality of the 8800GTX that makes it better than any X1900 series card.
And I'm so disappointed with the 8800GTX's performance compared to the X1950XTX - there isn't much FPS difference in F.E.A.R.
Only fools will buy 8800 series cards at these prices when you can buy an R600 card in a few months...
November 10, 2006 7:35:34 AM

So you say don't buy the 8800, but buy the R600? lmao O.o

I dunno how much of the difference you can see - maybe in a bigger picture you can. Well, I sure don't mind them making the image quality better, it's just a plus.
November 10, 2006 8:48:56 AM

Just buy what's in your budget; almost any newer card is likely to be an improvement. Most smart people never waste money on brand-new hardware tech, as it's not really being utilized by software yet and is overpriced. You're kind of stupid if you run out and buy the very first of any product anyway - without a test period to see how well it works for the masses, how can it really be recommended? Look at what happened to the original purchasers of the original Xbox... :(  I'm glad I waited, mine still works!
November 10, 2006 9:41:44 AM

The better quality shows up in the DX10 render demos. The DX9 rendering shows only a little improvement in AA and AF.
November 10, 2006 9:47:42 AM

You're calling enthusiasts stupid? lol, whatever - if I had a problem with my 8800, I would send it back, period.

In my country things cost more; if I had to buy an 8800GTX it would cost me $100 more than it would in the US, and our cars cost 4x as much for the same car.

If something is wrong I have two years to send it back - that's how it is, at least in my country.

Dunno, I don't give a damn about consoles =P so I don't know what happened.
November 10, 2006 10:04:17 AM

What I was calling stupid was paying premium prices for things... not all of us scream like a little schoolgirl and demand candy or Prada. :-) Some of us WORK for a living; others (like you) seem to take life for granted. I can only hope World War 3 comes sooner rather than later, and that people wise up and quit letting the few rich people in the world set all the rules. If you don't know what happened with early Xboxes, then you probably don't know about the recent exploding laptop batteries... there's just something to be said for not buying the latest and greatest technology. Same goes for medicines. You know there are vaccines that actually caused people to contract the virus they were supposed to protect against? They DIED! I rest my case. At least until some other idiot opens his mouth. (fingers)
November 10, 2006 10:16:53 AM

To tell you the truth, I can barely see a difference between the two images, and frankly, I prefer ATI's output a little due to the "warmer" colours.
November 10, 2006 10:28:39 AM

If people want the best stuff, they have to pay for it - nothing wrong with that. And if you don't want to pay for it, just don't, but don't tell people what to do with their money.

Well, good for you, and wow, looks like you already know - you're one smart guy.

But I want an education; that's why I go to school. And you're probably older than me, so I can't see why you're being such a smartass.

And yes, I do know about Sony, and as I said, I don't care about consoles, period.

So you think buying new hardware is the same as medicine? lol.
November 10, 2006 10:57:44 AM

This is at least the 2nd thread complaining about how "not great" the 8800 is. Maybe if you actually bought the card it would be easier for you to notice the differences, rather than going by a review and benchmarks. At least the AMD fanboys didn't create threads stating how "not great" the Core 2 Duo is compared to older Athlons. You're killing me with this - if you don't think the 8800 is worth having, then don't buy it! Sheesh...

+1 post for me!
November 10, 2006 11:07:38 AM

I totally agree, especially when people start picking on others because they have the money to buy the card.
November 10, 2006 11:40:40 AM

I would have to agree that there really doesn't seem to be much of an image quality difference here. I flashed between the two, and other than the already-mentioned lighting difference on the right-hand side and some other very minor differences (not differences in quality, either), I can't tell any difference at all. I surely couldn't justify buying a card for $800 just to get a few more fps, especially when the current DirectX 9 cards are doing fine with it.

I personally am going to wait until there are many game titles out using DirectX 10, and for Vista to get at least somewhat patched up, before I make that transition. That will be about 6 months, I'm thinking, so perhaps the improved version of this card (8850 maybe? Remember the 7900-7950 and the 1900-1950?) will be out by then, and it will be cheaper too! There's no advantage to this card right now in running a DirectX 9 game faster when the current cards are already running them fast as hell. My $.02.
November 10, 2006 11:45:16 AM

Quote:
Could somebody explain to me how the ATI X1950XTX's image is any better or worse than the nVidia GeForce 8800GTX's image on this page? The article suggests the 8800GTX's image quality is "far superior" - I absolutely do not see that!


I honestly cannot tell a difference. (This is why I don't trust reviews from Tom's Hardware; they make claims that seem quite unsubstantiated.) I've always been a fan of nVidia but a bigger fan of ATI. *shrugs* I've just always been more pleased after plugging in my ATI card rather than an nVidia one.

Just because I cannot tell a difference in a single still frame doesn't mean that "omgz! this card suxx0rz ATI will pwn it! lol!!11!!oneone". I just want to see what ATI has to offer before I start making judgement calls.
November 10, 2006 11:53:28 AM

Tom's hasn't had very reliable review information for a couple of years. Ever since Tom left as editor, it has slowly regressed in objectivity and writing quality.
November 10, 2006 11:59:58 AM

I would definitely agree with the guy above me about Tom leaving... I noticed the writing quality has gotten really shi**y, but I didn't know Tom leaving was the reason why!
November 10, 2006 12:18:21 PM

I think the wording "far superior" is a little much. I do, however, see a difference in the comparisons I've seen.

The comparisons I've seen are, for the most part, shadowy pics (not sure if they're the same as the ones you mention) and are fairly poor examples, because it's hard to see any detail, much less the difference. But there is a difference, and even though to me it's not "far superior", it is still better - which is great, imo, if not mind-blowing.
November 10, 2006 12:23:21 PM

Some games seem to have worse filtering than others... but it can't be seen in screenshots... maybe this is the same thing... in that case they'd need some super-high-res movies...

I'm just taking a guess... anyone ever notice dotty/crappy textures in the distance in some games?
November 10, 2006 12:25:19 PM

Honestly, the thing that always sticks out the most for me is the jagged lines along walls and such.

If I'm not mistaken, that's where antialiasing comes into play?
November 10, 2006 12:53:06 PM

You're not going to see a great deal of difference because the software (games) hasn't been programmed to take advantage of DX10 yet! So what you're seeing is still today's standards. It's like having a supercharger (the ATI X1950XTX and nVidia 8800GTX) and using mid-grade gas! Get it?
If you all would ease up on the caffeine and just wait until everything is in place to take advantage of these cards, then you would see a difference!
November 10, 2006 1:05:01 PM

Wow, I agree some folks need to ease up on the caffeine - a simple question on image quality leads to personal attacks on each other. Frankly, I don't see much difference in the images, but without the early adopters, companies would have little or no incentive to put out cutting-edge new products. So I, for one, applaud those who have the money to buy the fastest, most cutting-edge hardware. It allows the hardware companies to do the research to put out products the rest of us can afford.

Of course the new nVidia card should look better than ATI's older card; of course the new Core 2 Duo should be faster than AMD's older 64-bit processors. Neither makes me want to run out and buy them simply because it's the latest technology and a little better. But when they become mainstream and the prices drop, I'm glad the early adopters have already tested the new hardware for me.
November 10, 2006 1:10:39 PM

There is a significant difference if you put both pages up and look at them closely. The 8800 has better HDR and shading capabilities. Seriously, look at them and you'll see what I mean.

Dahak

AMD X2-4400+@2.4 S-939
EVGA NF4 SLI MB
2X7800GT IN SLI
2X1GIG DDR IN DC MODE
WD300GIG HD
EXTREME 19IN.MONITOR 1280X1024
ACE 520WATT PSU
COOLERMASTER MINI R120
November 10, 2006 1:41:47 PM

I couldn't really see a difference either...
November 10, 2006 1:54:09 PM

Quote:
There is a significant difference if you put bothe pages up and look at them closely.The 8800 has better hdr and shading capabilities.Seriously,look at them and you'll see what i mean.

Dahak

AMD X2-4400+@2.4 S-939
EVGA NF4 SLI MB
2X7800GT IN SLI
2X1GIG DDR IN DC MODE
WD300GIG HD
EXTREME 19IN.MONITOR 1280X1024
ACE 520WATT PSU
COOLERMASTER MINI R120


I did - I pulled them up side by side on the dual monitors I have here at work, and I honestly couldn't tell a difference... *shrugs* Regardless of whether or not I can tell a difference, I still wouldn't go out and buy a G80, because 1) it's brand new, 2) DX10 isn't released yet, 3) Vista isn't out yet, and 4) it's not efficient in terms of what I can see vs. what I pay and hope to see.
November 10, 2006 2:15:31 PM

Unless the monitors were 24" Eizos, I think you're all urinating up a rope.
November 10, 2006 2:49:41 PM

The 8800 GTX has sharper and crisper visuals. I agree that you need a high-quality monitor to take advantage of the superior image quality. Also, the 8800 GTX produces more vibrant differentiation in colors, making individual blades of grass far more noticeable, in my opinion. Either way, Oblivion would look beautiful on the PC. If you look at the average FPS, though, the 8800 GTX is nearly 100% faster at 2048x1536 and about 50% faster at the other resolutions (except for 1024x768, where it's only about 30% faster - but still, damn!)!

Oh, I was talking about outdoors, for the record.
November 10, 2006 2:56:09 PM

Quote:
This is at least the 2nd thread complaining about how "not great" the 8800 is.

NO IT'S NOT!

That wasn't my intention at all. God. Did you read my opening post? It's not my fault people started talking about FPS, etc. The point is that the difference between the two Oblivion images is negligible yet THG calls the 8800GTX's image "far superior".

NeonDeon: I don't consume caffeine.
Most of the rest: you're completely missing the point...
My point was that it looks like they loaded up the save with one of the cards and took a screenshot right away, whereas on the other card they got up to get a drink of - well, let's see COFFEE since we're so into accusing people of drinking it - and when they got back it was later in the day in-game. This would easily explain the lighting difference, and that's exactly what it looks like to me from these shots, which frankly makes the claims of 8800GTX image quality superiority (at least in the comparison of these images!!) seem BS. If it weren't for some of the images I've found on other sites, that's what I would think.

Now I'll acknowledge that it's just my opinion, and some of you actually think the 8800GTX looks crisper and cleaner - it probably does, and I have seen some shots which point this out, but I don't find this evident in these Oblivion images here at THG at all once you dismiss the lighting difference, which seems obviously unrelated to the capabilities of the cards. I could be wrong, but if you look at the angles of the light shining on the rocks and such, they're different between pictures - hence a different time of day, hence darker in the X1950XTX image.

Anyway, I've already said all this. :lol: 
November 10, 2006 3:16:31 PM

I GET IT, I get it. Point well taken. I put both images side by side on my monitor and found little if any difference. I have seen far greater differences when comparing rendering in other reviews, mostly going to ATI's GPUs. Looks like nVidia has finally caught up - at least till the R600.
November 10, 2006 3:43:53 PM

Read this on the human eye. Movie frame rates are different than computer monitor frame rates. They cannot be compared with each other.
November 10, 2006 4:00:05 PM

Aren't you essentially going to be limited by the refresh rate of the monitor? Assuming you're running at 60-85 Hz, anything above those numbers in FPS wouldn't display on the monitor; it would just be overwritten by the next frame. Maybe some people have fancy monitors with better refresh rates.
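The cap being described can be sketched in a few lines (a hypothetical illustration with made-up helper names, not tied to any real driver or monitor): the screen can show at most its refresh rate in distinct frames per second, and anything rendered beyond that is overwritten before it's ever scanned out.

```python
# Rough sketch of the refresh-rate cap described above (hypothetical helper
# names; not tied to any real driver or game engine). A monitor refreshing
# at `refresh_hz` can show at most that many distinct frames per second, so
# frames rendered beyond the cap never reach the screen.

def displayed_fps(render_fps, refresh_hz):
    """Frames per second that can actually appear on screen."""
    return min(render_fps, refresh_hz)

def wasted_fps(render_fps, refresh_hz):
    """Rendered frames per second that get overwritten, never displayed."""
    return max(0, render_fps - refresh_hz)

if __name__ == "__main__":
    for fps in (45, 60, 120, 300):
        print(f"rendered {fps:3d} fps -> displayed {displayed_fps(fps, 60)}, "
              f"wasted {wasted_fps(fps, 60)}")
```

So on a typical 60 Hz screen, a card pushing 300 fps still only shows 60 distinct frames per second; the extra speed only matters as headroom.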
November 10, 2006 4:17:55 PM

Past 30fps, you can't tell a significant difference. The optic nerve has a finite rate of interactions per second with the brain (we'll say it's a 1024-bit bus or something :D  ), and even though your eye can "see" things at very high fps because the "pixels" in your eye have no real refresh rate, the brain cannot receive and process them quickly enough to make a difference. The previous post about fighter pilots can be explained: the pilots were shown a grey screen, and then a very fast (~1500 fps) frame was flashed. The contrast was picked up by the eye and it said to the brain: "Whoa, wha?... Whoa. Dude.... That was sick... Dat's da $#!t, man," and the brain was like, "Far out..."
-cm
November 10, 2006 4:30:41 PM

I agree with the subject this thread was made for (and not the insanely horrific flaming spiral of crap it has subsequently become).

The words "far superior" are irresponsible... I would say "slightly sharper" at BEST.

When I first looked at the pics I thought "there's no difference at all"...

Even now, I have them BOTH open on my dual 22in LCDs (one on each monitor) and I see little difference, if any...

ANYWAY, you are silly if you buy virgin technology in its first run... SILLY SILLY SILLY... Wait till the drivers get better, DirectX 10 gets crisper, and the cards get a gig of memory... 768MB is just stupid - soon they'll ALL have 1 gig... HECK, AT THE VERY LEAST, WAIT TILL ATI RELEASES THEIR CARD, JUST FOR COMPETITION'S SAKE AND A PRICE DROP!!!!!
November 11, 2006 2:30:31 AM

Ok, so the Oblivion pics in the article obviously show no real difference. It seems we all agree on that. MOST of us prefer to wait and see how hardware works before running out and spending half a grand. Those of you still in high school with daddies in the Middle East who own a good share of oil... well, you're welcome to buy all the American products you want. Canadian or otherwise, it's all the same. Gogogo. It's pretty much agreed that software has yet to truly utilize 4 gigs of RAM, multi-threaded CPUs, and DX10 cards. I mean, come on... nobody is even utilizing 64-bit yet. Shh! Nobody!

As for what DX10 will do, it won't "revolutionize" gaming as we know it, but it's bound to change the level of detail considerably as games are written for it. So much is happening right now with DX10, 64-bit, multi-core CPUs, hard drives going flash, Blu-ray and HD-DVD, etc., that we all can't wait to see what games will be like a mere 2 years down the road. Hell, we don't even know what the PS3 is really capable of yet. High-def TVs are just now really beginning to find their way into middle-class homes (the majority here in America). Point of purchase is always a tough call - a great deal one day is old tech the next. But new old tech is always better than old old tech. Catch me? In light of DX10, I'm still addicted to playing StarCraft online. For those of you who don't know, it came out shortly after WarCraft 2, and it's the best damn game EVER... 'cept for Halo, which also has cheaters online. Bleh. Getting rid of cheaters in online games > DX10. StarCraft came out in mid-to-late 1997. It's 9 years old and I STILL love it. I should really play Oblivion, but I just can't put StarCraft down... gotta... go... play now. :-D
November 11, 2006 3:52:57 AM

Listen... if it doesn't take my p40n and make my girlfriend jealous... then it just isn't worth the money for the 8800mixerplix.
November 11, 2006 3:57:37 AM

If your GF is hot, I can make YOU jealous...
:twisted: Muhaha.
November 11, 2006 4:51:58 AM

What makes you think the R600 will be better than the 8800?
November 11, 2006 5:48:58 AM

I see a difference between the two; better yet, the 8800 is DX10-ready, so it's a far superior card to the XTX.
November 11, 2006 6:38:49 AM

Spec-wise, it's like twice as powerful... boy, people sure love to speculate... why not just wait and see? LOL. No wonder we have so much hype.
November 11, 2006 8:12:49 AM

To me, the quality difference between the X1950 and the 8800GTX is about the same as the difference between the 7900GTX and the X1950.

The X1950 is better than the 7900, and the 8800 is better than the X1950, but there isn't a lot in it, and it's barely noticeable to normal people.
November 11, 2006 8:49:36 AM

I still say that buying an 8800 series card is not a good idea. Why?
Because something like an 8900GTS (faster than the 8800GTX and cheaper) will come out in a few months (like the 7900GT did), and you will cry because you spent $1200 on an 8800GTX SLI setup.
Anyone of you with a 79xx series or X19xx series card can play all the new games well, so why spend a lot of cash on the FIRST DirectX 10 card?
At least wait until ATI releases their new DX10 cards...
Whatever, it's your money! :roll: