
*Reward for TRUTH* Does nVidia skip frames to get higher fps?

In Graphics & Displays
November 24, 2007 4:57:00 AM

I hear from time to time that the reason ATi "seems" slower is that nVidia skips frames to get higher fps. If this is true, is it also the reason ATi has better image quality? Or are they the same? I'm looking for links to PRO-level sites that say nVidia does in fact do this. Also for links to sites/reviews that say ATi has better image quality... something I have heard from the first day I got into gaming. I ditched the Voodoo!

The reward is a large format full color photo on high gloss premium photo paper (42"x42"), of your... whatever.
The details can be worked out later, in private. Right now I need to see the proof before I pay up. The reason I'm rewarding for truth is that people may put time into doing this. Time I do not have. I want to see proof with my own eyes, that: 1) - ATi has better image quality than nVidia. OR not!
2) - Actual proof that nV skips frames to achieve higher fps. OR something to the effect that it's a rumor.
*** Not just forum babble. ***
Either one of these will earn you the reward.
November 24, 2007 5:23:07 AM

OH... No matter what the results turn up, I want the truth.
November 24, 2007 5:42:26 AM

ATI: [screenshot]
Nvidia: [screenshot]

ATI definitely renders more green!
November 24, 2007 6:01:49 AM

So... are we looking at Far Cry on top, and Crysis on bottom?
Pretty neat color comparison either way.
November 24, 2007 6:33:22 AM

Yes, and as you can see in the lower right corner, nVidia clearly skipped more than two frames! And look how blue the water looks with ATI... You can't even see any water with nVidia... once again they are cheating!
November 24, 2007 11:19:31 AM

Actually the Nvidia drivers don't skip frames because that would not help the overall score. They actually replace every 30th frame of lush tropical island with easy-to-render plain text. The benefits are not only higher frame rates but added ad revenue from selling the 30th frames as subliminal advertising. In pre-concluded studies like yours, these ads have been shown to be 223% more effective than 42"x42" glossy adverts.

More seriously if you are referring to the recent "driver optimizations" with the Crysis demo, they were found and reported here: http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=487&Itemid=29&limit=1&limitstart=0

One site that includes a lot of IQ comparisons in their reviews is http://www.hardocp.com/. Generally the lead that ATI had in IQ has been leveled for about a year now, or since the release of the 8800s.

I can't recall where offhand, but there was some site someone here linked to claiming inferior rendering of Gears of War with Nvidia. I glanced over it, but it didn't seem too convincing. After blowing up one small section of some frames, it still wasn't obvious what I was supposed to be seeing.
November 24, 2007 6:04:31 PM

To be totally honest with you, it's all relative. I could show you screenshots all day that "prove" one superior to the other or vice versa depending on the site I took the screenshots from; some seem to have a bias one way or another. Have you ever walked around an electrical store and wondered why no two pictures on the TVs looked the same? You have numerous gamma/contrast etc. settings, and to find two exactly the same would be amazing, so for that reason I don't think you can rely on screenshots to make this kind of decision. It would be like trying to show someone a 22 inch HD screenshot on a 15 inch LCD and expecting it to be the same; it's just not gonna work.
Personally I truly believe that ATI cards have a better picture quality than the nVidia, but that is all I can say. I can't prove it to you, so I guess I won't get the 42x42, but that's not why I posted anyway.
As for nVidia "cheating" with the drivers, well, let's just say that they have had a couple of bad optimisations ("floptimisations" as some call them) pointed out to them lately that had the fortunate side effect of slightly increasing frame rates.
Good luck
Mactronix
November 24, 2007 7:17:39 PM

Pretty old discussion. There have been many articles at Tom's about this (when Tom actually worked here). Nvidia was caught "cheating" many times: trying to use a single light source on a multi-light-source scene, lower-quality images to get better fps, etc... the list goes on and on. No way I can find all of those old articles nowadays. But here's a few for ya.

http://www.tomshardware.com/2004/11/29/the_tft_connecti...
http://www.dailytech.com/AMD+Alleges+NVIDIA+Cheats+in+H...
November 24, 2007 9:47:39 PM

sooo did I win? I'm the only one who offered proof to your own eyes!
November 25, 2007 3:29:57 AM

I did not see any indication of frames being skipped by looking at an old screenshot of Far Cry and a new one of Crysis. If I could see the same screenshot of either, one from ATi and one from nVidia, same frame, same settings on the monitor and in the control panels (default usually works just fine), then maybe.
Also, it is not actually the kind of proof that I'm looking for. I think I mentioned "no forum babble". Please show me links to where a paid PRO says one thing or another regarding my inquiries. You are on the right track here; I could be persuaded by image comparisons. Again... not by Joe computer enthusiast, but professionals on sites like Tom's. I don't have the time, obviously, or the resources. I figured someone on this forum would know exactly where to point me.
I must admit I am not going to just pick ONE lucky winner. "LOL" It's not so much a contest as a method of finding out the facts. I can print pics all day and night. I just figured that if I threw some incentive out there, I'd get results from the many informed members of this forum. I am not posting on any other forum. If I find that computertech82 and skittle have given me links to facts regarding this topic, they can both have a reward for helping me. Plus it's fun to give... instead of argue and flame etc...
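(Aside: the same-frame, same-settings comparison being asked for here can be scripted rather than eyeballed. Below is a minimal pure-Python sketch; the two tiny 2x2 "captures" are made-up placeholders, and a real test would load actual screenshots with an imaging library.)

```python
# Per-pixel comparison of two captures of the SAME frame at the SAME settings,
# e.g. one per card. Frames are flat lists of (R, G, B) tuples.
# The 2x2 frames below are made-up placeholders for real screenshots.

def frame_diff(frame_a, frame_b):
    """Return (mean absolute per-channel error, number of differing pixels)."""
    assert len(frame_a) == len(frame_b), "captures must be the same size"
    diff_pixels = 0
    total_error = 0
    for pa, pb in zip(frame_a, frame_b):
        err = sum(abs(ca - cb) for ca, cb in zip(pa, pb))
        if err:
            diff_pixels += 1
        total_error += err
    mean_error = total_error / (len(frame_a) * 3)  # 3 channels per pixel
    return mean_error, diff_pixels

card_a = [(10, 20, 30), (10, 20, 30), (0, 0, 0), (255, 255, 255)]
card_b = [(10, 20, 30), (12, 20, 30), (0, 0, 0), (255, 255, 255)]

mean_err, changed = frame_diff(card_a, card_b)
print(mean_err, changed)  # one pixel differs by 2 in one channel
```

A zero result means the captures are pixel-identical; anything else at least gives a number to argue about instead of "it looks greener".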
November 25, 2007 5:44:03 AM

For all the issues about optimizations/driver cheats/or whatever issue you have: does it really matter? I mean, did Crysis look ugly after the optimized driver from nV? It looked different, but was it ugly?
November 25, 2007 6:27:43 AM

I saw some comparisons and yea... ATI did look a little better.
November 25, 2007 3:20:21 PM

I have actually seen the iax-tech review. It was among the first I read about the cards. They are not professional. They can't even edit their text/article before publishing it. I felt like I was reading something thrown together by 8 year old kids w/ no real knowledge about the industry. I saved the link and copied the text where they say nV. looked worse in "one" part of "one" test. It is nothing to me though, as they really do seem like they do not know what they are doing. In fact, compared to all other review sites, their numbers are too far off. I found 2 German sites that reviewed the cards w/o bias. They show the tests at 8xAA and 16xAF at all playable resolutions. The 3870 was stomping the GT by leaps and bounds, EVEN the GTX and Ultra in most tests/games. Only at the 8x/16x high-res settings though. That's what made me wonder about the I.Q. of the 2 latest vid. card offerings. I hear rumors only. The Inquirer's writer for that article is only spreading hearsay about HD playback. He didn't do any testing of his own and then post the results. He just took a very small part of the whole and expanded on it. I must give points for the link though. Very informative (although admittedly simplistic). Thank you.
November 25, 2007 3:57:31 PM

nevasumma said:
I found 2 German sites that reviewed the cards w/o bias. They show the tests at 8xAA and 16xAF at all playable resolutions. The 3870 was stomping the GT by leaps and bounds, EVEN the GTX and Ultra in most tests/games. Only at the 8x/16x high-res settings though. That's what made me wonder about the I.Q. of the 2 latest vid. card offerings.


That deserves a link.
November 25, 2007 4:26:11 PM

Well, these should keep you busy for a while. Little in the way of pictures, but great content I feel, and as there is an article for each camp you can judge any bias while reading them. I think these guys are pretty sound though.
Please bear in mind when they were done, though, as obviously there have been a few driver changes since the Nvidia one.

http://www.beyond3d.com/content/reviews/47/1
http://www.beyond3d.com/content/reviews/3/1

Mactronix
November 25, 2007 8:36:56 PM

I have been trying to recover the links. I WILL post them once found... I promise. Gotta go for now.
November 26, 2007 12:20:45 AM

This whole topic is a bit ridiculous. It has been proven in the past that Nvidia does attempt to find shortcuts to provide better performance.

Sometimes it pans out and you don't see a difference. Sometimes it doesn't and degrades the visual quality. All video card vendors have been doing this since the Voodoo days. The only thing that has changed is that video card vendors don't attempt to differentiate their cards by developing their own version of the manufacturer's drivers anymore.

Furthermore, being that driver updates are constant and erratic in terms of what their goals might be, trying to figure out who is better is quite subjective.

The only real conclusion that can be made is that to find a direct correlation with said statements, each and every driver update must be tested thoroughly. The OP needs to realize that the "pros" and "geeks" aren't always the best at journalistic disciplines. Some of the better opinions happen to be carried by forum members who are not paid to give their opinion. Otherwise, only your opinion matters. Don't make these guys try to persuade you that they're correct.

Btw, many of the more technically inclined members have left this site after what it's turned in to.
November 26, 2007 2:17:56 PM

Not trying to figure out who is better. I know that answer already. I'm looking for what the first post says I'm looking for. I didn't think it would be so hard for people to prove their allegations about nV. fps/I.Q. differences. Not to mention "cheating" on drivers, which is news to me. Basically I want to know if it is true that nV. does "anything" to "cheat" their way to the higher fps (at this point), AND of course a professional assessment of I.Q. over the whole range of the cards' abilities, not just HD playback (which we all know is a new feature that only ATi offers... right now). I do not think this topic is ridiculous, as many have expressed a new interest in finding out why soooooo many have claimed I.Q. to fall in favor of ATi! It matters plenty when you're me. I have a graphic design studio/custom home design and drafting office, so IMAGE QUALITY MEANS EVERYTHING. Even the slight differences are noticeable on a large format photo/rendering. OR in games, which I admit I am addicted to. I feel like I am the only one who wants what he pays for here. I am not interested in fps... personally! Since I can run everything at max quality settings with very playable fps, I don't need to see 120fps when 60 is just sweet. I DO want the best though, and am in the market for something new. Not anytime soon, but eventually.
- Thank you again Mactronix for the links. It helped some. I have read them some time ago, but did not recall the technical differences between the 2 cards. It does point in the direction of nVidia for higher I.Q. in games though, which is actually the 1st time I've ever heard such a thing. (baffling)
November 26, 2007 3:18:38 PM

I don't know if anyone has linked these yet, but this is a nice image quality comparison of Need for Speed: Carbon and STALKER on various Radeons and the Geforce 8800GTX. Click on the two pictures at this page and watch the differences as they flick through the graphics card types:

http://www.pcgameshardware.de/?article_id=621293&page=1...

November 26, 2007 4:04:02 PM

nevasumma said:
I have a graphic design studio/custom home design and drafting office, so IMAGE QUALITY MEANS EVERYTHING. Even the slight differences are noticeable on a large format photo/rendering.


Then why are you looking at CONSUMER Cards? Why are you not looking at PROFESSIONAL graphics cards?
November 26, 2007 4:31:48 PM

I thought FireGL and Quadro cards are intended for 3D modelling? If that's what you mean. Or other than that there are Matrox cards, but I think they get super duper expensive (thousands of #insert local currency here#).
November 26, 2007 5:54:39 PM

And AutoCAD and all professional applications.

And of course they're expensive; they're professional cards.
November 26, 2007 6:16:55 PM

skittle you're such a n00b
November 26, 2007 6:21:35 PM

indeed!
November 27, 2007 6:58:38 AM

nevasumma said:
Not trying to figure out who is better. I know that answer already.


Obviously not based on some of your own statements.

Quote:
Not to mention "cheating" on drivers, which is news to me.


Why not mention it? That's part of people's reasoning when talking about IQ, because of the sacrifice of IQ for speed, especially speed in benchmarks.

Quote:
Basically I want to know if it is true that nV. does "anything" to "cheat" their way to the higher fps (at this point), AND of course a professional assessment of I.Q. over the whole range of the cards' abilities,


What kind of test would satisfy you? The driver issue is pretty well covered, as NO1sfanboy showed; just follow the links. As for his mention of the Gears of War issue, that's here:
http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD%27s_salvation%3F/5392-14.html

and likely the only one that still holds up to scrutiny, but note it's only a very VERY minor difference. Their initial comparison involved different scenes for their Crysis comparison, but now they've finally figured out that to test something you need the same scenes at the same settings. This update now compares the same frame/settings.

Quote:
not just HD playback(which we all know is a new feature that only ATi offers... right now).


They both offer HD playback and have for a long time. And your comment of 'right now' implies that something will change that doesn't involve the hardware differences. So far those types of changes haven't happened, other than the floptimization for the HQV benchmark.

Quote:
I have a graphic design studio/custom home design and drafting office, so IMAGE QUALITY MEANS EVERYTHING.


But has nothing to do with this.
What app are you using that this has come into question? Do you even know which apps used to have the biggest difference? Do you know why the difference is greater between last generation and this generation than between the current ATi and nVidia cards? For the reason you say it matters, you're sure asking the wrong questions and focusing on the wrong areas and apps and products.

Quote:
- Thank you again Mactronix for the links. It helped some. I have read them some time ago, but did not recall the technical differences between the 2 cards. It does point in the direction of nVidia for higher I.Q. in games though, which is actually the 1st time I've ever heard such a thing.


Well, unfortunately those links are out of date, just as much as if I linked to an X1K vs GF7 or R9700 vs FX5800 or GF4 vs R8500 comparison. Things are pretty much even right now, and the minor differences are just that: rather minor. Even the early issues with the G80s improved, and the early issues with the R600 (some similar to the early G80 issues) have improved, including the AA situation with the addition of more custom filters for the HD2900's AA. The G80 still has the theoretical AF advantage, but its expression in real terms is even less than the issues posted above.

Right now, the problems aren't about standard IQ so much as concerns about floptimizations and such things that essentially hurt the good standard IQ of both companies in order to get a 1-2 fps advantage vs the competition. The issue wouldn't be so bad if they were known OPTIONS instead of hidden defaults or floptimizations which the user has no control over. As an example, having the ability to run in DX9 mode instead of DX10 mode is great to allow you better performance at higher resolution and such; however, if the user calls for DX10 mode but is given DX9 mode with a few tweaks, simply because the IHV thinks we don't need the added IQ/features versus additional frames, then that's a problem. We should have ultimate control over the quality. And just like the initial Catalyst AI issue, it's better to have the options for heavily optimized, light/balanced optimizations, and off, so that you can get the best of all features.

Overall they're pretty even with their IQ, with just a few subtle differences, and unless you know where to look and why, don't worry about it and just enjoy the game. As for the 2D professional quality, you might want to check the requirements of the apps you use and which cards fit them best.
November 27, 2007 4:59:48 PM

No1sFanboy said:
Actually the Nvidia drivers don't skip frames because that would not help the overall score. They actually replace every 30th frame of lush tropical island with easy-to-render plain text. The benefits are not only higher frame rates but added ad revenue from selling the 30th frames as subliminal advertising. In pre-concluded studies like yours, these ads have been shown to be 223% more effective than 42"x42" glossy adverts.



I had a good laugh at that one. Thanks bro.
November 28, 2007 4:10:51 AM

Wow. I guess I rubbed some people the wrong way. Did none of you consider that I might already know about $1,200.00 video cards for 3D modeling? Actually... FYI, they are not required for the apps that I use. The statement about image quality being everything is tied directly to my wanting the best image quality out of every app I run. Including video games. Who uses a FireGL to run any games? If they were to be considered for gaming, I would imagine we'd have some new benchmarks to post... 'eh? You guys are all being rather mean, don't you think? I caught the joke too - SpinachEater. Not adverts. Motivation. Interest. It worked.

*TheGreatGrapeApe... you have no idea how pretentious you are. I can't believe that someone who I have looked to for advice for sooo long has just turned on me. I use ArchiCAD. It has a constant 3D mode that I switch to from time to time, and it can switch from OpenGL to any of 8 different rendering engines. All of these are affected by AA and AF and use multiple light sources w/ real-time lighting/shading. It has a video game mode where you can walk/fly through any part of anything you model. It is very versatile in this and many other aspects. I have noticed that from nV. to ATi, the differences are in the color depth and richness, AA perf. and overall speed. ATi has proven to be the best in every way (in this department). AGAIN... I have a goal of proving, one way or another, the 2 queries I mention in the 1st post. It has nothing to do with what I do for $. I just want to know. Others want to know. I think the masses have the right to know if all these allegations are really even worth considering when buying a video card. I guess grape ape has solved the problem though.
November 28, 2007 4:23:11 AM

***NO MORE REWARD***
My findings: 1) - ATi wins in the I.Q. department.
2) - nVidia would never cheat to get higher FPS. They are just plain faster.

BTW... the "UVD" hardware accelerated HD playback can only be achieved through ATi "right now". Or am I reading Tom's and every other review upside down? What's gonna change? nV. will (soon I hope) have a version of this. That's what.

I really think I am done here.
November 28, 2007 5:17:53 AM

nevasumma said:

*TheGreatGrapeApe... you have no idea how pretentious you are. I can't believe that someone who I have looked to for advice for sooo long has just turned on me.


Perhaps so; I just have no patience for the umpteenth 'IQ' discussion in the past few weeks. I answer the questions in the manner I see fit at the time of my reading, unfiltered. Sometimes I think these threads are more troll-ish than real fact finding, and I respond as such. So be it.

Quote:
I use ArchiCAD. It has a constant 3D mode that I switch to from time to time, and it can switch from OpenGL to any of 8 different rendering engines. All of these are affected by AA and AF and use multiple light sources w/ real-time lighting/shading.


Well, I've never used ArchiCAD and there's not much literature I could find, so I'm not really sure about its capabilities (if it were the summer I might be tempted to try out the academic version), so the subtle differences may favour ATi or nV. From experience, though, those differences would be most accentuated in the workstation drivers, which open up additional levels of AA and other such features in apps that support them. It would be worth checking with Graphisoft for more information if it's a concern, because unfortunately their compatibility list ends with the GF7/X1K generation of FireGL and Quadro cards. IMO that's going to be your biggest area of differentiation, not games.

Quote:
I have a goal of proving one way or another, the 2 queries I mention in the 1st post. It has nothing to do with what I do for $. I just want to know. Others want to know. I think the masses have the right to know if all these allegations are really even worth considering when buying a video card.


Nowadays there's not much difference on the gaming end; in fact I'd say that the biggest differences still remain in the workstation area, where features and optimizations are very vendor and product specific. There used to be the colour-depth difference, but nV improved their position with the G80 series (which also improved their AF quality and AA efficiency over their previous generation). nV used to have the sub-pixel precision advantage, which was equaled with the addition of the R600, which also has 12-bit precision.

So I don't think you'll ever get a straight answer, especially since most people who should look into this are also fed up with the discussion due to nVidia, AMD and Intel's lawyers, and their respective fanboi bases. Look at what EliteBastards' comments stirred up, and then look at the reaction of the usual suspects, and you get the distinct impression no one cares enough to really do a good job of looking anymore. In fact, even though [H] does image tests in their reviews, they don't take their own information to heart and still run apples-to-oranges tests: where they already commented on the IQ/AA quality of X being better than Y under condition A but equal under condition B, for their testing later on they use condition B, because they don't think the end user cares as much about it as they do fps.

Unless there are glowing issues, like the AF shimmering or the AA blurring, both of which are a thing of the past, you won't get enough of a difference for anyone to choose a card based on that alone. I mean, would you choose an HD3850 over a GF8800GT/Ultra because of the minor issues shown by VR-Zone?
November 28, 2007 5:31:03 AM

nevasumma said:
***NO MORE REWARD***

YOU JERK I POSTED SCREEN SHOTS PROVING WHAT YOU WANTED TO SEE!!! I DEMAND YOU PAY UP.
November 28, 2007 6:08:05 AM

nevasumma said:
***NO MORE REWARD***


OK, not that I think anyone cared about it, or that it seemed easy for anyone to collect, but offering it and then taking it away is questionable at best.

Quote:
My findings: 1) - ATi wins in the I.Q. department.
2) - nVidia would never cheat to get higher FPS. They are just plain faster.


I think both are pretty questionable findings; ATi has some wins and some losses, nothing major enough to declare a 'win' IMNSHO, and saying nV would never cheat seems a little rose-coloured-glasses, as I think both would cheat if it was worth enough to them. However, they would of course do it in a way their lawyers could call 'not cheating', hence why I use my own term, floptimized.

Quote:
BTW... the "UVD" hardware accelerated HD playback can only be achieved through ATi "right now" Or am I reading Tom's and every other review upside down? What's gonna change? nV. will (soon I hope) have a version of this.


UVD helps, but it's not the only thing required for HD playback, which has technically been available since at least the R300/NV40 days. There are only minor differences between ATi's UVD and nV's PureVideo engine, and that's mainly dedicated hardware bitstream decoding for VC-1; however, this can still be done by the host/system, and thus in relation to your original question/statements it would not be an IQ difference but a performance difference. The issue with the HQV floptimization was something done to boost a score at the expense of quality, but the fundamental differences between the hardware won't change until next generation. That's not the same as saying that nV doesn't offer HD playback; they do, just differently, and with VC-1 titles maybe a little less efficiently, but with a good CPU even an old R9600 or S3 graphics card can play back a lot of HD content without much trouble. The main benefit is for the high-bitrate H.264 stuff that chokes even the fastest CPUs, but it still wouldn't be an IQ issue, just an FPS-style issue, where a stuttery image would be like playing a game at 11 fps.
November 28, 2007 7:41:55 AM

skittle said:
nevasumma said:
***NO MORE REWARD***

YOU JERK I POSTED SCREEN SHOTS PROVING WHAT YOU WANTED TO SEE!!! I DEMAND YOU PAY UP.


You're having a laugh, right? A capture of Far Cry and a capture of Crysis is not proof. If you're not joking, then you have a hot steaming turd for a brain.
November 28, 2007 7:43:02 AM

nevasumma said:
Wow. I guess I rubbed some people the wrong way. Did none of you consider that I might already know about $1,200.00 video cards for 3D modeling? Actually... FYI, they are not required for the apps that I use. The statement about image quality being everything is tied directly to my wanting the best image quality out of every app I run. Including video games. Who uses a FireGL to run any games? If they were to be considered for gaming, I would imagine we'd have some new benchmarks to post... 'eh? You guys are all being rather mean, don't you think? I caught the joke too - SpinachEater. Not adverts. Motivation. Interest. It worked.

*TheGreatGrapeApe... you have no idea how pretentious you are. I can't believe that someone who I have looked to for advice for sooo long has just turned on me. I use ArchiCAD. It has a constant 3D mode that I switch to from time to time, and it can switch from OpenGL to any of 8 different rendering engines. All of these are affected by AA and AF and use multiple light sources w/ real-time lighting/shading. It has a video game mode where you can walk/fly through any part of anything you model. It is very versatile in this and many other aspects. I have noticed that from nV. to ATi, the differences are in the color depth and richness, AA perf. and overall speed. ATi has proven to be the best in every way (in this department). AGAIN... I have a goal of proving, one way or another, the 2 queries I mention in the 1st post. It has nothing to do with what I do for $. I just want to know. Others want to know. I think the masses have the right to know if all these allegations are really even worth considering when buying a video card. I guess grape ape has solved the problem though.


You're right, it was a bit pretentious. His handbag was swinging wildly there :lol:
November 28, 2007 10:29:53 AM

Thanks for liking the humour, SpinachEater (Popeye?). From the get-go this thread deserved nothing more, which skittle accurately determined also. The OP started with a conclusion and then asked that the members of this forum do his googling to prove it. The real cheese on this was the offer of compensation, which I feel is at odds with the spirit of a community forum. In the end, though, the OP has drawn his own conclusion and withdrawn his offer, which is probably worth less than the shipping he would have asked the recipient to front anyhow.


I'm still waiting for his vunderlinks where the 3870 beats the GTX and Ultra by leaps and bounds at 8xAA/16xAF.

Ape may be abrasive, but on matters of fact he is usually dead-on, compared to the OP, who would rather use this forum to perpetuate his own unsupported opinion.

Sorry, Nevasumma, but this thread, whether you realize it or not, was nothing but troll bait.

November 28, 2007 1:35:45 PM

I think we all just need to watch an episode of Old Gregg. His downstairs mixup, drinking Baileys out of a shoe, and going to bars where men pee on each other makes everything come into focus.
November 28, 2007 3:08:28 PM

The answer is YES, it does skip some frames, which makes it unreasonably fast. One reason why it is not as flawless or as good in picture quality as ATI.

nVidia has skipped some frames since the GeForce 4, as far as I have known. You will see an example in Crysis where the 8800GT gets 30FPS at high resolution while ATI gets 20FPS, but the consequence is it is not as flawless as the HD3870. But when they use the patch for Crysis released by nVidia, the FPS drops to 20-24.
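(For what it's worth, a claim like the one above is testable rather than arguable. If you can dump every presented frame, runs of identical consecutive frames are one observable symptom of a frame counter being inflated without fresh rendering work. A rough sketch, with made-up byte strings standing in for real captured frame buffers:)

```python
import hashlib

def count_duplicate_frames(frames):
    """Count frames identical to the frame immediately before them.
    frames: iterable of raw frame buffers (bytes)."""
    dupes = 0
    prev = None
    for buf in frames:
        h = hashlib.sha256(buf).hexdigest()  # hash instead of keeping whole buffers
        if h == prev:
            dupes += 1
        prev = h
    return dupes

# Made-up stand-ins for captured frame buffers; a real run would feed in
# raw dumps from a capture tool, one buffer per presented frame.
frames = [b"frame-1", b"frame-2", b"frame-2", b"frame-3"]
print(count_duplicate_frames(frames))  # 1
```

A high duplicate count at a high reported fps would at least be evidence worth posting, unlike a Far Cry screenshot next to a Crysis one.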

Chipset Manufacturer: ATI
GPU: Radeon HD 3870
Core clock: 775MHz
Stream Processors: 320 Unified Stream Processors
Memory Clock: 2400MHz (effective)
Memory Size: 512MB
Memory Interface: 256-bit
Memory Type: GDDR4
---------------------------------------------
Chipset Manufacturer: NVIDIA
GPU: GeForce 8800GT
Core clock: 600MHz
Stream Processors: 112
Memory Clock: 1800MHz (effective)
Memory Size: 512MB
Memory Interface: 256-bit
Memory Type: GDDR3
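(One thing specs like these do pin down objectively is peak memory bandwidth, since the quoted memory clocks are effective rates: bandwidth = effective clock x bus width / 8. A quick check of the two cards' numbers:)

```python
def bandwidth_gb_s(effective_mclk_mhz, bus_bits):
    """Peak memory bandwidth in GB/s from effective memory clock (MHz)
    and memory interface width (bits)."""
    return effective_mclk_mhz * 1e6 * (bus_bits / 8) / 1e9

print(bandwidth_gb_s(2400, 256))  # HD 3870, GDDR4: 76.8 GB/s
print(bandwidth_gb_s(1800, 256))  # 8800 GT, GDDR3: 57.6 GB/s
```

So on paper the 3870 has the bandwidth edge; it says nothing about image quality either way.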

Take a look at PowerColor's HD3850 Xtreme 512MB
http://itembargain.com/forumn/index.php?topic=38.msg86#...
November 28, 2007 3:56:28 PM

Why all of the hand-wringing? Buy both, compare, make your own decision. Why do you care what someone else thinks about image quality? It's your eyes that make the final judgment on whose image quality is better. Then sell what you don't want to keep.
November 28, 2007 4:11:53 PM

I am sorry for starting this thread.
I only withdrew the "prize" because I felt that it was not going to be proven, professionally, that either allegation is true OR NOT! I will pay for shipping; it would be wrong to make you pay for it. Shoot, what's a few cents. GrapeApe has given me more in the way of what I was asking for. I had already read those reviews though. I still do not see actual answers to my questions, with proof supporting them. The fact is... there is no provable answer one way or the other. I was not looking to start a troll-baiting session. I am done. If anyone thinks they deserve a photo, they can have one. NOT skittle. Seriously man... ALSO, I was being sarcastic with my "findings". Don't take "my personal opinion" as technical proof of anything. The findings are my own, not based on what anyone has provided. A joke really.

I did my own "googling" and found nothing. That's why I asked for people to prove their "allegations". I thought it would be fun to offer incentive. I was wrong. I did not mean to "go against the spirit of the forum". In my opinion, grapeape and mactronix are able to claim one. No one has addressed the queries w/ any seriousness. So I feel stupid. AND very hollow at the moment. It truly does not matter who is better. GrapeApe... I'm sorry. Everyone who is reading this right now... I'M SORRY. I'm done. Don't bother to try and find anything for me.
November 28, 2007 4:16:49 PM

nevasumma said:
OH... No matter what the results turn up, I want the truth.



The truth..... nvidia cannot give the pure color and quality with their cards..... never did and never will. Lousy video quality to achieve micro framerate scores. Face it..... nvidia graphics quality is way below.... they are extremely INFERIOR in that respect. If you don't mind having crappy looking video effects get your 8800.... it'll suck.... you'll get a few more framerates but it'll look like SH*T compared to an ATI card setup. And what's with this multiple card problem arising with 8800's????? None whatsoever with ATI !!!! HAHAHAHA ---- :kaola:
November 28, 2007 4:24:49 PM

*** "I'm still waiting for his vunderlinks where the 3870 beats the gtx and ultra by leaps and bounds at 8x16x." ***

I found them through other review sites, forums, etc. It was very late and I should have remembered who and where they are. I did not, at the time. I have been spending way too much time trying to find them again. In all of my time-wasting... I have discovered that no one claims that nV. has better I.Q., yet many claim that ATi does have better I.Q.
YES, it IS personal opinion, in the end. If anyone cares to see benchmarks of the cards at their best, ask your trusted review sites why they don't post those results: 8xAA w/16xAF at 1600x(+) resolutions. The numbers are very different from what you've seen on the sites that cut it off at 4x, for some reason. AND if no one cares about anti-aliasing, then why are there even benchies w/AA enabled? If nobody wanted to run at 8xAA, then why do the cards do it? If you could, would you? I would. If anything, just to see what 8xAA looks like, and whether it is even worth using over, say, 2xAA at 1600x res. or higher.
November 28, 2007 4:57:28 PM

nevasumma said:
I found them through other review sites, forums, etc. It was very late and I should have remembered who and where they are. I did not, at the time. I have been spending way too much time trying to find them again.


Did you know that your browser has a history button!!!!
November 28, 2007 5:20:59 PM

The whole point is summed up in trooper's post. I hear it all the time; I have never heard it the other way around.
I have learned something here.
There are only opinions regarding these matters.
I hope someone somewhere will find relevance in all of this.

That is what this whole forum is about, isn't it? - Help me with my build... LOL.
November 28, 2007 5:28:27 PM

Did you know that I use Firefox and have it dump all of my history/cookies etc. EVERY TIME I close it??? On purpose. I know it would be helpful in this circumstance, but I don't keep my history and cookies, since they are the type of thing that gets found by Spybot and removed. I am internet-paranoid. I'm sure my friends would agree. ...wink
Skittle - it sounds like you want a photo, I will send you one. If you like.
November 28, 2007 5:54:47 PM

Yes, there is a difference in color from ATI to Nvidia. Duh!! The funny thing about this thread is that we are debating whether Nvidia is cheating. Personally, I don't care. The human eye/brain does not have the ability to register the cheats being spoken of. Unless it can be seen by the user, it does not matter.
November 28, 2007 6:15:51 PM

Imagine that. I know that. And I feel like I would be getting bamboozled out of $100.00 or more for a difference in frame rates that I could not even detect, coupled w/ supposed "inferior image quality".
November 28, 2007 6:30:48 PM

I used to be an nV. fan-boy. I felt that ATi taking technology nV. invented and just trying to make it happen faster was ridiculous. Until I read some ATi spec sheets from the original ATi site. I fell into an ATi card and was impressed. I actually had two for comparison. Now, especially since they merged w/ my fav. CPU manufacturer, it seems the architecture can speak for itself. I really wanted nV. to come out with some new kind of tech that ATi couldn't touch, to keep my faith in them exclusively. However, now nVidia is faster.
November 28, 2007 7:16:29 PM

nevasumma said:
I am sorry for starting this thread.
I only withdrew the "prize" because I felt it was never going to be proven, professionally, that either allegation is true OR NOT! I will pay for shipping; it would be wrong to make you pay for it. Shoot, what's a few cents? GrapeApe has given me the most of what I was asking for. I had already read those reviews, though. I still do not see actual answers to my questions, with proof supporting them. The fact is... there is no provable answer one way or the other. I was not looking to start a troll-baiting session. I am done. If anyone thinks they deserve a photo, they can have one. NOT Skittle. Seriously, man... ALSO, I was being sarcastic with my "findings". Don't take "my personal opinion" as technical proof of anything. The findings are my own, not based on what anyone has provided. A joke, really.

I did my own "Googling" and found nothing. That's why I asked for people to prove their "allegations". I thought it would be fun to offer an incentive. I was wrong. I did not mean to "go against the spirit of the forum". In my opinion, GrapeApe and mactronix are entitled to claim the reward. No one has addressed the queries w/ any seriousness. So I feel stupid, AND very hollow at the moment. It truly does not matter who is better. GrapeApe... I'm sorry. Everyone who is reading this right now... I'M SORRY. I'm done. Don't bother trying to find anything for me.


Sorry to burst your bubble... again, but Nvidia has been caught several times and it's been PROVEN. The problem is that most of those links are so damn old, it's not easy to find them anymore.
http://www.geek.com/is-nvidia-cheating-on-benchmarks/
http://www.dailytech.com/AMD+Alleges+NVIDIA+Cheats+in+H...
http://www.extremetech.com/article2/0,3973,1086025,00.a...
http://www.extremetech.com/article2/0,3973,1201076,00.a...

I still cannot find some of the originals (not online anymore, or really hard to find). It wasn't skipping frames; it was skipping details, scans, and lighting (like calculating only one light source out of multiple sources) and other "shortcuts" to get the video card to run faster. The links just show the tip of the iceberg. Those of us who have been here for 10-20 years know about the Nvidia cheats.
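For what it's worth, here's a toy sketch (plain Python; this is my own illustration, nothing to do with any actual driver code) of what a "calculate only one light source out of multiple sources" shortcut would mean for a simple Lambertian diffuse term - summing over every light versus evaluating only the first one:

```python
# Hypothetical illustration of the kind of lighting shortcut described
# above. Correct shading sums the diffuse term over all lights; the
# alleged "cheat" evaluates only the first light and skips the rest.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def diffuse(normal, lights):
    """Full Lambertian diffuse: sum of max(0, N.L) * intensity over all lights."""
    return sum(max(0.0, dot(normal, direction)) * intensity
               for direction, intensity in lights)

def diffuse_shortcut(normal, lights):
    """The shortcut: shade with only the first light source."""
    direction, intensity = lights[0]
    return max(0.0, dot(normal, direction)) * intensity

# Two light sources: one overhead, one from the side, unit directions.
lights = [((0.0, 0.0, 1.0), 0.8), ((0.0, 1.0, 0.0), 0.5)]
normal = (0.0, 0.6, 0.8)  # unit surface normal

full = diffuse(normal, lights)           # 0.8*0.8 + 0.6*0.5 = 0.94
fast = diffuse_shortcut(normal, lights)  # 0.8*0.8 = 0.64
```

The shortcut does strictly less work per pixel, so it renders faster, and the image is visibly darker wherever the dropped lights would have contributed - which matches the "faster frame rates, worse image quality" trade-off being alleged.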
November 28, 2007 7:52:35 PM

nevasumma said:
Skittle - it sounds like you want a photo, I will send you one. If you like.


Not really, I just enjoy pointless threads like this every now and then.