GeForce 8800: Here Comes the DX10 Boom

November 8, 2006 9:03:30 PM

Nvidia has been working with DX10 for as long as Microsoft has been developing the standard. Today, what we get is G80, otherwise known as GeForce 8800GTX. Unified DX10 shaders never looked better!


November 8, 2006 9:56:04 PM

This thing really looks amazing. As soon as the price drops a bit (I'll be ready for a new build then) I'm going to pick one up! (Unless ATI's offering turns out to be better)
November 8, 2006 9:59:16 PM

I wonder if CUDA spells death to physics processing units such as AGEIA PhysX
November 8, 2006 10:20:03 PM

Quote:
I wonder if CUDA spells death to physics processing units such as AGEIA PhysX

Very possible, seeing as how the gaming community's buy-in on PhysX seems to be pretty low anyway. From what I've seen, it looks as if the physics in DX10 games (such as Crysis) is more advanced than what the PhysX card has to offer anyway.
November 8, 2006 10:28:29 PM

One thing I don't agree with in that article. It's a personal opinion, but I honestly can't say the 8800GTX has better image quality than the X1950XTX based on the supplied Oblivion screenshots. The author writes:

Quote:
This image is far superior in quality compared to that of the ATI. It looks like the tables have turned in that department.


If anything, ATi still looks better to me, mostly because I like the contrast better with the ATi screen.
November 8, 2006 10:43:27 PM

Quote:
GeForce 8800GTX runs at 1,350 GHz and the 8800GTS' clock speed is 1,200 GHz.


If only I could believe that...
November 8, 2006 10:59:32 PM

Nice card, but really, how many people play at resolutions above 1280x1024? Anything above that is gravy. But it is a nice card; maybe I will buy one in about a year. :wink:

BTW I can't tell the difference between the Oblivion screenshots of the ATI and Nvidia card. Maybe I need to get my eyes checked?

They sure do like to tease us with nice screen shots but when are all of these games going to come out? 2008?
Anonymous
November 8, 2006 11:02:06 PM

Euh, anyone with an LCD screen bigger than 19", and a lot of people with a 19"+ CRT.
November 8, 2006 11:26:47 PM

Yes true but I would say a larger screen/higher resolution is in the minority while 1280X1024 and 1024X768 are the common resolutions that people use. That will probably change in a year or two when larger LCD monitors drop in price.
November 8, 2006 11:55:22 PM

Check out the DirectX 10 demo on Nvidia's site.
It's so realistic!

demo
November 9, 2006 1:13:39 AM

Nice, great article.
Does anyone know when the 8600GT will come out? I paid $165 for my Opteron 165; there is no way I'm going to pay more than $200 for a graphics card, no matter how well it plays games.
I think there are a lot of people in the same position as me. Let's wait and see how the 8600GT turns out.
November 9, 2006 1:41:06 AM

If you want to wait until March or later.
November 9, 2006 1:53:54 AM

Quote:
Nice, great article.
Does anyone know when the 8600GT will come out? I paid $165 for my Opteron 165; there is no way I'm going to pay more than $200 for a graphics card, no matter how well it plays games.
I think there are a lot of people in the same position as me. Let's wait and see how the 8600GT turns out.


Sounds good to me. I think the low end of the DX10 cards will be fine for a while, until the software takes advantage of them in something like a year, hmm, maybe a little more if it's harder to program for, or less if it's easier? (I know, I know, but don't post those Master Of The..... pictures lol.) Believe it or not, I'm still happy with my X1300 *shrugs*. These newer games are sure to make me want more eventually, though. I usually don't upgrade my graphics card until I run into something I can't run right lol
November 9, 2006 2:06:59 AM

"Were Nvidia leads, Ati will follow" 8)
November 9, 2006 2:10:35 AM

I wrote this to an individual who sent me an email earlier today... perhaps this clears up anything I said about the image quality.

"It is tough to get the impression from a screenshot. The true quality is in the interaction in the game. I love that scene in Oblivion because of all that is going on. There are trees, flowing grasses and flowers, mountains in the background, and a strong mixture of town to the left and open air to the left. Where the quality can be seen the most is when you leave the scene alone for 2 minutes when the camera starts to circle the character. When this happens you can see the shadows of the character and horse on the pixel shader generated grasses below. The shadows on Nvidia 7000 series hardware would dance all over the place and actually could make you motion sick on a large resolution like 2560x1600. ATI has always looked better in real-time in this game (Quad, traditional SLI, or single card). When I plugged in the reference 8800GTX for my first view of 1024x768 (all the settings up) I thought to myself, "Wow, they really changed their image quality." I immediately ran the X1950XTX to see how it faired. I can say I am impressed with the quality improvements over the 7000 and 6000 series cards.

JPEG files are terrible for "proving" image quality. The place you can see it is in real time, with all of the objects in motion. That is where you see whether the frames move smoothly and whether they are aliased (jagged) or not.

If you have any suggestions or comments on this, please feel free to follow up."

I really like what Nvidia has done with the GeForce 8800, and in terms of image quality, Nvidia has cleaned up its act.
November 9, 2006 4:10:27 AM

Quote:

JPEG files are terrible for "proving" image quality. The place you can see it is in real time, with all of the objects in motion. That is where you see whether the frames move smoothly and whether they are aliased (jagged) or not.


You could always use PNG files, but it depends on what you're trying to show. For the quality of AF, a static shot should be good enough; showing the ability to smooth the transitions with AF enabled would require moving images, of course (same as showing shimmering, crawl, etc. on the GF7).
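For what it's worth, the lossless-vs-lossy point is easy to demonstrate; here's a rough Python sketch using Pillow (the screenshot file name is just a hypothetical placeholder):

```python
# A minimal check of the lossless-vs-lossy point using Pillow.
# "oblivion_screenshot.png" is a hypothetical source file name.
from PIL import Image

shot = Image.open("oblivion_screenshot.png").convert("RGB")

shot.save("compare.jpg", quality=90)  # JPEG: lossy, pixel values get altered
shot.save("compare.png")              # PNG: lossless, pixels preserved exactly

jpeg_back = Image.open("compare.jpg").convert("RGB")
png_back = Image.open("compare.png").convert("RGB")

print("PNG matches source: ", list(png_back.getdata()) == list(shot.getdata()))
print("JPEG matches source:", list(jpeg_back.getdata()) == list(shot.getdata()))
```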

And while I agree that the image quality of the GF8800 is much better than the GF7/6-series cards, I wouldn't say the difference between the GF8 and X19 is anywhere near as obvious from anyone's stills, and the benefit of the new AF should translate better. The only thing I noticed was that [H]'s review showed a slight chain-link improvement in HL2, but their image also had worse railroad ties IMO. No doubt the nV is superior, or equal at worst, but I wonder if we're getting closer to a situation where ATi is 99% there and nV is now 99.44% perfect, whereas the GF7 was in the 80s.

Like I mentioned before, the possibility of high IQ is one thing (as seen in the FX series), but the actual implementation is something else.

No doubt though, better or equal AF, better or equal AA, plus better performance at higher resolutions: all are pluses, and they really make it a nice change of pace, especially now that they also have the FP16 HDR + AA option in hardware.

Pretty much positive all around, the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.

Still wanna see more tests, but everything so far is pretty dang good!
November 9, 2006 4:30:50 AM

You're darn tootin' it's good.

King of IQ and King of Performance. All your base are belong to us.

I ordered an EVGA 8800GTX today with 2-day shipping. It might get here Friday, but I'm thinking probably Tuesday.

With the world's greatest card out, and another exceptional Nvidia hard launch... I had to throw down for this one.

First time I've spent over $450 for a GPU. First time it's ever really been worth it.
I didn't buy into the 7950GX2 craze, the QuadSLI um... 'craze', or even SLI anymore... but I can say I've owned SLI (and at SLI launch, to boot). No comment on Crossfire... I definitely wasn't dumb enough to buy into that knee-jerk multi-GPU implementation from ATI.
I really love how it still can't properly render 2560x1600 in Oblivion, even today. Either way, it's now slow and a thing of the past.
But my SLI days are probably over. One 8800GTX is enough for me till G81... then the 8900GTX will be enough till the G90s.

I salute Nvidia and salute Jen-Hsun Huang for an exceptional product execution. This is what I'm talking about when I talk about Nvidia. First class. In ya face like KaPow! :D 

In the words of Mr. Duke Nukem himself... Hail to the king baby :twisted:

I'd like to add comments from a poster at OCP-
Quote:
Way above expectations. Right up there with fond hopes, and a dash of spicy mustard on top! Not "just as good as ATi's HQ AF," but better! Not just HDR+AA, but new totally rockin' AA modes. A card with no asterisks, "Well it's good except for...."


And future potential yet unknown. No "better drivers may get it up to what we expected"--instead, "better drivers will make this sick pig even sicker!"


QFT!!!! :!:

Oh, and check this out... you'll laugh. :) 
Link
November 9, 2006 4:40:23 AM

Quote:
Pretty much positive all around, the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.


Oh, and this is because ATI wasn't ever the definitive IQ king... as G80 is today.
In the days of yore, NV typically had better AA and ATI had better AF.

Most people went out on a limb and said the visual difference between ATI AA and NV AA wasn't nearly as large as the difference between ATI AF and NV AF... thus, ATI was crowned "IQ king" (by some). But this was no clear-cut debate. Especially when many sites that dissected ATI and NV drivers and IQ recognized that Nvidia's drivers actually did more work on a scene than ATI's.

This is all ancient history now, so hardly worth drudging up. But that was what came to mind when I read about your reservation.

I will admit, at first I had the same "reservation" but the answer is quite clear to me- there never was a definitive all-out hands-down IQ King, before G80.
So yes, people didn't say "oh that's what we were missing!" before because everyone knew what NV was missing.. but it was hard to fault them to the point of not using Nvidia because they had their IQ advantages as well.
November 9, 2006 5:41:11 AM

Quote:
Pretty much positive all around, the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.


Oh, and this is because ATI wasn't ever the definitive IQ king... as G80 is today.
In the days of yore, NV typically had better AA and ATI had better AF.

Very selective memory you have there: when the GF7 came out the X850 still had better standard AA, and when the X1800 came out it continued the better AA with added levels. The only time nV even had an edge was with the adoption of Transparency AA, and then that got negated too. nV's SSAA was the only thing comparable, but at a heavy performance cost, so it was never worth enabling versus upping the resolution.

Quote:
Most people went out on a limb and said the visual difference between ATI AA and NV AA wasn't nearly as large as the difference between ATI AF and NV AF... thus, ATI was crowned "IQ king" (by some). But this was no clear-cut debate.


Wasn't even a question of going out on a limb, since the difference was between a usable feature (HQ AF) and an unusable feature (SSAA).

Quote:
Especially when many sites that dissected ATI and NV drivers and IQ recognized that Nvidia's drivers actually did more work on a scene than ATI's.


Many sites? Like who, Anand? Doesn't matter how much 'more work' it does if the results don't match that work.

Quote:
This is all ancient history now, so hardly worth drudging up. But that was what came to mind when I read about your reservation.


If that's what comes to mind then you really need to reconsider what you remember from the era. The other thing to remember about GF7 AA was that there was no FP16 HDR + AA in hardware, so really, what is it I'm missing about there being deserved reservation about what we don't know yet (like DX10 features, support, etc.)?

Quote:
I will admit, at first I had the same "reservation" but the answer is quite clear to me- there never was a definitive all-out hands-down IQ King, before G80.


Yeah, but you're missing what I'm saying: prior to the release of the X1800, the GF7800 had the same angle-dependent AF as the X800, and similar AA with the option for extra. So at the time it too could be considered the hands-down king of IQ... until the X1800 showed that they still had work to do. The GF80 beats the X1900, no argument there, and definitely the GF7 series, but whether it will keep the title, or whether there is still more to be known about feature interaction, is far from determined at this point. Like I said, my reservation involves other possible issues that rarely get tested by reviewers and usually wind up in either nVnews' forums (for nV issues) or Rage3D's (for ATi's issues). I suspect the next real test is when DX10 software and other compliant hardware makes it to market to truly test the new architecture.

Quote:
So yes, people didn't say "oh that's what we were missing!" before because everyone knew what NV was missing..


No, they didn't. Like I said, nV AF was fine at the time because no one knew that the AF on the R520 would be angle-independent and that it would matter that much with the shimmering. Also, either no one knew or no one reported that there was the FP16 HDR + AA limitation.

Quote:
but it was hard to fault them to the point of not using Nvidia because they had their IQ advantages as well.


It wasn't about not using nV, it's about the current hyperbole about how wicked everything is (including yours about perfect AF [BTW, you didn't notice that the AF still shows signs of feathering?]), and not the potential limiting factors.
This is not specific to any one card maker either, it's just about reviews in general: most are more extensions of ads for the IHVs than investigations into the actual hardware and its features, functions, and limitations. That's why I still prefer B3D reviews, because they're more about the features/functions than just FPS.

Anywho, the G80 is undeniably the current leader in IQ. I'd like to see more tests for certain features (especially 2D video playback), but the true test will come once the intended DX10 features and hardware start getting full once-overs to compare. That's why I'd hold off on calling anything perfect.
November 9, 2006 6:00:42 AM

Oh boy. The long drawn out battle posts with Ape. Sometimes I think you just use the tactic of wearing someone out! LOL



Quote:
Pretty much positive all around, the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.


Oh, and this is because ATI wasn't ever the definitive IQ king... as G80 is today.
In the days of yore, NV typically had better AA and ATI had better AF.

Very selective memory you have there: when the GF7 came out the X850 still had better standard AA, and when the X1800 came out it continued the better AA with added levels. The only time nV even had an edge was with the adoption of Transparency AA, and then that got negated too. nV's SSAA was the only thing comparable, but at a heavy performance cost, so it was never worth enabling versus upping the resolution.

I was referring to GF7 vs X1900. I'm not going to go over ancient history with you. Though you know I'm more than capable of doing so... what's the point?

All this conversation you posted above doesn't mean anything.
In the end, NV had better AA... ATI had better AF.

Quote:
Most people went out on a limb and said the visual difference between ATI AA and NV AA wasn't nearly as large as the difference between ATI AF and NV AF... thus, ATI was crowned "IQ king" (by some). But this was no clear-cut debate.


Wasn't even a question of going out on a limb, since the difference was between a usable feature (HQ AF) and an unusable feature (SSAA).

I used 8xS. It was very playable in certain games and resolutions.. also with SLI if one had it.
It's a limb. It broke and you fell (apparently on your head).

Quote:
Especially when many sites that dissected ATI and NV drivers and IQ recognized that Nvidia's drivers actually did more work on a scene than ATI's.


Many sites? Like who, Anand? Doesn't matter how much 'more work' it does if the results don't match that work.

The results did match.. there was never a clear cut winner in the GF7/X19 race for IQ. I'm really sorry. :roll:

Quote:
This is all ancient history now, so hardly worth drudging up. But that was what came to mind when I read about your reservation.


If that's what comes to mind then you really need to reconsider what you remember from the era. The other thing to remember about GF7 AA was that there was no FP16 HDR + AA in hardware, so really, what is it I'm missing about there being deserved reservation about what we don't know yet (like DX10 features, support, etc.)?

Quote:
I will admit, at first I had the same "reservation" but the answer is quite clear to me- there never was a definitive all-out hands-down IQ King, before G80.


Yeah, but you're missing what I'm saying: prior to the release of the X1800, the GF7800 had the same angle-dependent AF as the X800, and similar AA with the option for extra. So at the time it too could be considered the hands-down king of IQ... until the X1800 showed that they still had work to do. The GF80 beats the X1900, no argument there, and definitely the GF7 series, but whether it will keep the title, or whether there is still more to be known about feature interaction, is far from determined at this point. Like I said, my reservation involves other possible issues that rarely get tested by reviewers and usually wind up in either nVnews' forums (for nV issues) or Rage3D's (for ATi's issues). I suspect the next real test is when DX10 software and other compliant hardware makes it to market to truly test the new architecture.

To make a long story short: there really hasn't been a clear-cut IQ champ... at least not as clear-cut as this GeForce 8... in the history of GPUs.
Possibly 3dfx vs early ATI (which wasn't too hot). Or R300 vs FX. Even there, though, including the FX AF... there wasn't a head-and-shoulders winner.

Quote:
So yes, people didn't say "oh that's what we were missing!" before because everyone knew what NV was missing..


No, they didn't. Like I said, nV AF was fine at the time because no one knew that the AF on the R520 would be angle-independent and that it would matter that much with the shimmering. Also, either no one knew or no one reported that there was the FP16 HDR + AA limitation.

NV AF was inferior. That's true, but what's the point? NV AA was still superior.. and shimmering was -not- cured by either the X1900 or the GF7 series. If any of the X800/X1800/X1900 had accomplished that, I think people would've come to a consensus that ATI was the crowned champ of IQ.
But shimmering still existed on both to an extent. Yes, far less on ATI. But it still existed, making it less of a celebration of "victory" for ATI.

Quote:
but it was hard to fault them to the point of not using Nvidia because they had their IQ advantages as well.


It wasn't about not using nV, it's about the current hyperbole about how wicked everything is (including yours about perfect AF [BTW, you didn't notice that the AF still shows signs of feathering?]), and not the potential limiting factors.
This is not specific to any one card maker either, it's just about reviews in general: most are more extensions of ads for the IHVs than investigations into the actual hardware and its features, functions, and limitations. That's why I still prefer B3D reviews, because they're more about the features/functions than just FPS.

Perfect as in, as good as it probably is going to get. I'd be shocked to see a card with better AF and better performance... It will happen, but this is so close to being perfect that it's astounding.

Quote:
Anywho, the G80 is undeniably the current leader in IQ. I'd like to see more tests for certain features (especially 2D video playback), but the true test will come once the intended DX10 features and hardware start getting full once-overs to compare. That's why I'd hold off on calling anything perfect.

I think the D3D tester is proof enough of AF domination.
As far as testing everything out, such as video... I'll be able to do that myself soon enough.

You should get one ;)  Test yourself :D 

The card is obviously spectacular, but I am interested in seeing what changes/improves with newer driver revisions. As the driver is currently a separate download from the rest of the sets... it's clear it's still cutting-edge software development at Nvidia for this card. They will probably work on video later; right now they need to get the most games working well and improve performance as much as they can.
November 9, 2006 7:27:28 AM

Hey, I'm new here, where are the girls?
November 9, 2006 8:27:17 AM

I'm going to invest in one of these new cards with my new system, but it needs to be pretty much silent... can I get watercooling for the 8800GTS?
November 9, 2006 9:13:55 AM

What about power requirements? Shouldn't the companies involved do something about reducing power requirements?

Reducing transistor size will not only reduce power requirements but also give them better yields, which may either give them better profit or reduce prices for more sales (good for us). It's a win-win situation for all of us.

I am happy with the graphics solution I have; that way I'm not troubling the environment and can still play the oldies. They may not have realism, but what the heck, at least they have replay value (at least for me).
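As a rough illustration of the die-shrink point above, here's a hedged sketch using a simple Poisson yield model; every number in it is hypothetical:

```python
import math

# Illustrative only: a hypothetical 90nm -> 65nm optical shrink of the same design.
area_90nm = 484.0                       # hypothetical die area in mm^2
area_65nm = area_90nm * (65 / 90) ** 2  # linear shrink => area scales by ~0.52x

defect_density = 0.002                  # hypothetical defects per mm^2

# Simple Poisson yield model: Y = exp(-D * A)
yield_90nm = math.exp(-defect_density * area_90nm)
yield_65nm = math.exp(-defect_density * area_65nm)

print(f"die area: {area_90nm:.0f} mm^2 -> {area_65nm:.0f} mm^2")
print(f"yield:    {yield_90nm:.0%} -> {yield_65nm:.0%}")
```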
November 9, 2006 11:36:53 AM

Isn't Oblivion a DX9 game? If so then it won't fully utilize everything the G80 has to offer.
November 9, 2006 12:32:13 PM

Quote:
In the words of Mr. Duke Nukem himself... Hail to the king baby :twisted:


I believe, sir, that was Ash, AKA Bruce Campbell, specifically from Army of Darkness, part of the Evil Dead series.
November 9, 2006 1:28:35 PM

Quote:
In the words of Mr. Duke Nukem himself... Hail to the king baby :twisted:


I believe, sir, that was Ash, AKA Bruce Campbell, specifically from Army of Darkness, part of the Evil Dead series.

I love AOD. :D 
November 9, 2006 2:03:00 PM

I have the quote "Hail to the King baby!" tattooed down my arm lol. Most people don't know what it's from and it's an interesting conversation starter :) 
November 9, 2006 2:21:39 PM

Quote:
I'm going to invest in one of these new cards with my new system, but it needs to be pretty much silent... can I get watercooling for the 8800GTS?


I saw a watercooled card advertised yesterday, for a small pittance of $800.00. So yes, you can get one, and watch your credit card balance get larger and larger.
November 9, 2006 2:47:38 PM

Dang! For $650, I would have expected much higher frame rates. They were better, but marginally, and not in all cases; on some tests, older ATI cards still outdid the 8800GTX. Amazing! nVidia was spinning the "double the silicon" line so much you'd expect almost double the performance. At least they are now on the image quality bandwagon (finally!), so that's a good thing.

If this is all they can get out of all those pixel shaders, I can't wait to see what ATI has in store for us.
November 9, 2006 2:53:50 PM

Hmmm, the tests at the lower resolutions were CPU-limited... or did you fail to read that part? The real benefit lies in the ability to keep all the AF + AA cranked up at 16x12+. Seeing 30+ FPS outdoors in Oblivion is ridiculous, and very nice. If you look at the GPU-limited areas like 2048x1536, a lot of the benchies come close to doubling, and gain at least 50% in many cases (I am ballparking since I am between classes atm, only 10 min hehe).
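To make the CPU-limited point concrete, here's a toy model where the frame rate is capped by whichever of the CPU or GPU takes longer per frame; all the per-frame times are hypothetical:

```python
# Toy model: frame rate is limited by whichever side takes longer per frame.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0  # hypothetical per-frame CPU cost, roughly constant across resolutions

for res, old_gpu_ms, new_gpu_ms in [("1024x768", 8.0, 4.0), ("2048x1536", 30.0, 15.0)]:
    print(f"{res}: old GPU {fps(cpu_ms, old_gpu_ms):.0f} FPS, "
          f"GPU twice as fast {fps(cpu_ms, new_gpu_ms):.0f} FPS")

# At 1024x768 both cases hit the same ~83 FPS CPU ceiling; at 2048x1536 the
# twice-as-fast GPU roughly doubles the frame rate.
```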
November 9, 2006 3:01:06 PM

I'm stoked!

This is an awesome card but I always like to see all the cards before I choose to buy anything.
November 9, 2006 3:50:50 PM

The X1950XTX gets absolutely pwned in Oblivion by both the 8800GTX and GTS...

We want price cuts on these, ATI!!
November 9, 2006 4:17:22 PM

Superfly03, I looked at the WHOLE picture, not just one or even a few results. For $650 it should be better, much better. The vast majority of gamers play at 1280x1024, unlike the 1% of high-end players like you and me. I only consider a card good if it can play at a medium res with ALL the eye candy turned on. The 8800 finally gains ground in that area, unlike nVidia's past products, which is definitely worth a lot. But overall, it just didn't do that well.

Dr_Asik, have you not checked the prices of the X1950XTX lately? A card that started at $600 you can now get for $400. And I believe getting "pwned" requires beating all takers in every category. The 8800 didn't do that. It did well, but not everywhere in every way. Sad for nVidia.
November 9, 2006 5:27:17 PM

$400 is not bad for the X1950XTX, but with the 8800GTS aggressively starting at $440, the Radeon might go down even lower.

And I didn't say the G80 utterly pwns the X1950XTX; I said it pwns it in Oblivion, check the link I provided; even the GTS kills it. And that's pure pwnage, since the X19xx was until now the undisputed king of Oblivion.
November 9, 2006 6:06:37 PM

Well, I THOUGHT I'd get this 8800GTX, but it looks like I'm wrong... 41 FPS in Oblivion at 1024x768... that's exactly the same as the 7950 performed. 41 FPS at a lower resolution is not acceptable for a piece of hardware that costs almost 700 dollars. These companies will get it sooner or later, I'm sure: make the power worthy of its high price tag. The other benchmarks mean nothing considering they are LAST-generation games; I can run everything with my 7800 GTX at over 60 FPS at max/high settings. Hype for nothing.
November 9, 2006 6:10:40 PM

At $600 you are disappointed? Come on. If you calculate the $/FPS for the 1950XTX, 8800GTS and 8800GTX, the best values come in at the 1950XTX and 8800GTS, and they split it about 50/50. Then take the $/FPS you calculate for the 1950XTX, project the price based on the 8800GTX performance, and average those projected prices: it comes out to about ~$550. If you include the DX10 capability and all the other features that are included in the new 8 series, then you realize that $600 isn't that bad. You are right that the 1950XTX may be a better value, but not by much. The projected price if you are a low-res gamer is ~$465 (1024 and 1280), the high-res average price comes in at ~$590 (1600 and 2048), and extreme res comes in at ~$619 (2560 only). Also, these numbers will change based on the system configuration and the CPU, because if the CPU begins to limit frames, then the value of the 8800GTX will fall.

Also, as you stated, the 1950XTX started at $600 and has now dropped $200... expect the same to happen with the 8800GTX. I would expect to see it in the $5XX range within two to three months, depending on availability.
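For anyone who wants to redo that math with their own numbers, here's a minimal sketch of the $/FPS projection described above; the prices match the thread, but the frame rates are hypothetical placeholders, not benchmark results:

```python
# Hypothetical frame rates -- substitute real benchmark numbers to redo the math.
cards = {
    "X1950XTX": {"price": 400, "fps": {"1280x1024": 70.0, "1600x1200": 55.0, "2048x1536": 38.0}},
    "8800GTX":  {"price": 650, "fps": {"1280x1024": 85.0, "1600x1200": 78.0, "2048x1536": 62.0}},
}

def dollars_per_fps(card: str, res: str) -> float:
    return cards[card]["price"] / cards[card]["fps"][res]

# Price the 8800GTX at the X1950XTX's $/FPS rate at each resolution, then average.
projections = [
    dollars_per_fps("X1950XTX", res) * cards["8800GTX"]["fps"][res]
    for res in cards["8800GTX"]["fps"]
]
print(f"projected 'fair' 8800GTX price: ${sum(projections) / len(projections):.0f}")
```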
November 9, 2006 6:26:29 PM

I'm just worried that there won't be enough shipping and we will have to wait forever to get our hands on them if we aren't willing to buy right away.

I'm sure you remember that happening really badly with the X800s, and it took forever to recover. Prices didn't go down on cards for a long time because of that.
November 9, 2006 6:27:26 PM

Is it surprising to anyone else that these cards use GDDR3 instead of GDDR4 (which the X1950XTX uses)? Anyone have any guesses as to why this was done, or whether they would expect improvements using GDDR4 in the near future?
November 9, 2006 6:38:48 PM

Because it is more expensive and there is little if any discernible difference in actual performance.
November 9, 2006 6:46:56 PM

If the 8 series architecture holds true to how the 7 series was, then there is no need for GDDR4, because the architecture is core-bottlenecked. I realize that the 8 series is designed entirely differently from the 7 series, but this has held true over the 6 and 7 series. I tested the benefits of increasing the memory clock vs. increasing the core clock in 3DMark and a few games, and 1 MHz on the core is worth more than 1 MHz on the memory. In fact, overclocking only the RAM on a 7900GTX gets you very, very few FPS, maybe 1-2, and that's it, it doesn't scale; but if you OC the core you get 4-5, maybe more depending on the setup and context, and it scales.
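A sketch of that kind of one-clock-at-a-time comparison (the before/after numbers are hypothetical, not measurements):

```python
# Hypothetical numbers for a one-clock-at-a-time overclocking comparison.
baseline = {"core_mhz": 650, "mem_mhz": 800, "fps": 60.0}

runs = [
    {"label": "core +50 MHz",   "core_mhz": 700, "mem_mhz": 800, "fps": 64.5},
    {"label": "memory +50 MHz", "core_mhz": 650, "mem_mhz": 850, "fps": 61.0},
]

for run in runs:
    delta_mhz = (run["core_mhz"] - baseline["core_mhz"]) + (run["mem_mhz"] - baseline["mem_mhz"])
    delta_fps = run["fps"] - baseline["fps"]
    print(f'{run["label"]}: {delta_fps / delta_mhz:.3f} FPS per MHz')
```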
November 9, 2006 7:18:01 PM

Quote:
Oh boy. The long drawn out battle posts with Ape. Sometimes I think you just use the tactic of wearing someone out! LOL


But they're facts, and each one is in reply to your statements, which get the history wrong. You can go over it if you want, but the facts remain as stated by me. Since your reply was to my statement about the GF7 coming out and being on top of the IQ podium, then losing that spot, everything I said applies now too. Same issue, same concern.

You can make it personal if you want, but at no point did I comment on you bonking your head. Try to keep within the lines of the hardware, eh! :tongue:


Quote:
I was referring to GF7 vs X1900.


I wasn't and you replied to my statement. IQ went X800->GF7->X1K exactly like I said.

Quote:
All this conversation you posted above doesn't mean anything.
In the end, NV had better AA... ATI had better AF.


Had better AA until the X1800 came along.

Quote:
I used 8xS. It was very playable in certain games and resolutions..


And that was my point, if you look back: SSAA was nice, but the performance penalty was worse than going up another resolution notch with plain MSAA. So how is that a usable benefit? The penalty of HQ AF was noticeable, but nowhere near the same as SSAA, and the difference was more noticeable, especially in titles like Oblivion. And with SSAA available on the X1K series they became equal, so like I said, the GF7 took the crown, then ATi took it back. Now the GF8 has it; seems pretty straightforward.

Quote:
The results did match.. there was never a clear cut winner in the GF7/X19 race for IQ. I'm really sorry. :roll:


Only to someone who didn't care about IQ, like you've always stated. To everyone else it was clear: ATi had better AF, ATi had better AA for the same values, and it had additional options, like Temporal AA.

Quote:
To make a long story short: there really hasn't been a clear-cut IQ champ... at least not as clear-cut as this GeForce 8... in the history of GPUs. Possibly 3dfx vs early ATI (which wasn't too hot).


Nah, Matrox beat them both, even in gaming, for IQ; it was FPS they didn't have. Even when the Parhelia came out it had the IQ, but not the speed.

Quote:
Or R300 vs FX. Even there, though, including the FX AF... there wasn't a head-and-shoulders winner.


Well, there wasn't in theoretical benchmarks, but in actual gameplay the R3xx series led the whole way.

Quote:
NV AF was inferior. That's true, but what's the point? NV AA was still superior..


Until it was bested. You show me a single review that puts the GF7 head and shoulders above now. You now have the GF8 to soothe your pride, but you're defending the indefensible, just like the FX, which is pointless.

Quote:
and shimmering was -not- cured by either the X1900 or the GF7 series.


Shimmering was not an issue on the ATi cards when HQ AF was enabled; it even cleared up when going from default to quality. To make matters worse for the GF7, nV kept the driver floptimizations, which kept their 'quality' setting from being in line with ATi's similar 'quality' setting, so it was even worse; and then, using HQ, there wasn't shimmering.

Quote:
If any of the X800/X1800/X1900 had accomplished that, I think people would've come to a consensus that ATI was the crowned champ of IQ.


You show me the nV shimmering issue on ATi cards with HQAF enabled. Even Anand won't try to say that.

Quote:
But shimmering still existed on both to an extent. Yes, far less on ATI. But it still existed, making it less of a celebration of "victory" for ATI.


Seriously, show me a link that supports that.

Quote:
but it was hard to fault them to the point of not using Nvidia because they had their IQ advantages as well.


Quote:
Perfect as in, as good as it probably is going to get. I'd be shocked to see a card with better AF and better performance... It will happen, but this is so close to being perfect that it's astounding.


It is impressive, but it's far from perfect. It's the best we have so far, but like I said, the difference between G80 AF and X1K AF is less than the difference between X1K AF and GF7 AF; and anything that beats the G80 will be even less of a difference, because there's not much room left to improve. But it's obvious that it's not perfect.

Quote:
I think the D3D tester is proof enough of AF domination.


Except in a game it's an extremely minor difference; look at [H]'s review to see how little the difference is.

Quote:
As far as testing everything out, such as video... I'll be able to do that myself soon enough.


You'd need something to test against head to head, and you don't even have the GF7 anymore for that. BTW, I still wouldn't consider you an unbiased observer.

Quote:
You should get one ;)  Test yourself :D 


Nah, unless they make a PCMCIA version it ain't gonna happen; if GW and everyone else can't get me to go back to desktop, the G80 isn't about to do it.

Quote:
The card is obviously spectacular, but I am interested in seeing what changes/improves with newer driver revisions. As the driver is currently a separate download from the rest of the sets...


Yeah, that's what I was saying; I have a feeling that it's still early for the performance numbers. And while you promoted the unified drivers supporting legacy junk, I don't think it's a wise choice. They already had issues with the 9x.xx series drivers where people had to revert back to 8x.xx; both nV and ATi should simply move to primary support of the current generation and bug fixes for the previous one. There's also no point in having a 100+ MB driver for a GF4 when the only benefit over the much smaller previous drivers is all G80.

Quote:
They will probably work on video later; right now they need to get the most games working well and improve performance as much as they can.


Actually, they don't need to improve performance on the GTX; maybe a little bit better AA speed on the GTS, but performance is already tops, so why bother other than for e-peni$ concerns? Video is where they need to focus next, and of course always focus on bugs and stability issues if they come up.
November 9, 2006 8:22:16 PM

And that's the thing: the way the GTS does in Oblivion shows that the traditional old titles, like HL2, D3/Q4, etc., really aren't going to expose the strengths of the new architecture.

I think Oblivion is a perfect example of the benefits. F.E.A.R. may also expose this somewhat, but Oblivion makes such heavy use of all of the pixel, vertex, and texture hardware that it's a great title for exposing the difference between old and new.
November 9, 2006 8:50:56 PM

Thanks - that seems to make sense.
November 9, 2006 10:06:08 PM

You're welcome. I try and explain what I can... but I don't try and step on Grape's toes; tried that once, bad idea lol :wink:
November 9, 2006 10:21:47 PM

Sounds like a bad case of OCD to me. :wink:
November 10, 2006 7:18:11 PM

I have a couple issues regarding your GeForce 8800 review.

The first is the rating of the video cards when scoring is based on the red bar (i.e., the 1024x768 score), which is the most CPU-limited. Take the 3DMark 2005 score with 4x AA and 8x AF, for example: the ATI Radeon loses at 1024x768, which could be related to the chipset, while it beats the 8800GTS at 2560x1600.

The second issue is with the quality comparisons for Oblivion. I have looked closely at both images, flipped them back and forth, and there is no significant difference other than the swaying grass. This is far from "far superior," as you commented. You have stated no objective facts in this comparison, and the subjective comparison that you did provide could be attributed to favoritism toward NVidia.

The third issue, and arguably the worst of them all, is in Beyond Graphics, where you showed the HQV Video Benchmark results. There are two problems with what you presented, the first being the graph you provided. From the looks of it, it would appear at first glance that the NVidia solution is 3x better than the ATI solution. This is not the case, however, since the graph starts at 105 and thus provides a very biased view in favor of NVidia. The image gives the impression that NVidia is 300% better at video quality than ATI, as opposed to the truth of about 8%. The second issue with these results is that the HQV tests are subjective tests; the results are so close that NVidia could be higher simply due to bias.
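The axis-truncation effect is easy to quantify. With hypothetical scores (the article's actual numbers aren't reproduced here), a difference of roughly 8% looks close to 3x once the axis starts at 105:

```python
# Hypothetical HQV-style scores; the point is the baseline, not the exact values.
ati_score, nv_score, axis_start = 110, 119, 105

true_advantage = (nv_score - ati_score) / ati_score             # real difference: ~8%
bar_ratio = (nv_score - axis_start) / (ati_score - axis_start)  # what the truncated bars show

print(f"true advantage: {true_advantage:.0%}")
print(f"apparent bar ratio with the axis starting at {axis_start}: {bar_ratio:.1f}x")
```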

The final issue is that the comparisons made no note of the fact that this is not an apples-to-apples comparison. The G80 is next-gen and the ATI solution is not. I agree that you had to compare it to something, but you should perhaps have included a couple of sentences in the conclusion that acknowledged that fact and indicated that the true story will only be told when ATI releases the R600.
November 10, 2006 7:46:11 PM

You know, I didn't even take note of the HQV numbers because... IT'S AN NV SLIDE! It's not a test that THG did, like Crashman did for his review, so I skipped over it like I would an ATi-provided slide (one that has nothing to do with architecture).

Personally I would've expected that some testing be done like Crash did, and like Firingsquad and others do. Crash even compared the results with optimizations on and off. Did nV combine the optimized and un-optimized scores to get their final number?

I have to agree with IncinX and ask why that was included? It would be the equivalent of nV providing the gaming benchmarks on their test system for THG to republish instead of doing the actual tests.

I don't have a problem with the graph, it conveys what it does, but the fact that it was simply supplied and not derived by THG's tests is pretty surprising.
November 11, 2006 12:16:22 AM

Quote:
Oh boy. The long drawn out battle posts with Ape. Sometimes I think you just use the tactic of wearing someone out! LOL


But they're facts, and each one is in reply to your statements, which get the history wrong. You can go over it if you want, but the facts remain as stated by me. Since your reply was to my statement about the GF7 coming out and being on top of the IQ podium, then losing that spot, everything I said applies now too. Same issue, same concern.

Your facts aren't as correct as you wish them to be... :roll:
The way you speak, it's as if you are the ultimate authority; you are partially right, but you aren't speaking some great truth here.

There is no definitive answer on whether X1900 IQ > GeForce 7 IQ,
like there is for X1900 IQ < GeForce 8 IQ.

THAT'S the fact of the matter. Not all this nitpicking you present.


Quote:
You can make it personal if you want, but at no point did I comment on you bonking your head. Try to keep within the lines of the hardware, eh! :tongue:


I was referring to GF7 vs X1900.


I wasn't and you replied to my statement. IQ went X800->GF7->X1K exactly like I said.

You weren't? Well, you need to be more SPECIFIC then,
instead of leaving your points wide open to interpretation.

It makes total sense to me that you would've been referring to the last gen of cards (X1900 vs GF).

Quote:
All this conversation you posted above doesn't mean anything.
In the end, NV had better AA... ATI had better AF.


Had better AA until the X1800 came along.

X1800s have better AA than 8xS? Really? Please link me.
NONE of ATI's AA modes are superior to 8xS.
8xS is the only way to get a substantial visual improvement over 4x.

Don't take my word for it... not that you ever do.
I've just owned nearly every NV product ever produced, and many ATI cards. So here's a review, since you are so demanding about links.

"But I'd like to note that the quality difference in modes higher than MSAA 4x goes down to zero, especially in dynamic games. While 8xS mode in GeForce cards still makes some sense (texture antialiasing and "side-effect" anisotropic filtering), you will have to use a magnifying glass to find differences between 6x and 4x modes in ATI cards. That's why it's hard to say whether all these crazy modes (like ATI CF 14x or NV SLI 16x) make any sense..."
Source

4x vertex AA and 2x texture AA FTW.

Quote:
I used 8xS. It was very playable in certain games and resolutions..


And that was my point, if you look back: SSAA was nice, but the performance penalty was worse than going up another resolution notch with plain MSAA. So how is that a usable benefit? The penalty of HQ AF was noticeable, but nowhere near the same as SSAA, and the difference was more noticeable, especially in titles like Oblivion. And with SSAA available on the X1K series they became equal, so like I said, the GF7 took the crown, then ATi took it back. Now the GF8 has it; seems pretty straightforward.

The performance hit was worse on 8xS than on 6x because 6x isn't nearly as good as 8xS, Ape.

8xS > 6XAA. In case you missed my response above, which also pertains to this... they say you need a magnifying glass between 4x and 6x??? I think that is certainly dimishing returns...

Quote:
The results did match.. there was never a clear cut winner in the GF7/X19 race for IQ. I'm really sorry. :roll:


Only to someone who didn't care about IQ, like you've always stated. To everyone else it was clear: ATi had better AF, ATi had better AA for the same values, and it had additional options, like Temporal AA.

Quote:
To make a long story short: there really hasn't been a clear-cut IQ champ... at least not as clear-cut as this GeForce 8... in the history of GPUs. Possibly 3dfx vs early ATI (which wasn't too hot).


Nah, Matrox beat them both, even in gaming, for IQ; it was FPS they didn't have. Even when the Parhelia came out it had the IQ, but not the speed.

Matrox is far too slow. I can forgive performance detriments if there's an IQ bonus... but not if the performance is as far off as Matrox's historically has been.
Speed + IQ = G80 = teh win.

Quote:
Or R300 vs FX. Even there, though, including the FX AF... there wasn't a head-and-shoulders winner.


Well, there wasn't in theoretical benchmarks, but in actual gameplay the R3xx series led the whole way.

Hey, you are the one bragging up MATROX for IQ right?

The FX series had better AF IQ than ATI has ever had. Now the G80 has FX AF.
You're all about slow Matrox with great IQ... but won't give the FX AF the bone it deserves? Hmmmmmm...

Quote:
NV AF was inferior. That's true, but what's the point? NV AA was still superior..


Until it was bested. You show me a single review that puts the GF7 head and shoulders above now. You now have the GF8 to soothe your pride, but you're defending the indefensible, just like the FX, which is pointless.

Geforce 7 being head and shoulders above the X1900 in AA?
Check the link provided earlier Ape.

Here's your one single review.

Quote:
and shimmering was -not- cured by either the X1900 or the GF7 series.


Shimmering was not an issue on the ATi cards when HQ AF was enabled; it even cleared up when going from default to quality. To make matters worse for the GF7, nV kept the driver floptimizations, which kept their 'quality' setting from being in line with ATi's similar 'quality' setting, so it was even worse; and then, using HQ, there wasn't shimmering.

Ah, watch the verbiage... shimmering was "not an issue"??
Hmmmm, OK.

Yet, shimmering still exists on ATI, no matter what you do. Yes, it can be largely reduced to a very significant degree (on both).

Quote:
If any of the X800/X1800/X1900 had accomplished that, I think people would've come to a consensus that ATI was the crowned champ of IQ.


You show me the nV shimmering issue on ATi cards with HQAF enabled. Even Anand won't try to say that.

You don't get it, do you: there is no "NV" shimmering issue.
They BOTH (X1900/GF7) have it; NV is just worse than ATI.

If you seriously demand I go find a damn link to prove it to you, I will... but you're wrong. It exists on both... sorry, d00d.

Quote:
But shimmering still existed on both to an extent. Yes, far less on ATI. But it still existed, making it less of a celebration of "victory" for ATI.


Seriously, show me a link that supports that.

Link
I dug this up really quickly, because it is silly to deny that ATI shimmering remains. I read forums every day, and many of them... and I've never seen anyone attempt to spin ATI as 'not having shimmering effects'...
before you, that is.

Quote:
but it was hard to fault them to the point of not using Nvidia because they had their IQ advantages as well.


Quote:
Perfect as in, as good as it probably is going to get. I'd be shocked to see a card with better AF and better performance... It will happen, but this is so close to being perfect that it's astounding.


It is impressive, but it's far from perfect. It's the best we have so far, but like I said, the difference between G80 AF and X1K AF is less than the difference between X1K AF and GF7 AF; and anything that beats the G80 will be even less of a difference, because there's not much room left to improve. But it's obvious that it's not perfect.

Far from perfect? LOL, regardless of your semantics... this is probably as close to "perfect" as we will see. My crystal ball is broken, but I guess we'll see if your messiah ATI can do better. :lol: 
I'm waiting for their vaporware with anticipation. Let's rock.

Quote:
I think the D3D tester is proof enough of AF domination.


Except in a game it's an extremely minor difference; look at [H]'s review to see how little the difference is.

Quote:
As far as testing everything out, such as video... I'll be able to do that myself soon enough.


You'd need something to test against head to head, and you don't even have the GF7 anymore for that. BTW, I still wouldn't consider you an unbiased observer.

No one is buying these $650 cards for video. They are buying them for the world's best IQ + performance + feature set.

As far as video goes, I'm not sure on that... but I can guarantee you that when they get all the drivers sorted out, it will be as good as anything in the past... and better.

The Lumenex Engine has a full 10-bit display pipeline, including VC-1 and H.264 HD spatial-temporal deinterlacing / inverse telecine, HD noise reduction, and HD edge enhancement.
1 billion colors vs. 16.7 million on X1900s and GF7s. Not to mention PureVideo HD, and it's HDCP capable... w00t!
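Those color counts follow directly from the bits per channel; a quick check:

```python
# Colors available with 8 vs 10 bits per RGB channel.
for bits_per_channel in (8, 10):
    colors = (2 ** bits_per_channel) ** 3
    print(f"{bits_per_channel}-bit per channel: {colors:,} colors")
# 8-bit:  16,777,216     (~16.7 million)
# 10-bit: 1,073,741,824  (~1.07 billion)
```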

Quote:
You should get one ;)  Test yourself :D 


Nah, unless they make a PCMCIA version it ain't gonna happen; if GW and everyone else can't get me to go back to desktop, the G80 isn't about to do it.

Well, it sucks to be into desktops if you run ATI. 8)
I'd do the same thing; the R300 glory days are long, long over...

Quote:
The card is obviously spectacular, but I am interested in seeing what changes/improves with newer driver revisions. As the driver is currently a separate download from the rest of the sets...


Yeah, that's what I was saying; I have a feeling that it's still early for the performance numbers. And while you promoted the unified drivers supporting legacy junk, I don't think it's a wise choice. They already had issues with the 9x.xx series drivers where people had to revert back to 8x.xx; both nV and ATi should simply move to primary support of the current generation and bug fixes for the previous one. There's also no point in having a 100+ MB driver for a GF4 when the only benefit over the much smaller previous drivers is all G80.

I'm not going to argue here. I'm starting to see things your way on this one.
But I don't like the idea of dropping support.
Maybe one legacy driver for the GF7 and below, and another for the current product line.

But I still don't agree: NV has traditionally had unified drivers and kept the file size down to 40 MB or so, lower than ATI has been in the past.

ATI supports back to the 9500 with the Cat 6.10s in a 34 MB total driver package download, plus another 10 MB Hydravision download.
Nvidia fits their superior multi-monitor support and driver package into a single, simple 40.5 MB file.
NV (before the G80) supported back to the GF2 MX and all the Quadros.
ATI supports back to the 9500s.

I'm guessing eventually G80 support will be integrated into their main driver package on the NV side.

But I'm starting to get like you, though, and not care as much.
I think if there's a need to make a break... G80 vs. G70 and below is probably the spot, due to architectural differences.

Quote:
They will probably work on video later; right now they need to get the most games working well and improve performance as much as they can.


Actually, they don't need to improve performance on the GTX; maybe a little bit better AA speed on the GTS, but performance is already tops, so why bother other than for e-peni$ concerns? Video is where they need to focus next, and of course always focus on bugs and stability issues if they come up.

It's not e-penis... you pay $650 for a video card, and you have the right to get everything out of the card that you can get.
The faster it is, the more it drags ATI and its supporters' faces in the mud. Not to mention, it needs to be tweaked out for people with big-money, high-res screens.
November 11, 2006 1:07:18 AM

I have mine installed after relocating one of my HDs, and I have one thing to say: this thing heats up my computer room by 2 degrees. It's quiet, but hot as hell.