GeForce 8800: Here Comes the DX10 Boom

pschmid

Dec 7, 2005
Nvidia has been working with DX10 for as long as Microsoft has been developing the standard. Today, what we get is G80, otherwise known as GeForce 8800GTX. Unified DX10 shaders never looked better!
 

dean7

Aug 15, 2006
This thing really looks amazing. As soon as the price drops a bit (I'll be ready for a new build then) I'm going to pick one up! (Unless ATI's offering turns out to be better)
 

dean7

Aug 15, 2006
I wonder if CUDA spells death to physics processing units such as AGEIA PhysX
Very possible, seeing as how the gaming community's "buy-in" on PhysX seems to be pretty low anyway. From what I've seen, it looks as if the physics in DX10 games (such as Crysis) is more advanced than what the PhysX card has to offer.
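Roughly, the appeal is that per-particle physics updates are embarrassingly parallel, which is exactly the kind of work a GPU is built for. A toy sketch (plain NumPy on the CPU, with made-up numbers, just to illustrate the sort of per-frame update CUDA would offload, not anything from Nvidia's or AGEIA's actual code):

import numpy as np

n = 100_000                           # number of debris particles
pos = np.random.rand(n, 3)            # positions
vel = np.zeros((n, 3))                # velocities
gravity = np.array([0.0, -9.81, 0.0])
dt = 1.0 / 60.0                       # one 60 fps frame

# One integration step: every particle is updated independently,
# so the whole loop maps naturally onto thousands of GPU threads.
vel += gravity * dt
pos += vel * dt
pos[:, 1] = np.maximum(pos[:, 1], 0.0)   # crude ground-plane collision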
 

Chil

Feb 20, 2006
There's one thing I don't agree with in that article. It's a personal opinion, but I honestly can't say the 8800GTX has better image quality than the X1950XTX based on the supplied Oblivion screenshots. The author writes:

This image is far superior in quality compared to that of the ATI. It looks like the tables have turned in that department.

If anything, ATi still looks better to me, mostly because I like the contrast better in the ATi screenshot.
 

dean7

Aug 15, 2006
Well, I think this article does a better job of showing the G80 performance:

http://enthusiast.hardocp.com/article.html?art=MTIxOCwxLCxoZW50aHVzaWFzdA==

(BTW, I can't wait to see Oblivion running in high res with everything maxed out!)
 

caamsa

Apr 25, 2006
Nice card, but really, how many people play at resolutions above 1280x1024? Anything above that is gravy. But it is a nice card; maybe I will buy one in about a year. :wink:

BTW, I can't tell the difference between the Oblivion screenshots from the ATI and Nvidia cards. Maybe I need to get my eyes checked?

They sure do like to tease us with nice screenshots, but when are all of these games going to come out? 2008?
 
Guest
Euh, anyone with an LCD screen bigger than 19", and a lot of people with a 19"+ CRT.
 

caamsa

Apr 25, 2006
Yes, true, but I would say larger screens and higher resolutions are in the minority, while 1280x1024 and 1024x768 are the common resolutions that people use. That will probably change in a year or two when larger LCD monitors drop in price.
 

amdwilliam1985

Mar 30, 2006
Nice, great article.
Does anyone know when the 8600 GT will come out? I paid $165 for my Opteron 165; there is no way I'm going to pay more than $200 for a graphics card, no matter how well it plays games.
I think there are a lot of people in the same position as me. Let's wait and see how the 8600 GT turns out.
 

JonathanDeane

Mar 28, 2006
Nice, great article.
Does anyone know when the 8600 GT will come out? I paid $165 for my Opteron 165; there is no way I'm going to pay more than $200 for a graphics card, no matter how well it plays games.
I think there are a lot of people in the same position as me. Let's wait and see how the 8600 GT turns out.

Sounds good to me. I think the low end of the DX10 cards will be fine for a while, until the software takes advantage of them in something like a year... hmm, maybe a little more if it's harder to program for, or less if it's easier? (I know, I know, but don't post those Master Of The..... pictures, lol.) Believe it or not, I'm still happy with my X1300. *shrugs* These newer games are sure to make me want more eventually, though. I usually don't upgrade my graphics card until I run into something I can't run right, lol.
 

DPolkowski

Mar 9, 2006
I wrote this to an individual who sent me an email earlier today... perhaps it clears up what I said about the image quality.

"It is tough to get the impression from a screenshot. The true quality is in the interaction in the game. I love that scene in Oblivion because of all that is going on. There are trees, flowing grasses and flowers, mountains in the background, and a strong mixture of town on one side and open air on the other. Where the quality can be seen most is when you leave the scene alone for two minutes and the camera starts to circle the character. When this happens you can see the shadows of the character and horse on the pixel-shader-generated grasses below. The shadows on Nvidia 7000-series hardware would dance all over the place and could actually make you motion sick at a large resolution like 2560x1600. ATI has always looked better in real time in this game (Quad, traditional SLI, or single card). When I plugged in the reference 8800GTX for my first view at 1024x768 (all the settings up), I thought to myself, 'Wow, they really changed their image quality.' I immediately ran the X1950XTX to see how it fared. I can say I am impressed with the quality improvements over the 7000- and 6000-series cards.

JPEG files are terrible for "proving" image quality. The place you can see it is in real time with all of the objects in motion. That is where you see whether the frames move smoothly and whether they are aliased (jagged) or not.

If you have any suggestions or comments on this, please feel free to follow up."

I really like what Nvidia has done with the GeForce 8800, and in terms of image quality, Nvidia has cleaned up its act.
 
JPEG files are terrible for "proving" image quality. The place you can see it is in real time with all of the objects in motion. That is where you see whether the frames move smoothly and whether they are aliased (jagged) or not.

You could always use PNG files, but it depends on what you're trying to show. For the quality of AF, a static shot should be good enough; showing the ability to smooth transitions with AF enabled would require moving images, of course (same as showing shimmering, crawl, etc. on the GF7).
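For what it's worth, it's easy to show just how lossy a JPEG screenshot is compared to a PNG. A quick sketch, assuming Pillow and NumPy are installed and using a hypothetical lossless screenshot.bmp capture (the file name is only an example):

import numpy as np
from PIL import Image

original = Image.open("screenshot.bmp").convert("RGB")
original.save("shot.png")               # PNG round-trips losslessly
original.save("shot.jpg", quality=90)   # JPEG discards detail even at high quality

src = np.asarray(original, dtype=np.int16)
png = np.asarray(Image.open("shot.png").convert("RGB"), dtype=np.int16)
jpg = np.asarray(Image.open("shot.jpg").convert("RGB"), dtype=np.int16)

print("max per-channel error vs. original:")
print("  PNG :", np.abs(src - png).max())   # expected: 0
print("  JPEG:", np.abs(src - jpg).max())   # expected: > 0

Static stills can at least be stored losslessly that way; shimmering and crawl, as noted above, still need moving images.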

And while I agree that the image quality of the GF8800 is much better than that of the GF7/6-series cards, I wouldn't say the difference between the GF8 and the X19 is anywhere near as obvious from anyone's stills, and the benefit of the new AF should translate better. The only thing I noticed was that [H]'s review showed a slight chain-link improvement in HL2, but their image also had worse railroad ties, IMO. No doubt the nV is superior, or equal at worst, but I wonder if we're getting closer to the point where ATi is 99% there and nV is now 99.44% perfect, whereas the GF7 was in the 80s.

Like I mentioned before, the possibility of high IQ is one thing (as seen in the FX series), but the actual implementation is something else.

No doubt though: better or equal AF, better or equal AA, plus better performance at higher resolutions are all pluses, and it really makes for a nice change of pace, especially now that they also have the FP16 HDR + AA option in hardware.

Pretty much positive all around; the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.

Still wanna see more tests, but everything so far is pretty dang good!
 

rampage

Jan 28, 2005
You're darn tootin' it's good.

King of IQ and King of Performance. All your base are belong to us.

I ordered an eVGA 8800GTX today with two-day shipping. It might get here Friday, but I'm thinking probably Tuesday.

With the world's greatest card out, and another exceptional Nvidia hard launch... I had to throw down for this one.

First time I've spent over $450 for a GPU. First time it's ever really been worth it.
I didn't buy into the 7950GX2 craze, the QuadSLI um... 'craze', or even SLI anymore... but I can say I've owned SLI (and at SLI launch, to boot). No comment on Crossfire... I definitely wasn't dumb enough to buy into that knee-jerk multi-GPU implementation from ATI.
I really love how it still can't properly render 2560x1600 in Oblivion, even today. Either way, it's now slow and a thing of the past.
But my SLI days are probably over. One 8800GTX is enough for me till the G81... then the 8900GTX will be enough till the G90s.

I salute Nvidia and salute Jen-Hsun Huang for exceptional product execution. This is what I'm talking about when I talk about Nvidia. First class. In ya face like KaPow! :D

In the words of Mr. Duke Nukem himself... Hail to the king, baby. :twisted:

I'd like to add comments from a poster at OCP:
Way above expectations. Right up there with fond hopes, and a dash of spicy mustard on top! Not "just as good as ATi's HQ AF," but better! Not just HDR+AA, but new totally rockin' AA modes. A card with no asterisks, "Well it's good except for...."


And future potential yet unknown. No "better drivers may get it up to what we expected"--instead, "better drivers will make this sick pig even sicker!"

QFT!!!! :!:

Oh, and check this out... you'll laugh. :)
Link
 

rampage

Jan 28, 2005
Pretty much positive all around; the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.

Oh, and this is because ATI wasn't ever the definitive IQ king... as the G80 is today.
In the days of yore, NV typically had better AA and ATI had better AF.

Most people went out on a limb and said the visual difference between ATI AA and NV AA wasn't nearly as large as the difference between ATI AF and NV AF... thus, ATI was crowned "IQ king" (by some). But this was no clear-cut debate. Especially when many sites that dissected ATI and NV drivers and IQ recognized that Nvidia drivers actually did more work on a scene than ATI's.

This is all ancient history now, so hardly worth dredging up. But that was what came to mind when I read about your reservation.

I will admit, at first I had the same "reservation," but the answer is quite clear to me: there never was a definitive, all-out, hands-down IQ king before the G80.
So yes, people didn't say "oh, that's what we were missing!" before, because everyone knew what NV was missing... but it was hard to fault them to the point of not using Nvidia, because they had their IQ advantages as well.
 
Pretty much positive all around; the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.

Oh, and this is because ATI wasn't ever the definitive IQ king... as the G80 is today.
In the days of yore, NV typically had better AA and ATI had better AF.

Very selective memory you have there. When the GF7 came out, the X850 still had better standard AA, and when the X1800 came out, it continued the better AA with added levels. The only time nV even had an edge was with the adoption of Transparency AA, and then that got negated too. nV's SSAA was the only thing comparable, but at a heavy performance cost, so it was never worth enabling versus upping the resolution.

Most people went out on a limb and said the visual difference between ATI AA and NV AA wasn't nearly as large as the difference between ATI AF and NV AF... thus, ATI was crowned "IQ king" (by some). But this was no clear-cut debate.

Wasn't even a question of going out on a limb, since the difference was between a usable feature (HQ AF) and an unusable feature (SSAA).

Especially when many sites that dissected ATI and NV drivers and IQ recognized that Nvidia drivers actually did more work on a scene than ATI's.

Many sites? Like who, Anand? It doesn't matter how much 'more work it does' if the results don't match that work.

This is all ancient history now, so hardly worth dredging up. But that was what came to mind when I read about your reservation.

If that's what comes to mind, then you really need to reconsider what you remember from the era. The other thing to remember about GF7 AA was no FP16 HDR + AA in hardware, so really, what is it I'm missing about there being deserved reservation about what we don't know yet (like DX10 features, support, etc.)?

I will admit, at first I had the same "reservation," but the answer is quite clear to me: there never was a definitive, all-out, hands-down IQ king before the G80.

Yeah, but you're missing what I'm saying: prior to the release of the X1800, the GF7800 had the same angle-dependent AF as the X800, and similar AA with the option for extra. So at the time it too could be considered the hands-down king of IQ... until the X1800 showed that they still had work to do. The G80 beats the X1900, no argument there, and definitely the GF7 series, but whether it will keep the title, or whether there is still more to be known about feature interaction, is far from determined at this point. Like I said, my reservation involves other possible issues that rarely get tested by reviewers and usually wind up in either nVnews' forums (for nV issues) or Rage3D's (for ATi's issues). I suspect the next real test is when DX10 software and other compliant hardware make it to market to truly test the new architecture.

So yes, people didn't say "oh, that's what we were missing!" before, because everyone knew what NV was missing...

No, they didn't. Like I said, nV AF was fine at the time because no one knew that the AF on the R520 would be angle-independent, or that it would matter that much with the shimmering. Also, either no one knew or no one reported that there was the FP16 HDR + AA limitation.

but it was hard to fault them to the point of not using Nvidia, because they had their IQ advantages as well.

It wasn't about not using nV; it's about the current hyperbole about how wicked everything is (including yours about perfect AF [BTW, you didn't notice that the AF still shows signs of feathering?]), and not the potential limiting factors.
This isn't specific to any one card maker either; it's just about reviews in general. Most are more extensions of ads for the IHVs, not investigations into the actual hardware and its features, functions, and limitations. That's why I still prefer B3D reviews, because they're more about the features/functions than just FPS.

Anywho, the G80 is undeniably the current leader in IQ. I'd like to see more tests of certain features (especially 2D video playback), but the true test will be once the intended DX10 features and hardware start getting full once-overs to compare. That's why I'd hold off calling anything perfect.
 

rampage

Jan 28, 2005
Oh boy. The long, drawn-out battle posts with Ape. Sometimes I think you just use the tactic of wearing someone out! LOL



Pretty much positive all around; the only reservation that gets me is that everyone was all happy about the IQ of the G70 until the R520 came out, and then people said, OH! That's what's missing.

Oh, and this is because ATI wasn't ever the definitive IQ king... as the G80 is today.
In the days of yore, NV typically had better AA and ATI had better AF.

Very selective memory you have there. When the GF7 came out, the X850 still had better standard AA, and when the X1800 came out, it continued the better AA with added levels. The only time nV even had an edge was with the adoption of Transparency AA, and then that got negated too. nV's SSAA was the only thing comparable, but at a heavy performance cost, so it was never worth enabling versus upping the resolution.

I was referring to the GF7 vs. the X1900. I'm not going to go over ancient history with you; though you know I'm more than capable of doing so, what's the point?

All this conversation you posted above doesn't mean anything.
In the end, NV had better AA... ATI had better AF.

Most people went out on a limb and said the visual difference between ATI AA and NV AA wasn't nearly as large as the difference between ATI AF and NV AF... thus, ATI was crowned "IQ king" (by some). But this was no clear-cut debate.

Wasn't even a question of going out on a limb, since the difference was between a usable feature (HQ AF) and an unusable feature (SSAA).

I used 8xS. It was very playable in certain games and at certain resolutions... also with SLI, if one had it.
It's a limb. It broke and you fell (apparently on your head).

Especially when many sites that dissected ATI and NV drivers and IQ recognized that Nvidia drivers actually did more work on a scene than ATI's.

Many sites? Like who, Anand? It doesn't matter how much 'more work it does' if the results don't match that work.

The results did match... there was never a clear-cut winner in the GF7/X19 race for IQ. I'm really sorry. :roll:

This is all ancient history now, so hardly worth dredging up. But that was what came to mind when I read about your reservation.

If that's what comes to mind, then you really need to reconsider what you remember from the era. The other thing to remember about GF7 AA was no FP16 HDR + AA in hardware, so really, what is it I'm missing about there being deserved reservation about what we don't know yet (like DX10 features, support, etc.)?

I will admit, at first I had the same "reservation," but the answer is quite clear to me: there never was a definitive, all-out, hands-down IQ king before the G80.

Yeah, but you're missing what I'm saying: prior to the release of the X1800, the GF7800 had the same angle-dependent AF as the X800, and similar AA with the option for extra. So at the time it too could be considered the hands-down king of IQ... until the X1800 showed that they still had work to do. The G80 beats the X1900, no argument there, and definitely the GF7 series, but whether it will keep the title, or whether there is still more to be known about feature interaction, is far from determined at this point. Like I said, my reservation involves other possible issues that rarely get tested by reviewers and usually wind up in either nVnews' forums (for nV issues) or Rage3D's (for ATi's issues). I suspect the next real test is when DX10 software and other compliant hardware make it to market to truly test the new architecture.

To make a long story short: there really hasn't been a clear-cut IQ champ in the history of GPUs, at least not as clear-cut as this GeForce 8.
Possibly 3dfx vs. early ATI (which wasn't too hot), or R300 vs. FX. Even there, though, including the FX's AF... there wasn't a head-and-shoulders winner.

So yes, people didn't say "oh, that's what we were missing!" before, because everyone knew what NV was missing...

No, they didn't. Like I said, nV AF was fine at the time because no one knew that the AF on the R520 would be angle-independent, or that it would matter that much with the shimmering. Also, either no one knew or no one reported that there was the FP16 HDR + AA limitation.

NV AF was inferior. That's true, but what's the point? NV AA was still superior... and shimmering was -not- cured by either the X1900 or the GF7 series. If any of the X800/X1800/X1900 had accomplished that, I think people would have come to a consensus that ATI was the crowned champ of IQ.
But shimmering still existed on both to an extent. Yes, far less on ATI, but it still existed, making it less of a clear "victory" celebration for ATI.

but it was hard to fault them to the point of not using Nvidia, because they had their IQ advantages as well.

It wasn't about not using nV; it's about the current hyperbole about how wicked everything is (including yours about perfect AF [BTW, you didn't notice that the AF still shows signs of feathering?]), and not the potential limiting factors.
This isn't specific to any one card maker either; it's just about reviews in general. Most are more extensions of ads for the IHVs, not investigations into the actual hardware and its features, functions, and limitations. That's why I still prefer B3D reviews, because they're more about the features/functions than just FPS.

Perfect as in as good as it probably is going to get. I'd be shocked to see a card with better AF and better performance... It will happen, but this is so close to perfect that it's astounding.

Anywho, the G80 is undeniably the current leader in IQ. I'd like to see more tests of certain features (especially 2D video playback), but the true test will be once the intended DX10 features and hardware start getting full once-overs to compare. That's why I'd hold off calling anything perfect.

I think the D3D tester is proof enough of AF domination.
As far as testing everything out, such as video... I'll be able to do that myself soon enough.

You should get one ;) Test it yourself :D

The card is obviously spectacular, but I am interested in seeing what changes/improves with newer driver revisions. Since the driver is currently a separate download from the rest of the driver sets, it's clear this card's software is still in cutting-edge development at Nvidia. They will probably work on video later; right now they need to get as many games as possible working well and improve performance as much as they can.
 

greyhound81

Nov 9, 2006
I'm going to invest in one of these new cards with my new system, but it needs to be pretty much silent... can I get watercooling for the 8800GTS?
 

ekant

Aug 27, 2006
What about power requirements? Shouldn't the companies involved do something about reducing power requirements?

Reducing transistor size will not only reduce power requirements but also give them better yields, which may either mean better profits or lower prices for more sales (good for us). It's a win-win situation for all of us.
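To put rough numbers on that, here's a back-of-the-envelope sketch using a simple Poisson yield model (the defect density and die areas below are made up for illustration, not figures from Nvidia or ATI):

import math

def yield_fraction(die_area_cm2, defects_per_cm2=0.5):
    # Poisson model: probability that a given die has zero defects.
    return math.exp(-defects_per_cm2 * die_area_cm2)

for area_cm2 in (4.8, 3.0):   # a big high-end die vs. a hypothetical shrink
    print(f"{area_cm2:.1f} cm^2 die -> ~{yield_fraction(area_cm2):.0%} good dice")

A smaller die also means more candidates fit on each wafer, so a shrink helps twice: more dice printed and a larger fraction of them good.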

I am happy with the graphics solution I have; that way I'm not troubling the environment and can still play the oldies. They may not have realism, but what the heck, at least they have replay value (at least for me).