
Another disappointment out of new graphics hardware...

Last response: in Graphics & Displays
November 9, 2006 6:26:30 PM

Well, I THOUGHT I'd get the 8800 GTX, but looks like I'm wrong... 41 fps in Oblivion at 1024x768... that's exactly the same as the 7950 performed. 41 fps at a lower resolution is not acceptable for a piece of hardware that costs almost 700 dollars. These companies will get it sooner or later, I'm sure: make the video cards' power worthy of the high price tag. The other benchmarks mean nothing considering they are LAST-generation games; everything there I can already run with my 7800 GTX at over 60 fps on max/high settings. Hype for nothing. Also, seeing those split-screen in-game screenshots between the X1900 XTX and the 8800... 4x AA versus 16x anti-aliasing on the GeForce 8800... with the caption saying "WOW, LOOK AT THE AMAZING DIFFERENCE"... I saw no difference whatsoever with the naked eye.
November 9, 2006 6:50:50 PM

Quote:
Well, I THOUGHT I'd get the 8800 GTX, but looks like I'm wrong... 41 fps in Oblivion at 1024x768... that's exactly the same as the 7950 performed. 41 fps at a lower resolution is not acceptable for a piece of hardware that costs almost 700 dollars. These companies will get it sooner or later, I'm sure: make the video cards' power worthy of the high price tag. The other benchmarks mean nothing considering they are LAST-generation games; everything there I can already run with my 7800 GTX at over 60 fps on max/high settings. Hype for nothing. Also, seeing those split-screen in-game screenshots between the X1900 XTX and the 8800... 4x AA versus 16x anti-aliasing on the GeForce 8800... with the caption saying "WOW, LOOK AT THE AMAZING DIFFERENCE"... I saw no difference whatsoever with the naked eye.


??? That's like buying a really expensive sports car and saying, "I'm surprised I can't fit all of my groceries in it like I could in my van. Jeez, I just spent a hundred thousand on a car, and where do I put the baby seat?"

Who would buy this to play at such a low resolution? Obviously, to take advantage of the superior performance you'll need to use superior equipment (i.e. a superior monitor).
November 9, 2006 7:03:56 PM

^^Sad but true. I found out the same thing when I went SLI two years ago...
November 9, 2006 7:10:20 PM

All I can say is ... train that naked eye dude. Train it hard.

What did you expect? 400 fps in Oblivion at 1600x1200 from hardware that has roughly a 50% price premium over the current ATI flagship?
Why would it be reasonable to expect miracles?

We're dealing with hardware designed to take advantage of DX10, not DX9. Looking at the currently available benchmarks, I think it's fair to say that the 8800 gives a very solid performance/price ratio compared to any other high-end card currently on the market, including the X1950 XTX and the 7900 GTX. In fact, unless ATI is able to counter with a DX10 board of their own fairly quickly, I expect those 8800 prices to go up, or the ATI boards to be driven down in price (at least the very high-end models).

Performance is always relative, and in this case the results are relatively excellent. You have to remember that aside from the unified architecture, there were no significant hardware breakthroughs made with this board; its main feature is DX10 compatibility. Think of it as a beefed-up 7900 GTX that's been future-proofed and designed to take the single-GPU speed crown away from ATI, for the meanwhile. Nothing unfair about that :) 
November 9, 2006 7:12:48 PM

Quote:
Well, I THOUGHT I'd get the 8800 GTX, but looks like I'm wrong... 41 fps in Oblivion at 1024x768... that's exactly the same as the 7950 performed. 41 fps at a lower resolution is not acceptable for a piece of hardware that costs almost 700 dollars. These companies will get it sooner or later, I'm sure: make the video cards' power worthy of the high price tag. The other benchmarks mean nothing considering they are LAST-generation games; everything there I can already run with my 7800 GTX at over 60 fps on max/high settings. Hype for nothing. Also, seeing those split-screen in-game screenshots between the X1900 XTX and the 8800... 4x AA versus 16x anti-aliasing on the GeForce 8800... with the caption saying "WOW, LOOK AT THE AMAZING DIFFERENCE"... I saw no difference whatsoever with the naked eye.


??? That's like buying a really expensive sports car and saying, "I'm surprised I can't fit all of my groceries in it like I could in my van. Jeez, I just spent a hundred thousand on a car, and where do I put the baby seat?"

Who would buy this to play at such a low resolution? Obviously, to take advantage of the superior performance you'll need to use superior equipment (i.e. a superior monitor).

I see what you're saying, but that's not a good analogy. You get performance out of that car at least... this card, not so much... there's no point in playing at any resolution higher than 1600x1200... hell, for that matter, not even 1280x1024. I saw absolutely no difference, except for a decrease in frames... I want to play Oblivion perfectly smooth, as video games should be played, with all the eye candy at a good resolution of 1280x1024, and I still can't do that with a 700 dollar card.
November 9, 2006 7:13:36 PM

It'd be stupid to buy that card to play at lower resolutions like 1024x768. It really shines at higher resolutions. I do agree about the image quality, though. I couldn't see any difference between the two screenshots. It could just be one of those "you had to be there" type things.
November 9, 2006 7:17:07 PM

Quote:
All I can say is ... train that naked eye dude. Train it hard.

What did you expect? 400 fps in Oblivion at 1600x1200 from hardware that has roughly a 50% price premium over the current ATI flagship?
Why would it be reasonable to expect miracles?

We're dealing with hardware designed to take advantage of DX10, not DX9. Looking at the currently available benchmarks, I think it's fair to say that the 8800 gives a very solid performance/price ratio compared to any other high-end card currently on the market, including the X1950 XTX and the 7900 GTX. In fact, unless ATI is able to counter with a DX10 board of their own fairly quickly, I expect those 8800 prices to go up, or the ATI boards to be driven down in price (at least the very high-end models).

Performance is always relative, and in this case the results are relatively excellent. You have to remember that aside from the unified architecture, there were no significant hardware breakthroughs made with this board; its main feature is DX10 compatibility. Think of it as a beefed-up 7900 GTX that's been future-proofed and designed to take the single-GPU speed crown away from ATI, for the meanwhile. Nothing unfair about that :) 

No, I expected at least 50 fps and up on max settings at a resolution of 1280x1024... and that wasn't even accomplished, after Oblivion has been out for months. No, it doesn't give a good performance-to-price ratio... the card should be no more than 400 bucks on release, along with every other card. The card got good benchmarks... IN LAST-GEN GAMES. If my 7800 GTX can run the games it tests at over 60 fps, max settings and a resolution of 1280x1024... but this card struggles in Oblivion, it is most definitely not worth that huge chunk of change.
November 9, 2006 7:21:26 PM

Whether a video card is worth $640 or not is a question of personal taste. Personally, I would be unwilling to spend that much on a card even if it tripled my current framerates.


Also, would you be so kind as to link those benchmark results that show the 8800 doing worse than current-gen cards? Are we only talking about Oblivion here? Please link to some page that contains more than a single benchmark so we can see the big picture.

Thanks
November 9, 2006 7:23:40 PM

If you look at the reviews, you know it is actually worth it...

However, the price is very high! I would never pay it myself, but if it performs so much better than the X1950 XTX, then you can also charge such a premium... that's the way it goes.
November 9, 2006 7:27:15 PM

http://www.tomshardware.com/2006/11/08/geforce_8800/pag...

If this was the page that outraged you, you should perhaps consider that the problem here is a CPU bottleneck, since as you can see most cards perform the same at 1024x768 in the indoors benchmark. If one card has a 1 fps advantage over another, it's insignificant and probably linked to some in-game event that happened in one test run and didn't happen in another.
November 9, 2006 7:27:22 PM

Quote:
Whether a video card is worth $640 or not is a question of personal taste. Personally, I would be unwilling to spend that much on a card even if it tripled my current framerates.


Also, would you be so kind as to link those benchmark results that show the 8800 doing worse than current-gen cards? Are we only talking about Oblivion here? Please link to some page that contains more than a single benchmark so we can see the big picture.

Thanks

You don't understand; other games' benchmarks are irrelevant. Those graphics are already considered old in my book (Prey, Doom, Half-Life 2: Lost Coast, etc.). Not saying they don't still look pretty good, but video card benchmarks should not be made with those games... however, Oblivion is a game that still chews up everything, even with a Core 2 Extreme. It's on Tom's Hardware... at 1024x768 res the 8800 runs it at 41 fps... that's exactly the same as the 7950 in outside environments when that card was released. Now that may be "acceptable frame rates" to some people... but when paying a small fortune for a "next-gen" video card, I don't expect "acceptable" frame rates... I expect complete smoothness. And that expectation easily could have been met... but what, you think manufacturers are stupid? They will make these cards... everyone will buy them because they are the next best thing since sliced bread... then maybe, just maybe, the next batch of cards might be able to run Oblivion at over 50 fps... but those will most likely also be over $600.
November 9, 2006 7:28:06 PM

At low(er) resolutions, games are CPU-bound, and that's probably what is happening to you. Most benchmark sites these days use the best CPUs for good reason: to show the improvement of the GPU alone and not have the CPU bogging down the results.

If you lower the resolution enough, you can get by with even very basic cards and see the same or similar performances.

So, either, get a bigger monitor or get a better CPU if you want to see some tangible results.
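A back-of-the-envelope way to picture that bottleneck (a toy model with made-up numbers, not real benchmark data: each frame is assumed to take as long as the slower of the CPU and GPU stages):

```python
# Toy model: a frame takes as long as its slowest stage (CPU or GPU).
# All costs below are invented purely for illustration.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 24.0  # hypothetical fixed CPU cost per frame (game logic, draw calls)

for name, gpu_scale in [("old card", 1.0), ("new card, 2x faster", 0.5)]:
    for res, pixels in [("1024x768", 1024 * 768), ("1600x1200", 1600 * 1200)]:
        gpu_ms = gpu_scale * pixels / 40000.0  # GPU cost grows with pixel count
        print(f"{name} @ {res}: {fps(CPU_MS, gpu_ms):.1f} fps")
```

At the low resolution both cards land on the same ~42 fps because the CPU term dominates; only at the high resolution does the twice-as-fast GPU pull ahead.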
November 9, 2006 7:29:44 PM

Well, if you expect complete smoothness, try Bud Light.
Otherwise, be willing to accept the fact that "next gen" video cards are not all that different from "current gen" video cards with the exception of optimisations for DX10. Oblivion is coded for DX9.

Cheers
November 9, 2006 7:38:34 PM

Have you thought about something else possibly slowing it down? There may be something else in the system that's crippling the card.
November 9, 2006 7:39:42 PM

People with high-end monitors would appreciate the power the new cards bring. There are way higher resolutions than 1600x1200, like 2560x1600 on 30-inch LCD monitors. Obviously, if you can afford a monitor of that size, you should have the kind of money to buy this card. And you should research before you flame something: at lower resolutions, a game's graphics are more CPU-limited.
Anonymous
November 9, 2006 7:40:04 PM

Okay, first, you need to get a widescreen LCD with a high res, be it 19x12 or whatever. Then play the game at a higher res with more eye candy, and then you might actually see what it's worth!

Screenshots won't tell you a lot, IMO, but I definitely do see a difference when playing with AA and AF, and I do see the difference between the different cards. If you don't, then that's good for you: you don't need to upgrade! I find it funny that you 'want' to double the framerate; I think Oblivion is playable at 40 fps, and 80 fps will do NOTHING for you in this game. So who cares? I'd take better IQ any day over adding fps to anything around 40 (or 60, depending on the type of game).

The Nvidia folks just spent $475 million of R&D and 4 years of their lives developing that product, and by chance the competition is not there; you really can't blame them for asking this kind of money! Heck, it contains ~3x the transistors of any Core 2 Duo; that's expensive to manufacture, especially on a bigger process!

A good analogy would be: this car goes the same speed on that crappy sandy road, so it ain't worth more! Well, give it a nice highway strip and it will go much faster.

I don't see the point of your rant. If you don't see the need for this card, then just don't get it! And oh, if you don't see a difference between 4x AA and 16x AA, well, run the game at freaking 4x and get twice the frame rate!

You can also consider this an investment, because the 7950 won't be able to play upcoming titles at half the speed of this chip.
November 9, 2006 7:41:56 PM

I agree the analogy was poor... A better one: you buy a sports car and complain because you still have to follow the same speed limit that you did when you were driving your old car.
November 9, 2006 7:45:45 PM

Like I said, be patient. Don't buy into the hype. If we all wait, prices will come down fast to $300. They will get there.
November 9, 2006 7:46:02 PM

The only things I flame are poorly argued posts and my car dealership. Perhaps you should read my posts again :) 
November 9, 2006 7:50:33 PM

While I don't totally agree with the OP, I would like to throw in my 2 cents. As far as the card not being powerful enough, I do think it has plenty of power, but as has been stated, only at super high res. The one problem with that is the vast majority of people are not playing games at 1600x1200; most people use lower res, around 1280x1024, maybe a little higher. That being said, getting an 8800 GTX is a waste unless you are running really high res on a huge LCD panel. But then again, the 8800 GTX is targeted at people who will gladly pay well over $1000 for a PC monitor.
Anonymous
November 9, 2006 7:51:20 PM

Well, I need to buy a whole computer before December the 20th, because I sold my old one and I can't live at home without a desktop. What should I do??????????

I got my E6600 and my Tuniq tower 120 already :D 
November 9, 2006 8:00:23 PM

Well, I myself can't see any reason to have a video card that can exceed the response rate of the human eye... which is LESS than 24 fps (that's the speed at which the frames in color movies are displayed). Do you see any flicker or stuttering on the movie screen? Also, a standard broadcast TV signal (not HD) is 30 interlaced frames per second... don't see any stutter there either.

I can understand someone wanting to make the entire system as fast as possible to prevent the frame rates from dropping too low and causing stuttering or lag, but if we are pulling 60+ fps from a CPU/mobo/video card at max settings, why worry about getting another 10 or 15 fps out of the card if you're not gonna notice it anyway? Yeah, you may get the display updated faster, but if the eye can't detect the change, why do it?
November 9, 2006 8:11:49 PM

Well, the problem is slightly more complicated. I agree that in theory you shouldn't be able to tell the difference between 28 fps and 44 fps. However, I refer you to

http://en.wikipedia.org/wiki/Frame_rate

to see why it's not always so.

Also, consider the fact that when we're talking about 30 fps, we usually mean "average". More often than not, a game will have varying degrees of intensity when it comes to the load on the GPU. A weaker GPU that might be pulling a 30 fps average might also hit a very low "minimum" frame rate at times, which results in annoying gameplay.
So whereas I agree that having frame rates like 90 fps is nonsensical, I would prefer to always have a good margin of "safety" in my average frame rates, to ensure that I don't see choppiness every 4-5 seconds.
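To illustrate with a toy example (hypothetical frame times, just to show how a healthy average can hide an ugly minimum):

```python
# Two hypothetical 1-second traces of per-frame render times (ms).
# Both average around 30 fps, but the second one has two long stalls.
steady = [33] * 30                # every frame takes 33 ms
spiky = [20] * 28 + [220, 240]    # mostly fast frames, plus two big hitches

def avg_fps(frame_times_ms):
    # average fps = frames rendered / total seconds elapsed
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def min_fps(frame_times_ms):
    # instantaneous fps of the single worst frame
    return 1000.0 / max(frame_times_ms)

print(avg_fps(steady), min_fps(steady))  # ~30.3 average, ~30.3 minimum
print(avg_fps(spiky), min_fps(spiky))    # ~29.4 average, but worst frame ~4 fps
```

Both traces report a near-identical average, yet only the second one would feel choppy in play: that is why the minimum matters as much as the average.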

Cheers
November 9, 2006 8:12:39 PM

I believe Tom's tests with an FX-60... and not Core 2s; maybe that's why the frames aren't that high...
November 9, 2006 8:21:43 PM

Redwing

Yup, I agree with you there.. we have absolutely no need to have astronomical frame rates, and I do agree with you as to wanting/needing enough excess performance to cover for the 'rough spots' that the game will produce.

Most people cannot detect anything over 24 changes in light per second (let's get away from 'frame rate' for a sec) with the naked eye (or optically corrected, if you need glasses).

While most modern film projectors (theater) use more than 24 shutter exposures per second, the film itself is exposed during shooting at only 24 fps (barring special effects such as slow-mo or time-lapse).


But, in the end, we are on the same page, Redwing... just maybe different chapters! I agree we need peak frame rates over and above what the eye can see, so that when the system gets bogged down there is enough extra headroom to keep the apparent frame rate up to snuff.
November 9, 2006 8:38:39 PM

My 2p's worth: I see where you're coming from. I too was a little disappointed with the lack of lower-res oomph, for lack of a better phrase; as one who does everything at 1280x1024 or 1024x768, I was expecting a bit more than the numbers shown. Now, I know the card is aimed at the top-of-the-pile types who have 30" monitors and gold-plated mice (but 'normals' can have one too if they like :lol:  ). It's just that having seen the difference in F.E.A.R. E.P., i.e. @1024x768 with all settings maxed out, one 7900GT = 76 to 79 fps and two in SLI = 114 to 116 fps, the game is still mad but I was getting killed more smoothly :lol:  I would expect the same kind of increase if I were to replace my GTs with one of the 8800s.
November 9, 2006 8:39:44 PM

Films have motion blur, which can compensate for low fps. Games do not, and must render many discrete stills. For stills, it's around 60 fps beyond which the eye cannot detect a difference.
November 9, 2006 8:44:43 PM

Quote:
Films have motion blur, which can compensate for low fps. Games do not, and must render many discrete stills. For stills, it's around 60 fps beyond which the eye cannot detect a difference.


This is why the Doom 3 engine was bound to a 60 fps maximum (which can be disabled... but just to make my point).
November 9, 2006 9:01:43 PM

For the record, if you regularly use the ellipsis (three dots in a row), and especially if you use it wrongly (as you do), then people will think you are a moron.

So stop using it.

Connect your thoughts.

Stream-of-consciousness stuff might impress your 3rd grade English teacher, but not anyone else.
Anonymous
November 9, 2006 9:08:34 PM

lol... 8)

I use it too much myself :cry: 
November 9, 2006 9:10:10 PM

:lol:  Your avatar is very hypnotic after a while
November 9, 2006 9:57:36 PM

From some of the things I have read about Oblivion, the reason it is so demanding is not so much the detail and draw distances but inefficient coding of the game itself, which is why the Source engine from Half-Life can pull much better fps on "lower-end" systems.
November 9, 2006 10:06:26 PM

Quote:
No, I expected at least 50 fps and up on max settings at a resolution of 1280x1024... and that wasn't even accomplished, after Oblivion has been out for months. No, it doesn't give a good performance-to-price ratio... the card should be no more than 400 bucks on release, along with every other card. The card got good benchmarks... IN LAST-GEN GAMES. If my 7800 GTX can run the games it tests at over 60 fps, max settings and a resolution of 1280x1024... but this card struggles in Oblivion, it is most definitely not worth that huge chunk of change.


So how much did you pay for your 8800? If you're not happy with it, sell it to me! I'll give you $500 USD for it right now through PayPal.

Or are you just going by what some review sites and random benchies show of how the card performs? And if so, STFU!!!!!
November 9, 2006 10:07:34 PM

Oblivion is coded like crap for PC, tbh.

And unless I missed something, the thread starter has failed to mention the other hardware in his system. What's yer CPU?
November 9, 2006 10:22:04 PM

Oblivion is crap, period. The character design sucks; the only thing that looks good is the landscape.
November 9, 2006 10:40:43 PM

Quote:

I see what you're saying, but that's not a good analogy. You get performance out of that car at least... this card, not so much... there's no point in playing at any resolution higher than 1600x1200... hell, for that matter, not even 1280x1024. I saw absolutely no difference, except for a decrease in frames... I want to play Oblivion perfectly smooth, as video games should be played, with all the eye candy at a good resolution of 1280x1024, and I still can't do that with a 700 dollar card.

What I don't understand is why you feel that the latest graphics HW is required to run some particular game "perfectly smooth" (I assume you mean >= 60 fps) at some arbitrary resolution with full graphics options turned on.

Rather than faulting NVIDIA for getting their due for achieving the performance crown, you should appreciate Bethesda for creating a game with so much content and advanced graphics options that it has the ability to bring more than one generation of HW to its knees.

EDIT: Code quality considerations aside, of course (I have not seen their code, so I'm not going to make any claims in that regard). But many agree it is a beautiful game, at least, and not limited to indoor "moon base" environments.

Maybe they'll still be selling "Oblivion" Christmas 2007/2008 to purchasers of GeForce 9900's/101000's (or whatever the next greatest will be called) who want to see what their new HW can do in a beautiful game. Hopefully, they will inspire more game developers to go beyond the norm.

If there's fault anywhere, it is with those game developers who are only satisfied with targeting current HW, which results in the supremely un-useful 400 fps benchmark results shortly after the games are released.

In the meantime, you will have to turn down enough graphics options to achieve your "perfectly smooth" experience, if you prefer that over the visual fidelity you don't appear to notice--on whatever HW you happen to be running.
November 9, 2006 11:43:28 PM

Quote:
Oblivion is crap, period. The character design sucks; the only thing that looks good is the landscape.


I totally agree. Oblivion is a nice-looking "game", but once you've played it for more than 30 minutes you notice that it is just empty, flat and totally without any atmosphere (imho, of course).

The game does seem to be poorly optimized and perhaps quite CPU-dependent. Like so many others have pointed out in this thread, the OP forgot that at the resolutions he's talking about, it's all about the CPU. You could have a GeForce 12800 or an ATI 30000 and you'd still get the same frames in Oblivion at that resolution with that CPU. Yeah, I know, I'm beating the dead horse... ;) 
November 10, 2006 12:02:50 AM

This is where people buy into the hype.......

A Nvidia 7 series video card can generate 60~70 frames per second for Game A ~

The new Nvidia 8 series video card can generate 100+ frames per second for Game A as well ~

People see these numbers and think ~~~ WOW!!!

But the thing is, during a game anything above 40 frames per second will give you a smooth gaming experience!!!

It's likely that people will have to wait till DirectX 10 games to see the real benefits of Nvidia 8 series video cards!!!!
November 10, 2006 12:10:12 AM

Quote:
It's likely that people will have to wait till DirectX 10 games to see the real benefits of Nvidia 8 series video cards!!!!



:lol:  By which time I may have found a buyer for the internal organs that I would have to sell to be able to buy one! :lol: 
November 10, 2006 12:11:10 AM

Then you'd better get your eyes checked... I'm serious.
You're a danger on the highway if you see no difference between those two pictures.
November 10, 2006 12:21:26 AM

Quote:
This is where people buy into the hype.......

A Nvidia 7 series video card can generate 60~70 frames per second for Game A ~

The new Nvidia 8 series video card can generate 100+ frames per second for Game A as well ~

People see these numbers and think ~~~ WOW!!!

But the thing is, during a game anything above 40 frames per second will give you a smooth gaming experience!!!

It's likely that people will have to wait till DirectX 10 games to see the real benefits of Nvidia 8 series video cards!!!!


I agree, those cards are overkill for current games. If you are planning on building a new computer, however, or just upgrading, you have to buy something that is overkill so that it lasts for at least a year or maybe more.
November 10, 2006 12:40:42 AM

You know, I am just waiting for everyone to bitch that the newest 8800 GTX, or the 1000000 or whatever ATI is going to call their new card, can't run Crysis at 1600x1200 and up with max settings. Everyone is still bitching about Oblivion (which, no, I have not played; I won't ever attempt it with my 1 GHz Athlon and 512 MB of RAM) not being able to run maxed out at whatever res they choose. Now, I started the complaining about the coding of the game itself; I do not, nor have I ever, coded anything, but all the reviews I have read that deal with coding say that Oblivion was not a well-coded game. Just deal with what you can get on what you can get it. If you don't run anything over 1280x-whatever, don't get a 600 dollar card.
November 10, 2006 12:52:17 AM

Are some of you people stupid? No optimization, running at DX9, and you want it to get 100 fps in everything? Remember back when the 7800 GTX was beating X1900 XTX cards? Then optimization came along, and ATI ruled the last DX9 generation for single cards.

I don't think most people here want fps for the sake of fps, but if a game normally runs at 200 fps, then when a lot of stuff happens on screen the chance of falling below 30 is low. That's my main reason for wanting high fps, anyway.

And about that graphics comparison of the DX10 card: the 8800 has a sharper image.
November 10, 2006 1:14:27 AM

Quote:
Quote:

:lol:  By which time I may have found a buyer for the internal organs that I would have to sell to be able to buy one! :lol: 


Hmm, PM me about those organs, Mousemonkey :p 
November 10, 2006 1:39:00 AM

WOW... I can't believe people are ragging on the 8800 already. This card is designed to be a DX10 card... from a future perspective DX9 is probably not worth spending a lot more time optimizing. The architectural changes to this card are designed to push the whole graphics world forward.

Personally I don't need to see an extra few frames on the older games, the ones I play already are okay. But in the future I would like to be able to experience the improvements that are fundamental to DX10. I want to see efficient physics that does not hinder frame rates, I want to see more realistic characters and settings.

Am I an early adopter... errmmm, no. Will I get a DX10 card in the next few months... quite possibly. And it's because I want to see where the future of gaming is going.
November 10, 2006 1:47:24 AM

Quote:
Are some of you people stupid? No optimization, running at DX9, and you want it to get 100 fps in everything? Remember back when the 7800 GTX was beating X1900 XTX cards? Then optimization came along, and ATI ruled the last DX9 generation for single cards.

I don't think most people here want fps for the sake of fps, but if a game normally runs at 200 fps, then when a lot of stuff happens on screen the chance of falling below 30 is low. That's my main reason for wanting high fps, anyway.

And about that graphics comparison of the DX10 card: the 8800 has a sharper image.


Remember, they are using BETA drivers now, and the optimized ones will be out in a short period, which should help at the lower resolutions. It seems the pure horsepower, or grunt, of the 8800 is helping the big screens look better, whereas current cards were already maxed out on smaller screens and could not push the big screens any faster.

Everybody should see improvement (big and small) in the next set of drivers.
November 10, 2006 1:52:25 AM

:lol:  Hmm, a response already, maybe I should think of an auction, e-bay perhaps?
November 10, 2006 1:56:18 AM

Err, would you accept a loan or something?
November 10, 2006 2:02:36 AM

A perfect example is consoles: are the best graphics out for launch games? Hell no, the last few games are the ones that look the best!