
The new GTXs: not a leap, more like a tip-toe in performance

Last response: in Graphics & Displays
June 16, 2008 5:51:54 PM

You guys checked out the reviews around the net? Epic fail, wasn't it? I'm thoroughly disappointed with these new cards. I just can't justify a single reason why someone would pick a 280 over a GX2. Haven't we bought enough Nvidia cards over the past two years to deserve better? NV is so screwed right now :( . . . a sucky high end means a sucky mainstream... no new upgrade for my 9600GT then!
June 16, 2008 9:27:49 PM

What makes me wonder is: if its performance is nothing big over a 9800GX2, and a 3870X2 is pretty close, does that mean a 4870/4870X2 will be the best card? And at this price it seems it won't even be worth an upgrade, even from an 8800-series GPU.

Of course, we will have to wait and see whether a 4870 can truly match a 3870X2. If so, a 4870 should offer a lot more performance for the price, especially since it will probably target the sub-$500 range.

I guess the only good thing is that if ATI's cards are better, Nvidia will be forced to lower its prices to something more reasonable to compete.

One question I just thought of: why does Nvidia not think its next cards should support DX10.1, or even SM4.1, when ATI has supported both since the 3870 series? Shouldn't Nvidia add support either way, just for the sake of competition?
June 16, 2008 10:34:15 PM

The lack of 10.1, some say, is because their architecture might not be able to support it without redesigning some things from the ground up.

Otherwise, there isn't anything CREDIBLE to support that theory. Until recently, I felt they were avoiding it because developers have said (at least according to Nvidia) that they don't really care to use it.

So who knows, but the performance numbers of this new card sux0rz.

Even if I didn't have a previous card and this were my first build ever, I'd still go with a 9800GX2, provided I could deal with the large amount of heat it puts into the system and didn't mind a dual-GPU card.

Otherwise, stick with the Ultra (or, in my case, the 9800 GTX, which is more than I need right now and is as quiet and cool as can be).

Maybe when they move these to 55 or 50 nm, they will overclock and run better (factory or user).

I won't touch these for the noise alone... Small die would be cooler, quieter, cheaper, and most likely faster.

They'd better get moving ;)
June 17, 2008 5:32:28 AM

Rumor has it that the 4850 will cost around $200; it even performs better than a 260 and can beat the 280 in some benchies... first time ever I'll have an ATI card in my rig, if it turns out to be true.

I sort of expected more, especially with Jen-Hsun Huang trash-talking Intel and all, and this is what they came up with. lol. Total garbage.
June 17, 2008 6:47:29 AM

wh3resmycar said:
Rumor has it that the 4850 will cost around $200; it even performs better than a 260 and can beat the 280 in some benchies... first time ever I'll have an ATI card in my rig, if it turns out to be true.

I sort of expected more, especially with Jen-Hsun Huang trash-talking Intel and all, and this is what they came up with. lol. Total garbage.


You're basically dreaming if you think the 4850 is going to beat even the GTX 260 in any benchmark.

The 4850 has already been tested; it's a good video card for its price point, but it's not high end.
June 17, 2008 7:52:43 AM

ovaltineplease said:
You're basically dreaming if you think the 4850 is going to beat even the GTX 260 in any benchmark.


What do you want to bet on that one?
Because I can think of three just off the top of my head (heck, the HD 3870 probably beats it in one of them).
And if the GTX 280 loses to the GF 8800 GTX from time to time, it's very likely we'll see some performance holes the HD 4850 could fill.

Quote:
The 4850 has already been tested; it's a good video card for its price point, but it's not high end.


Didn't know we had any official reviews yet from credible sources. :heink: 
June 17, 2008 8:52:47 AM

Jimmy, keep your eye on the red team
June 17, 2008 12:49:32 PM

I have an 8800GT, and the GTX 260 looks like a major upgrade to me. I don't see the point in comparing a dual-chip card to a single-GPU card. A 9800 GX2 should be faster than a single-chip card; it didn't release that long ago. I expect the 4870X2 to be faster as well, because it's a dual-chip card. The 4870 won't be faster than the GTX 280, though. I'm hoping the 4870 can compete with the GTX 260 so maybe Nvidia will lower the price. I'd pick one up day one if it were $300. The extra RAM and larger memory bus will help my games; I play at 1920x1200.
June 18, 2008 3:21:45 AM

You guys are complaining? You have it good in the States; $650 USD is nothing for a high-end card, so stop whinging. Here in Japan the GTX 280 will cost me $800 USD, and yes, I am going to get it for my new rig, not because I like Nvidia but because I have been waiting for the last month to put a GPU into my monster rig, and I can afford $800 for a nice single-card solution.

I don't know why everyone is crying foul here. I mean, it is a single card; SLI it, or try to, and it would wipe the floor with anything on the market. What's the big deal? You cannot expect every new card to have double the power of its last version.

I think what happened here is that people set their expectations too high, hyped up by rumors of uber leetness and such, then were shot to pieces when it performed as it should for a new single card. Reality is sometimes like a shot of coffee after a hangover: hard and fast, but at least you open your eyes.

I'll let you know how it goes, but I play on a 32-inch LCD at only 1366x978, so I will get all the bang I need from this.
June 18, 2008 4:00:46 AM

"You cannot expect every new card to have double the power of its last version." - Gargamel


Don't think anybody was expecting "double the power," but what we GOT from the 280 is often LESS performance than the GX2. I think expectations were high, but what we got was a step backwards, so there is plenty of room to complain. That said, we are blessed with cheaper prices here in the U.S., and that is something we take for granted.
June 18, 2008 4:21:27 AM

TheGreatGrapeApe said:
What do you want to bet on that one?
Because I can think of three just off the top of my head (heck, the HD 3870 probably beats it in one of them).
And if the GTX 280 loses to the GF 8800 GTX from time to time, it's very likely we'll see some performance holes the HD 4850 could fill.

Quote:
The 4850 has already been tested; it's a good video card for its price point, but it's not high end.


Didn't know we had any official reviews yet from credible sources. :heink: 



There is no official review until the NDA is lifted, however:

XtremeSystems tested the 4850, and its performance is great, but it's not beating the GTX 260. Will the launch drivers make a difference? We'll see.

There is a German site which tested the 4850 in Crossfire, and its performance is excellent, but it's still not killing Crysis at 1920x1200 with 4xAA: 9 fps minimum and just below 30 fps max. (4870 Crossfire might be Crysis-stomping territory; we'll have to wait and see.)

Tweaktown did a GTX 280 / 280 SLI / Tri-SLI test and noted at the end that 4850 Crossfire delivers really exceptional performance and might beat the GTX 280 in some benchmarks, but they didn't get into specifics.

I think reality is going to show us exactly what AMD themselves said: the 4850 will be a little better than the 8800 GT, which means two 4850s will beat a GTX 260 solidly and a GTX 280 by a small amount; the 4870 will beat a 9800 GTX solidly, which means two 4870s should beat a GTX 280 solidly as well.


Factually, grape, the 4850 on its own probably doesn't have the power to beat the GTX 260, but given that you can tentatively get two 4850s for the price of a single GTX 260, it should be the #1 bang-for-the-buck graphics combo.

Here is the problem in all of this, though: I did a lot of searching, and I mean a lot of searching, to find a quad-Crossfire Socket 775 board with room for four dual-slot PCI-E cards, like MSI's 790FX, but I couldn't find one, which saddened me a lot, because quad 4870s on an Intel setup would be dandy.

Now, what this means for me is this: as a current Nforce user, I can either switch to an AMD platform (which I don't intend to do), switch to SLI GTX 280s (which I might do, depending on the 4870 benches with antialiasing enabled), or wait for the 4870X2 to run quad Crossfire on a new Socket 775 motherboard.

Options 2 and 3 are both very expensive: option 2 is about $1252 from my preferred e-tailer for dual eVGA cards, and option 3 will likely be $1270-ish, but it won't be an option until some time from now, that is, the 4870X2 release.

I'm actually not in a big hurry to upgrade my graphics cards; I'm waiting until at LEAST the 4870 has been benchmarked in Crossfire with antialiasing enabled before passing judgment on what I intend to do.

I think the 4870X2 is going to be the top of the litter at the end of the year for sure, but there is a lot of time between now and then, and if the 4870s launch well it could drive Nvidia GPU prices down a bit.


Anyway, we'll see what happens =)
June 18, 2008 4:36:27 AM

I just think it is stupid to compare a single-GPU card to a previous dual-GPU solution and then call it a step backwards because the new single card doesn't outperform the current top-of-the-scale dual card.

The 9800GX2 wasn't released too long ago, and if a stock single GTX 280 is almost beating the dual card, then I think that is the card for me.

If you overclock the GTX 280, I think you will be surprised. Another site compared single cards and the 9800GX2, using both a stock GTX 280 and an overclocked version.

In short, OCing the 280 gained about a 15% performance bonus, which ended up beating the 9800GX2 in half the tests.

Overclockers.net or something, go check it out. I have to teach now, so I'll be back to enlighten you a bit more.
June 18, 2008 4:41:34 AM

Well, it's not really stupid, but there is a fact that has always held when buying high-end/mainstream/value:

value purchases will almost always give you the best bang for the buck;

mainstream is typically not the best value for money, but there are usually some savings;

high end is generally the best performance, and ridiculously expensive.


If I upgrade my GPU array of 8800 GT SLI, it's going to be to something very high end, not to the next mainstream array, because that makes no sense.

Anyway, we all know that top performance is not cost-effective, but when you are buying top performance you are paying for that specifically, not for super value savings.
June 18, 2008 2:27:08 PM

I agree that, cost-performance-wise, the new GTX 280 is outrageously overpriced and will be a hard hit for those buying it. I just got a call from the import store in Akihabara specifically importing the parts for my new rig, and the card, a Galaxy-brand one, will cost me $800 USD. That was the cheapest of the four brands I was given to choose from.

Seeing as I have already dropped over $3k into this rig, the card will have to wait till next month, I think. I am also considering the 4870 if it gives similar performance, but who knows.
June 18, 2008 9:44:23 PM

Yup, my only real concern is whether the 4870 has superior AA scaling performance by a noticeable margin; if it does, then I might switch my platform or wait.
June 18, 2008 10:47:47 PM

Annisman said:
Don't think anybody was expecting "double the power," but what we GOT from the 280 is often LESS performance than the GX2. I think expectations were high, but what we got was a step backwards, so there is plenty of room to complain. That said, we are blessed with cheaper prices here in the U.S., and that is something we take for granted.


A single-chip card that performs on par with a dual-chip card isn't what I'd call a step backwards. I think people forget what the GX2 really is: it's SLI without the extra PCB. If you want to compare evenly, then compare a GX2 to two GTX 280s. Two GPUs vs. two GPUs. Maybe it won't be so disappointing then. ;)
June 21, 2008 10:41:29 AM

You're basing your information on early beta drivers and games that don't support PhysX. I have just picked up an eVGA GTX 280 and ran benchmarks of my own with new drivers; let's just say they were A LOT higher than the ones posted on all the sites I have checked.

Just think: most games do not function well with SLI. This is a single card that blows away a GX2 with new drivers and PhysX installed.

I see lots of posts saying "who cares about PhysX, only one game supports it," blah blah blah. Sorry, but you're wrong: physics is in every game. Benchmarks measure how well your GPU/CPU handles physics calculations, and Nvidia was right to grab PhysX because it is the future of gaming; increasing the number of physics calculations per second means more fps in any game.

Now, once all the game designers use PhysX, ATI will be left in the dust.

How does this impact things like Folding@home? Well, my GTX 280 is getting 8000 irt/sec with PhysX installed; without it I was getting 3800.

Without physics in a game, all you have is a screenshot...
June 21, 2008 3:07:40 PM

techguy911 said:

I see lots of posts saying "who cares about PhysX, only one game supports it," blah blah blah. Sorry, but you're wrong: physics is in every game.


No, it's not.
Many games have physics, but it's not every game, and most of those that do use the Havok engine, not the Novodex/PhysX engine.

Quote:
Nvidia was right to grab PhysX because it is the future of gaming; increasing the number of physics calculations per second means more fps in any game.


As long as it's not a unified API, it means lower fps, not higher, because it will remain an add-on, not a substitute for CPU usage; they can't make a game require it any more than they can require a specific CPU or GPU.

Quote:
Now, once all the game designers use PhysX, ATI will be left in the dust.


If they decide to use PhysX, that is. Right now more use Havok, and with Intel and AMD behind that solution, there's little chance it's going away anytime soon.

Until it gives people a compelling reason to use it other than tumbling rocks and making barrels and boxes fly around, GPU physics will remain a sideshow.
June 22, 2008 1:18:01 PM

Quote:
No, it's not.
Many games have physics, but it's not every game, and most of those that do use the Havok engine, not the Novodex/PhysX engine.


Every game that's not tic-tac-toe has physics (the math of moving objects). Every FPS calculates where your bullet will land when you fire, artificial gravity when you jump, the arc of a thrown grenade and what it will bounce off when it hits an object, and so on.
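As a toy illustration of the kind of math being described, the grenade-arc case is just classic projectile kinematics. A minimal sketch in Python (the function name and the numbers are illustrative, not from any real engine):

```python
import math

def projectile_range(speed, angle_deg, gravity=9.81):
    """Horizontal distance of a projectile launched from ground level."""
    angle = math.radians(angle_deg)
    # Classic kinematics: range = v^2 * sin(2*theta) / g
    return speed ** 2 * math.sin(2 * angle) / gravity

# A grenade thrown at 20 m/s at a 45-degree angle lands about 40.8 m away.
print(round(projectile_range(20, 45), 1))
```

A game engine runs calculations like this (plus collision checks) for every moving object, every frame, which is where the physics workload comes from.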


Quote:
Nvidia was right to grab PhysX because it is the future of gaming; increasing the number of physics calculations per second means more fps in any game.

As long as it's not a unified API, it means lower fps, not higher, because it will remain an add-on, not a substitute for CPU usage; they can't make a game require it any more than they can require a specific CPU or GPU.


It has been shown in new benchmarks that anything involving huge quantities of math calculations is done faster on a GPU than on a CPU, and every computer already has a GPU.

I have run the Vantage test with and without PhysX; there is a noticeable improvement in score.
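For what it's worth, the "GPUs are faster at huge batches of math" argument comes down to data parallelism: per-particle physics updates are independent, so they can all be done at once. A rough sketch in Python/NumPy (illustrative only; a real engine would run the bulk update on the GPU):

```python
import numpy as np

# One simulation frame for a large batch of independent particles.
n = 100_000
rng = np.random.default_rng(42)
pos = rng.random(n)   # positions
vel = rng.random(n)   # velocities
dt = 0.016            # roughly one frame at 60 fps

# Naive per-particle loop: how a single CPU core walks the list.
looped = [pos[i] + vel[i] * dt for i in range(n)]

# The same update as one bulk array operation. Every particle is
# independent, so the work is trivially data-parallel, which is the
# pattern a GPU (or SIMD CPU code) exploits to do thousands at once.
vectorized = pos + vel * dt

assert np.allclose(looped, vectorized)
```

Both forms compute identical results; the difference is purely how much of the work can be done simultaneously.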



Quote:
Now, once all the game designers use PhysX, ATI will be left in the dust.

If they decide to use PhysX, that is. Right now more use Havok, and with Intel and AMD behind that solution, there's little chance it's going away anytime soon.



Who said Havok would go away? But this has been shown to do the job faster.


Quote:

Until it gives people a compelling reason to use it other than tumbling rocks and making barrels and boxes fly around, GPU physics will remain a sideshow.


Name one FPS that does not use physics [math] calculations. It's not a sideshow; while it might not be useful in a game like tic-tac-toe, any game with math calculations, explosions, moving objects, or PARTICLE EFFECTS will benefit from the faster calculations PhysX does.


June 22, 2008 2:52:50 PM

techguy911, the TRUTH that TGGA was trying to impart to you is simple. Most of the basic calculations are handled by the CPU, and the more complex ones are shared between the GPU and CPU. The thing is, physics is what really taxes CPUs in games; unless you have a P4 or an Athlon 2000+, you already have more physics processing than you need, without the help of PhysX. Another factor for these calculations is RAM: the faster the RAM, the more the CPU can do, and as long as you have some decent DDR2 you should be fine. This is why there is very little, if any, performance increase with PhysX if you are running a decent CPU and RAM; it is largely irrelevant. What games can you name that are bottlenecked by the CPU? Well, there are those that cannot go higher than 100 fps because of the CPU, but since you can only see up to 60 fps, that does not really matter.
June 22, 2008 5:00:30 PM

Quote:

This is why there is very little, if any, performance increase with PhysX if you are running a decent CPU and RAM; it is largely irrelevant. What games can you name that are bottlenecked by the CPU?



Wrong. I have already done tests proving otherwise; I have run benchmarks in 3DMark with and without PhysX, and the larger your resolution and the more eye candy turned up to max, the bigger the impact on image quality.

Also i have tested the following games:

Tom Clancy's Ghost Recon Advanced Warfighter
Tom Clancy's Rainbow Six Vegas
Unreal Tournament 3: Extreme Physics Mod
Mass Effect
Warmonger



Not only does it improve the realism of blowing up walls and such, but any game with particle effects maxed at max resolution will bring any rig to its knees.

And it results in a MUCH higher CPU score in 3DMark on average.

Both CPU and GPU scores, on a GTX 280.

Try running Crysis with everything maxed at the highest resolution the game supports.

Here is one test; check out the CPU test scores with an 8800GT:

http://forums.overclockers.com.au/showpost.php?p=9001204&postcount=99

The results using a GTX 280 OC:

http://forums.overclockers.com.au/showpost.php?p=9007601&postcount=171

Another post that explains exactly the point I'm trying to get across, written by another user who has also experimented with this:

http://forums.overclockers.com.au/showpost.php?p=9002605&postcount=125

What games with physx support can do:

http://www.hothardware.com/News/NVIDIA_Adds_the_9800_GTX_and_PhysX/

By a user with an 8800 GTS 512 using the PhysX drivers:

Quote:
Yes, my performance did increase dramatically in UT3 (unplayable at about 5-10 fps, compared to a reasonably constant 25-ish fps).


Offloading more of the CPU's physics calculations to the GPU is exactly what Nvidia had in mind; the GPU can do so much more than the original PhysX card.

Same goes for video conversion: the GPU can do it so much faster that it's no contest. The Nvidia-supported Badaboom should be out soon.

http://www.tomshardware.com/reviews/nvidia-gtx-280,1953-24.html

Oh, and the benchmarks that were done on the GTX 280 used beta drivers; I have done tests using the newly leaked drivers, and they are MUCH faster.

When I get a chance, I'll post benchmarks of my eVGA GTX 280 in the same games other sites tested, plus UT3 without and then with PhysX.
June 22, 2008 8:30:23 PM

techguy911 said:
Yeah, well, you get back to me when I see a Vantage score of more than 10,282 from ATI.


You get back to me when Bungholiomarks matter to anything other than within their own hardware.

Quote:
I have run the Vantage test with and without PhysX; there is a noticeable improvement in score.


Whopee, higher bungholiomarks scores, I'm sure that will make playing 3Dmark Vantage more enjoyable. [:thegreatgrapeape:5]

Quote:
Who said Havok would go away? But this has been shown to do the job faster.


No it hasn't; so far it's been shown to do things differently on different tasks. Havok has FX and the same option, but we've never seen like versus like. Even in the titles you mention, the core physics is done by another engine (in GRAW it's Havok's; in UT3 it's Epic's own engine), and the PhysX in those games is limited to demo-level content that simply adds unrealistic features, not physics throughout the game affecting the gameplay.

Quote:
Name one FPS that does not use physics [math] calculations. It's not a sideshow.


Name one compelling game of the level of UT3 or even GRAW that uses it as the core physics and not as a sideshow demo with an islland or single level 'chock full O' physics'.

Quote:
Wrong. I have already done tests proving otherwise; I have run benchmarks in 3DMark with and without PhysX, and the larger your resolution and the more eye candy turned up to max, the bigger the impact on image quality.


BS. Since PhysX isn't processing any of the image quality, that's an example of you thinking you're seeing something that isn't there. Or are you now saying that PhysX not only does physics but is also handling DX code and doing AA?