
GTX 280 will be faster than HD 4870/4850

June 17, 2008 2:55:30 AM

However, not by much, if this is correct: http://www.eetimes.com/news/latest/showArticle.jhtml;jsessionid=TIWT24HVFWPGMQSNDLSCKHA?articleID=208404063

Quote:
AMD says its 4850 device at about 110 W and $199 will deliver about 75 percent of the performance of Nvidia's high-end GTX280 which costs $649 and dissipates 236W. Two of the AMD parts on a board will hit graphics benchmarks about 30 percent higher than the Nvidia device, the AMD spokesman added.

A 4870 version of the product sporting slightly higher performance will cost $299. Both AMD chips are made in a 55nm process, compared to 65nm for the Nvidia chips, and measure about 16x16mm compared to about 24x24mm for the Nvidia part. The approach gives Nvidia bragging rights for the single most powerful graphics processor, an edge that plays well to the gaming and technical communities who use the parts.


That said, if this is true, the 4870 X2 should be interesting. I can't wait for the reviews of the 4800 series.
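To put the article's die sizes in cost terms, here is a quick back-of-the-envelope sketch in Python (a naive estimate using only the quoted dimensions; it ignores edge loss, scribe lines, and yield, all of which hit the bigger die harder):

import math

# Naive dies-per-wafer estimate from the die sizes quoted above.
WAFER_DIAMETER_MM = 300
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2   # ~70,686 mm^2

for name, edge_mm in [("GT200 (~24x24 mm)", 24), ("RV770 (~16x16 mm)", 16)]:
    dies = int(wafer_area // (edge_mm ** 2))
    print(f"{name}: ~{dies} candidate dies per 300 mm wafer")
# -> ~122 vs ~276: the smaller die gets roughly 2.25x as many chips
#    out of every wafer, before yield differences widen the gap.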
June 17, 2008 3:02:52 AM

I find it odd that Nvidia has been making these "oohh 30% faster, 3 times more, better, faster" claims, but in the legit benchmarks/reviews that came out today from Tom's and AnandTech it's the TOTAL opposite in everything.

And I'm not buying this "card for the future" crap either. Technology, especially in computers, advances so fast that building a high-end enthusiast video card "for the future" is utter bullcrap and a total waste of time, effort, and money. Nvidia themselves will release something better a year or so from now, not to mention the 4870X2 and Larrabee, if you believe in that.
June 17, 2008 3:06:42 AM

Spathotan, true, the reviews today kind of shocked me. I was sure Nvidia was sitting on something huge, due to the "crossfire" they have been in with Intel. They made a monster of a GPU, but it kind of fell short of the mark. And the "Card for the Future" is just marketing crap; the only card for the future is the next one they want to sell you.
June 17, 2008 3:16:45 AM

I bet some driver updates will help out, but yeah, as usual it didn't live up to the hype. I bet ATI is holding the 4870 back for some last-minute tweaks now that the 280/260 is out and about.

June 17, 2008 3:18:32 AM

Do you guys remember what ATi did with the 2900XT? Well, change ATi to Nvidia and the 2900XT to the 280GTX. They promised so much but delivered a major disappointment.

I wonder if Hector Ruiz is working with Nvidia these days?
June 17, 2008 3:25:05 AM

This is one time when ATI being a week late might pay off big. They now know how badly Nvidia flubbed things, so when they bring their new card out, they can talk about how nobody does it better than they do. Even if the 4870 doesn't perform better, they still have the price edge. Buy two 4870s for the same as one 280 GTX, and get some change back. Sounds good to me.
June 17, 2008 3:31:30 AM

gadgetnerd said:
http://www.legitreviews.com/article/726/15/

I don't think so


I don't think so either. A vendor is not going to make a 2,000-point difference in 3DMark. The EVGA card scoring about 2,000 higher than the PNY one is BS; it would take an overclock of something like 400MHz to do that. Not to mention it's like a $1,000 card, and the waterblock card shouldn't even be tested against standard units.
June 17, 2008 6:32:21 AM

I'm just worried that the 4870/4850 are a little overhyped.

I really don't know, but I don't see how ATI can produce a $300 card that can almost compete with a $649 card. It makes me think they'd raise the price, by at least $50 or so, unless they're really trying to get as many customers as possible.

I'm just afraid that we may be a little disappointed when the 4800s release as well.
June 17, 2008 6:43:53 AM

Seeing as the 9800GX2 can sometimes perform faster than the GTX 280 at lower resolutions and AF/AA settings, I think a multi-GPU card like the 4870 X2 can beat the GTX 280 easily, if it's got the processing power and VRAM. "Two brains are better than one," unless they conflict with each other, kind of like how tri-SLI creates too much overhead for the CPU and performs worse than regular SLI.
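For what it's worth, that "two brains" point can be written as a toy scaling model (the efficiency numbers below are purely illustrative; real SLI/CrossFire scaling varies a lot per game):

# Toy multi-GPU scaling model: the first GPU contributes fully,
# each extra GPU adds only a fraction of its frames.
def effective_fps(single_gpu_fps, num_gpus, efficiency):
    return single_gpu_fps * (1 + (num_gpus - 1) * efficiency)

print(effective_fps(60, 2, 0.80))  # 2-way at 80% scaling -> 108.0 fps
print(effective_fps(60, 3, 0.50))  # 3-way with heavy overhead -> 120.0 fps
# The third GPU here adds less than the second did: that is the
# "too much overhead" effect described above for tri-SLI.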
June 17, 2008 6:45:54 AM

The 280GTX should have been 2x the performance of G92, but it fell short with only 80 TMUs vs 64 TMUs on G92. All AMD has to do is release a card that is a little faster than G92 with a much lower price tag, and they win. AMD fixed their AA, so Nvidia won't have the edge anymore when it comes to AA.
June 17, 2008 7:15:19 AM

marvelous211 said:
AMD fixed their AA, so Nvidia won't have the edge anymore when it comes to AA.


Did I miss something? When was this?
June 17, 2008 8:11:55 AM

I doubt that Nvidia will have an answer for the 4870X2 if and when it arrives.
June 17, 2008 8:50:54 AM

yadge said:
I'm just worried that the 4870/4850 are a little overhyped.

I really don't know, but I don't see how ATI can produce a $300 card that can almost compete with a $649 card. It makes me think they'd raise the price, by at least $50 or so, unless they're really trying to get as many customers as possible.

I'm just afraid that we may be a little disappointed when the 4800s release as well.


A big difference is that ATI has stated it didn't expect its new card to be faster than the 280 GTX. If anything, it's the enthusiasts who have overhyped it. ATI has merely taken the stand that it will be better than the 3870. As for the idea of a $300 card competing against a $650 card, there should be no competition. The problem is that Nvidia's new card is too expensive to produce to sell at a mid-range price, and it ended up not performing as expected. That's where the disappointment is. Nvidia talked up the 280 as the greatest thing ever, but the reality leaves you asking "why bother?" So no, a $300 card shouldn't be competing against a $650 card; it's more that Nvidia's $650 card should cost about $300.
June 17, 2008 8:59:26 AM

If what people are saying and showing in that other forum is true, then ATi may be holding the next 8800GTS. And call me stupid (I did read the entire thread), but what was the problem with the AA in ATi's cards before now?
June 17, 2008 9:41:48 AM

Wouldn't mind 2x 4870x2's
June 17, 2008 9:45:35 AM

sailer said:
So no, a $300 card shouldn't be competing against a $650 card; it's more that Nvidia's $650 card should cost about $300.


I just meant that I'm worried ATI might take advantage of Nvidia's position and raise the price of the 4870. If the performance claims so far are true, they could still get away with selling it at $50 or so more.

I guess I'm just wary because of the overblown expectations for the 2900s, and now the GTX 200s. I really hope the 4800s do awesomely. I prefer to be prepared to be disappointed, I guess.
June 17, 2008 11:20:17 AM

yadge said:
I'm just worried that the 4870/4850 are a little overhyped.

I really don't know, but I don't see how ATI can produce a $300 card that can almost compete with a $649 card. It makes me think they'd raise the price, by at least $50 or so, unless they're really trying to get as many customers as possible.

I'm just afraid that we may be a little disappointed when the 4800s release as well.


I have never agreed more. I would love a surprise from them, though I think more people are excited about the price than anything else. If the GTX 280 were $500 it would be a decent buy, but it's not, so that card is reserved for an even smaller niche than the PC enthusiast market already is.

Best,

3Ball
June 17, 2008 12:07:45 PM

Quote:
I have never agreed more. I would love a surprise from them, though I think more people are excited about the price than anything else. If the GTX 280 were $500 it would be a decent buy, but it's not, so that card is reserved for an even smaller niche than the PC enthusiast market already is.

Best,

3Ball


Honestly, I would love to have the fastest green "ship" in the universe. That said, I would settle for two of the other red "ships", the ones that never lose sight of that green "ship", for about half the price ($199 + $199 = 130% of the performance, if the original article is correct).

Metaphors aside, I am not expecting anything. I just think it's weird that AMD/ATI would move their date back, seeing as they have already stocked 4850s and shipped them to OEMs. However, it would make sense if they did it so that they could release both the 4850 and 4870 on the same day.

Furthermore, if the 4870 comes in close to the 280, I would not expect to see the 4870X2 until the 55nm revamp of the 280. However, as you said, 3Ball, I would love for AMD/ATI to surprise me. I am just going to wait till 06/25/08, like you, to make up my mind. I will decide then whether this 8800GT is going to last another generation.
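Taking the original article's claims at face value, the red-ships-vs-green-ship math looks like this (a sketch using only the quoted figures: 75% of GTX 280 performance per 4850, 130% for two in CrossFire):

# Red ships vs. green ship, using the article's claimed numbers.
gtx280_price, gtx280_perf = 649, 1.00
hd4850_price = 199

two_4850_price = 2 * hd4850_price   # $398
two_4850_perf = 1.30                # claimed 130% of a GTX 280

print(f"Two 4850s: ${two_4850_price}, {two_4850_perf:.0%} of a GTX 280")
print(f"Savings vs. one GTX 280: ${gtx280_price - two_4850_price}")
ratio = (two_4850_perf / two_4850_price) / (gtx280_perf / gtx280_price)
print(f"Performance per dollar advantage: {ratio:.1f}x")  # ~2.1x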
June 17, 2008 12:10:49 PM

ATI performs better in what we call the "overall score".

Let me explain:

It's not only the fracking 3DMark points and a few FPS numbers in games that decide which card is worth it and which belongs in the garbage.
NO.

Come to your senses and think logically, my friends.
Say we have, for example (numbers are not actual results),

3DMark06 plus a few FPS and RTS game results, in FPS,
at 1280x1024 (the typical resolution everyone uses; we are not video producers working at 4000x9000).
So:

GeForce 8800GT, stock clocks: let's say 88 FPS
and
Radeon 3870, stock clocks: let's say 80 FPS


Do you really believe that if you play the same game on two such computers,
with the first monitor showing 88 fps and the other PC showing 80 or even 70 fps, your mind, eye, senses will notice the difference???
NO WAY.

Now, the PCs are exactly the same, even the cases and fans, OK?

Now think (or remember, because I am sure many of you have real-life experience): while we run those tests, we run a temperature and noise test at the same time. It's night and you want to play a game, or even worse, it's summer...

The PC with the 8800GT gives you at least +20C more in temps, so both the CPU and VGA fans have to spin faster. Faster = more noise (dB).

Nvidia PC, total temp out of the back of the case, e.g.: 60C / 50dB noise --- $250
ATI PC, total temp out of the back of the case, e.g.: 45C / 35dB noise --- $150
Not to mention the power consumption...
and the huge price difference sometimes!

Now ask the same question as before:

Would your senses notice those differences in temps and noise, and would your pocket notice spending $650 instead of $300????
Hell YES, for sure!

Conclusion: ATI products are "better" in real-life applications and more everyday-friendly than Nvidia's.

As they said on Top Gear, having a super Lambo so close to the ground, so stiff in the suspension and so loud in the engine will make you feel horny on a track. Yeah... if you live at a racetrack!

Now take a Porsche: very fast, higher off the ground, quieter, and usable every day of your life!

Who cares if the super Lambo gives me a top speed of 380 kph and the Porsche 360 kph, if I can't use it every day???

Thanks for your time.

Johnnyxp64
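Johnny's "overall score" idea can be written down as a quick script. The weights are arbitrary and the temp/noise/price figures are his hypothetical examples above, not measurements:

# Johnny's "overall score": fold fps, temps, noise and price into
# one number. Weights are arbitrary, purely for illustration.
def overall_score(fps, temp_c, noise_db, price):
    # Cap fps at 60: per the argument above, frames beyond that
    # are not perceptible and add nothing.
    fps_term = 100 * min(fps, 60) / 60
    # Lower temperature, noise and price are all better.
    return fps_term - 0.5 * temp_c - 0.5 * noise_db - 0.05 * price

nvidia_pc = overall_score(fps=88, temp_c=60, noise_db=50, price=250)
ati_pc = overall_score(fps=80, temp_c=45, noise_db=35, price=150)
print(f"Nvidia PC: {nvidia_pc:.1f}, ATI PC: {ati_pc:.1f}")  # 32.5 vs 52.5
# Both systems hit the 60 fps cap, so the cooler, quieter,
# cheaper one wins on everything that's left.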
June 17, 2008 1:07:36 PM

johnnyxp64 said:
ATI performs better in what we call the "overall score".

Johnnyxp64


Johnnyxp64, yes, you are correct on most of your points; however, the difference in FPS does matter if you are talking about 24 fps versus 34 fps. That said, Nvidia does sometimes degrade images to increase fps, and I don't agree with that. As far as ATI goes, they are no better; if you remember correctly, they used to cheat on their AA and aniso quality. Both companies have, in the past, done whatever was needed to boost their numbers.

That said, I have been impressed with ATI/AMD and their 3870/3850 as far as the quality of the cards and the features present. The performance was good, and if the 8800GT and 9600GT hadn't dropped in price so fast, it would have been a killer price/performance card. So if AMD/ATI raises the bar in performance (within 15% of the high end) and keeps pace with their last generation's quality (i.e., price, features, lack of perceived cheats in FPS or quality), then they should be considered "overall" better.
June 17, 2008 1:17:06 PM

johnnyxp64 said:

As they said on Top Gear, having a super Lambo so close to the ground, so stiff in the suspension and so loud in the engine will make you feel horny on a track. Yeah... if you live at a racetrack!

Now take a Porsche: very fast, higher off the ground, quieter, and usable every day of your life!

Who cares if the super Lambo gives me a top speed of 380 kph and the Porsche 360 kph, if I can't use it every day???


Thank god!!! I thought I was the only person who watches that show!

I do agree that, for the whole picture, ATi has a nice solution that gives you great graphics in general use or in normal games. With high-end games... not so much. But it makes up for it on your energy bill, heat, noise, and wallet. Although I am not expecting the 4870 to be right up there with the 280GTX, I am expecting it to be close. I am thinking about going with 2x 4870s. Would that equal about the same gains as a 4870X2?


And on that bombshell, we'll see you next time!
June 17, 2008 1:28:50 PM

Phrexianghoul said:
If what people are saying and showing in that other forum is true, then ATi may be holding the next 8800GTS. And call me stupid (I did read the entire thread), but what was the problem with the AA in ATi's cards before now?


Quote:
4800 series to not use shaders for FSAA
--
We've learned that RV770XT and Pro won't do Full Scene Antialiasing via Shaders. Developers hated it, but RV670's design flaw was responsible for it.

RV770XT and PRO will end up significantly faster, especially with FSAA compared to RV670 generation. The numbers should be higher than you usually see from generation to generation, but we don’t know more details about it at press time.

RV670 was an improved and fixed R600, or should we say R600 done right, whereas RV770 is an improvement on what ATI had already implemented.


The FSAA problem goes back to DirectX 10 and the features Microsoft cut so that Nvidia's 8800 series would be considered DX10 compatible. ATI did the right thing and designed their chips to do FSAA in the shaders, as Microsoft had instructed in the original plans for DX10 (aka DX10.1). However, they got caught in no man's land when the rules changed; as a result they had poor AA performance, because their shaders had to do the resolve work that Nvidia's cards handle in dedicated hardware.
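For anyone wondering what "FSAA via shaders" means in practice: the resolve step just averages each pixel's sub-samples, and R600-class chips ran that averaging on the shader core instead of in dedicated ROP hardware. A minimal sketch of a box-filter resolve (plain Python standing in for shader code):

# Minimal MSAA "resolve" sketch: each output pixel is the average
# of its sub-samples. On R600/RV670 this averaging ran on the
# shader units; other GPUs do it in fixed-function ROP hardware.
def resolve_pixel(subsamples):
    # subsamples: list of (r, g, b) tuples for one pixel
    n = len(subsamples)
    return tuple(sum(channel) / n for channel in zip(*subsamples))

# 4x MSAA at a polygon edge: two samples cover the red triangle,
# two show the black background, so the edge comes out blended.
samples = [(255, 0, 0), (255, 0, 0), (0, 0, 0), (0, 0, 0)]
print(resolve_pixel(samples))  # -> (127.5, 0.0, 0.0)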
June 17, 2008 1:31:01 PM

@CSA_Myth: I agree, but you misunderstood something.

I never talked in my example about a noticeable difference between 24 and 34 fps; everything UNDER 55-60 fps IS noticeable by the human senses!

I was saying that we take for granted that, at the selected resolution, both cards deliver much more than 60 fps, so there is no difference!

And we are talking about the majority of gamers. Who uses resolutions above 1600, if not for professional work? I guess almost no one.
Those are the resolutions that make a card drop to such low fps, 20-30.
If you are a professional, then you don't care about energy, noise, or money, and you buy the faster card even if it's only 1% faster than the second one.


As for the "cheating" issues: both companies... well, Nvidia had the lead there; they were cheating extremely. But that's my point exactly...

I don't put too much emphasis on software benchmark scores, because they are gamed by both companies and they do NOT represent real-life needs!

I am very, very sure of this because I am not talking in theory. Over the last year I have changed and tested in real life, each for more than a month, the following products:

Asus 8800GT 512
Gigabyte 8800GT OC 512
ATI 3870 512
Sapphire 3870 OC 512

and always with the latest drivers, DirectX, games, patches, OS hotfixes, multicore fixes, bla bla bla.

And I may see a 3-5% difference (Nvidia > ATI) in benchmarks, BUT in actual gameplay (1440x900x32) there was no difference at all, and trust me, I have been in IT for more than 12 years.
Fraps may display 90 fps on Nvidia and 80 fps on ATI,
but when you close it, forget the psychological impact those stupid numbers have on you and just play, then you don't notice any difference! Only that you paid almost 50% less to get that "red" ship, as they said above, the one that never loses the "green" ship in front! ;-)

It's all about psychology and stupid marketing.

These days even an ATI 1950XT at 1440x900 can handle 99% of the games out there on High!

There is no point in upgrading all the time to feel 1% faster, if at all, when you have spent $600! That's just stupid! Give the money to a poor guy, or buy another PC for downloading and office use!
June 17, 2008 1:41:46 PM

spaztic7 said:
Thank god!!! I thought I was the only person who watches that show!

I do agree that, for the whole picture, ATi has a nice solution that gives you great graphics in general use or in normal games. With high-end games... not so much. But it makes up for it on your energy bill, heat, noise, and wallet. Although I am not expecting the 4870 to be right up there with the 280GTX, I am expecting it to be close. I am thinking about going with 2x 4870s. Would that equal about the same gains as a 4870X2?


And on that bombshell, we'll see you next time!


Well, you are NOT alone! :hello: 

Who said ATI loses in high-end games? Eh?
You pay what, $625, to get the 280GTX-whatever, which gives you heat like the sun and noise like a jet, to get what, 120 fps at 1600x1280x32?
When, if you are going to spend that kind of money, you could pay $600, get two super ATI 4870 babes that will give you 200 fps at that resolution, and have some money left over for a beer :p 

ATI is in a difficult spot on the stock market after the damage done by Intel, but the last month has been going really well and they are starting to fight back! They need to sell, but they are doing it in a smart way, not in a hurry!

1) Low-priced products that give almost everything the most expensive one can.
2) An easier way to build a multi-GPU system, cheaper and much faster in the end than its rival's. SLI costs far too much (I ran 2x 7800GT once, two years ago), while the new CrossFire kicks ass, can make your multi-GPU dream come true, and performs better than SLI in demanding games :) 

ATI has the right solutions at the right price in the right place. Not so sure about the right time, though... (Sometimes they are late, but you should NOT be in a hurry to buy anything new that one company sells until you see, in the following months, what the other company has to offer!)


Huge P.S.
I disagree with the move by AMD/ATI to work with Havok for their physics engine (the 38xx cards are compatible, as are the 88xx), because Havok is owned by Nvidia. They will NEVER give their best high-end technology to ATI cards the way they will to Nvidia's, I am afraid. BUT they are going to kick Intel's physics with their AMD Phenom CPUs... :D 
June 17, 2008 1:57:05 PM

Quick question for you, Johnny, because I think you're speaking for the real majority here with your logic... if you're playing games on a 19-inch monitor with a 1280x1024 max resolution, what graphics card will do the job with today's modern games? Is it necessary to go SLI? Would a good middle-of-the-road card be enough to give you good gameplay at that resolution?

Seems like people with the biggest screens and 10-card SLI setups are just chasing an extra 1 fps for a lot more money...

June 17, 2008 1:59:59 PM

I think ATI could sell their 4870 for a little more, but why would they want to? This way they gain market share and hurt Nvidia. I believe I heard the Nvidia chip is the biggest TSMC has ever made, and thus very expensive. ATI can keep prices low and still make a profit AND really give Nvidia a hurting. It's win-win for them. From what I've seen, the ATI card should be a decent performer; no miracles, but good enough (for me at least).
June 17, 2008 2:15:48 PM

Foghorn said:
Quick question for you, Johnny, because I think you're speaking for the real majority here with your logic... if you're playing games on a 19-inch monitor with a 1280x1024 max resolution, what graphics card will do the job with today's modern games? Is it necessary to go SLI? Would a good middle-of-the-road card be enough to give you good gameplay at that resolution?

Seems like people with the biggest screens and 10-card SLI setups are just chasing an extra 1 fps for a lot more money...


Thanks for understanding my point. :) 

Today (17-6-2008), for those needs, you should buy this superb overclocked ATI 3870 single-slot card:
http://www.e-shop.gr/show_per.phtml?id=PER.510653

In my country (Greece) it costs only 117 euros!

Almost equal to it is the 8800GT HT by Gigabyte, but those Thermaltake fans may be problematic (mine was, twice!). It's also a two-slot card, but the product quality is very, very good: solid capacitors, lower energy use, smaller size. BUT slower clocks, more heat than the 3870, and noise!
June 17, 2008 2:31:26 PM

Bravo Yianni...
June 17, 2008 2:45:38 PM

johnnyxp64 said:
@CSA_Myth: I agree, but you misunderstood something.

I never talked in my example about a noticeable difference between 24 and 34 fps; everything UNDER 55-60 fps IS noticeable by the human senses!

I was saying that we take for granted that, at the selected resolution, both cards deliver much more than 60 fps, so there is no difference!

And we are talking about the majority of gamers. Who uses resolutions above 1600, if not for professional work? I guess almost no one.
Those are the resolutions that make a card drop to such low fps, 20-30.
If you are a professional, then you don't care about energy, noise, or money, and you buy the faster card even if it's only 1% faster than the second one.

[...]

These days even an ATI 1950XT at 1440x900 can handle 99% of the games out there on High!

There is no point in upgrading all the time to feel 1% faster, if at all, when you have spent $600! That's just stupid! Give the money to a poor guy, or buy another PC for downloading and office use!


johnnyxp64, no misunderstanding; I see where you are coming from and totally agree. However, I do run into games and apps that dip into those numbers at 1680x1050. For the most part, 1680x1050/1600x1200 is not as demanding as it once was, given the power of today's cards, but it can still bring them to their knees depending on the game (Crysis/Mass Effect) or the app (RenderMan/Maya).

I mostly use my GPU for games, Maya, some CAD work, and 3DSM. Most of the time it doesn't make much difference which card you are using, because most of the time they are all fast enough, but there are cases where a bit more "oomph" would be nice. Honestly, I was using a 6800GT until recently (there was not enough of a performance difference to justify upgrading until the last generation); I upgraded to an 8800GT and would not be all that interested in upgrading again, if not for a few key points:

1. Gigabyte screwed the pooch on the remodeled 8800GT by putting crappy memory on the card, which they overclocked so much that it made the card unstable.

2. Getting the card stable and getting the heat down to a normal level required me to underclock it (cooling is not an issue; my case has two 120mm fans for intake and exhaust, as well as a card blower sitting below the PCI-E slot, and the ambient temp is 38C).

3. I'm still waiting on an RMA request from Gigabyte to be approved, and I am about at the point of giving up on it.
4. I've decided this card can go, underclocked, into my wife's PC to upgrade her old but reliable ATI 9800 Pro.

Some background… I have been doing IT for close to twenty years, and I am currently a senior computer system analyst III at Lockheed Martin. My first video card was a Trident 256K that used the old IBM VBE interface, and I still have fond memories of the Matrox and Number Nine series, TNT2s and Voodoo 1s. There is no question in my mind that you know your stuff, johnnyxp64, and I am not here to disagree with you, because I agree with you 99%. That 1%, well -
Quote:
I disagree with the move by AMD/ATI to work with Havok for their physics engine (the 38xx cards are compatible, as are the 88xx), because Havok is owned by Nvidia. They will NEVER give their best high-end technology to ATI cards the way they will to Nvidia's, I am afraid. BUT they are going to kick Intel's physics with their AMD Phenom CPUs.


Havok is owned by Intel and right now Intel wants to lay some heavy wood on Nvidia, so I am sure for the right fee they will give AMD/ATI whatever they want.
June 17, 2008 3:45:24 PM

@CSA_Myth:

Of course I don't say you are here to disagree with me; disagreement is all about dialogue, and by talking, we and other users can come to a common conclusion.

First, I made a mistake about Havok in my rush; I confused it with PhysX, which was owned by Nvidia.
So yes, Intel, for some price, will help ATI/AMD "lay some heavy wood on Nvidia", LOL.
Surely they want that, but the same won't happen 100% with AMD's CPU physics capabilities vs. Intel's; that's another chapter we should not start here :) 

Second, you are the perfect example of a real-life person who uses a VGA card more professionally than for gaming (Maya, CAD, etc.), so I agree with you: you need that little extra boom, if you have ways to keep the heat away and if it's worth the price. Fine by me. But for people like Foghorn above, there is no need to get a 280GTX for $625 just for games! It's insane! Even if you have the money, so what? Buy 2x 4870s (if they come out as we hope :p ) and kiss the 280GTX goodbye.

Third, I love Gigabyte products, especially mobos: energy control, powerful, lots of features and overclocking, and all that at very good prices, always! That's why I bought (my latest one) this:
http://www.e-shop.gr/show_per.phtml?id=PER.516151
But when I put it in my PC and logged in, my card was as noisy as a Lockheed Martin NON-stealth jet! You should know about noise :)  (nice job you've got there; I am a software engineer now).
Its RAM was still GDDR3, while the ATI card I linked above uses GDDR4, and after 3 RMAs I finally got one that works perfectly! (I was unlucky.)
BUT, as you very nicely noticed, GeForce cards reduce engine and memory clocks to keep cool; Gigabyte does that too, BUT they don't use a fan controller on those Thermaltake coolers.

I need the VGA for some 2D jobs too, where engine and RAM speed also count when you apply filters and so on.
I have noticed a difference between running the card underclocked and running it at 3D speeds all the time.

So I flashed its BIOS with a new one that uses the overclocked speeds in both 2D and 3D, and I have had it for 3 weeks now. The fan runs at 7 volts, NOT 12V, and sounds like an F-117 in stealth mode :)  I also use one 120mm fan in the front and one 120mm fan in the back of my case, so I keep my system temp around 39C too, and the Gigabyte gives me ~52C idle in 2D and ~68C fully loaded with games and rendering, when other stock 8800GTs gave me 88C at full load or even more!!!!

Conclusion: I got great support from Gigabyte here; after 2-3 days they replaced the card. Gigabyte's 8800GT HT is far better in quality, for me, than the rest with stock fans.
But I did all the same tests with a Sapphire 3870 too, and the system was cooler by at least 10C, rendering times were 1-2 minutes slower, and in games I felt no difference. I am now waiting for ATI to release the 4800 series, and then I will buy 2x 3870 OC editions from Sapphire, even cheaper than they are now, and run them in CrossFire, which will make the GTX 280 "of the future" feel useless.
Or maybe this one? What do you propose?
http://www.sapphiretech.com/us/products/products_overvi...

I am crossing my fingers that ATI keeps making their drivers better. Nvidia is full of nv4_disp.dll BSODs, so don't blame ATI alone for performance issues; at least their cards keep running!
And I am hoping, as you said, for physics support by next year on the 38xx and 48xx, because I HATE monopoly!!!


So I don't think we even disagree on that 1% ;-)

June 17, 2008 3:47:03 PM

I don't understand why Nvidia didn't increase their texture address and filtering units when they reworked their SPs for GT200. The old G92 cores had 8 texture address/filter units for every 16 SPs, but GT200 has 8 for every 24 SPs. With modern games, those texture units did a whole lot more than G92's higher SP clocks did.

GT200 has the same 8-by-8 texturing ability per cluster as G92, but only 10 clusters of SPs, which works out to 80 TMUs. Texturing was the biggest difference between G92 and G80, and why G92 was able to beat it at lower resolutions, or get very close at high resolutions, with much lower memory bandwidth. If GT200 had kept G92's SP-to-texture ratio, it would have 120 TMUs instead of 80. GT200 is inferior in texturing ability when you compare the ratios. Let's just hope Nvidia's refresh fixes this.
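Spelling out that TMU arithmetic (cluster counts and ratios as described in the post above; a sketch of the argument, not official specs):

# The SP-to-TMU ratio arithmetic from the post above.
g92_sps, g92_tmus = 128, 64        # 8 clusters: 16 SPs + 8 TMUs each
gt200_clusters = 10
gt200_sps = gt200_clusters * 24    # 240 SPs
gt200_tmus = gt200_clusters * 8    # 80 TMUs

print(f"G92:   {g92_tmus / g92_sps:.2f} TMUs per SP")      # 0.50
print(f"GT200: {gt200_tmus / gt200_sps:.2f} TMUs per SP")  # 0.33
# Keeping G92's 1:2 ratio across GT200's 240 SPs would call for:
print(f"GT200 at G92's ratio: {gt200_sps * g92_tmus // g92_sps} TMUs")  # 120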
June 17, 2008 5:17:30 PM

I think Nvidia is playing with their fan base, and that's not cool. Since the 8800s, their leaps have been more like hops, while AMD is more consistent with their moves, realizing what they need to improve on and fixing it. It's hard being No. 1 for so long and then losing it. I don't see AMD caring about being No. 1 as much as making money, and making money means giving customers a better product. For me there is no doubt that two 4850s will take you very close to the GTX 280, and two 4870s will crush the GTX 280 for roughly less money. If my guess is right, AMD will hit the price/performance point they are looking for, and we will buy.
June 17, 2008 5:44:27 PM

johnnyxp64 said:
I disagree with the move by AMD/ATI to work with Havok for their physics engine (the 38xx cards are compatible, as are the 88xx), because Havok is owned by Nvidia. They will NEVER give their best high-end technology to ATI cards the way they will to Nvidia's, I am afraid. BUT they are going to kick Intel's physics with their AMD Phenom CPUs... :D 


Havok is not owned by Nvidia. It is owned by Intel. Nvidia owns Ageia, and for ATi to use the Ageia engine, all they would have to do is use CUDA. I heard that CUDA is free as well. Whether or not ATi made the right choice by using Havok instead of CUDA/Ageia is yet to be known. There are many people who think that Havok is the better solution, like Valve and the majority of other game developers who have been using the Havok physics engine for many years.

It would be nice to see the Ageia core or any other dedicated hardware used on all cards as a standard for physics... but I do not see that happening any time soon.
June 17, 2008 6:17:22 PM

EXT64 said:
I think ATI could sell their 4870 for a little more, but why would they want to? This way they gain market share and hurt Nvidia. I believe I heard the Nvidia chip is the biggest TSMC has ever made, and thus very expensive. ATI can keep prices low and still make a profit AND really give Nvidia a hurting. It's win-win for them. From what I've seen, the ATI card should be a decent performer; no miracles, but good enough (for me at least).


I'd agree with that as well. Rumor is yields have been outstanding for all the RV7xx parts for several months, and they can pump them out at low cost. Why risk cutting the legs out from under your market share by charging more? Also, I can't see more than a $100 difference between the mid-range enthusiast card (4850) and the high-range enthusiast card (4870).

My main question would be: what will be the difference between the 4870 512MB GDDR3 and the 4870 1024MB GDDR5?
June 17, 2008 6:59:30 PM

marvelous211 said:
I don't understand why Nvidia didn't increase their texture address and filtering units when they reworked their SPs for GT200. The old G92 cores had 8 texture address/filter units for every 16 SPs, but GT200 has 8 for every 24 SPs. With modern games, those texture units did a whole lot more than G92's higher SP clocks did.

GT200 has the same 8-by-8 texturing ability per cluster as G92, but only 10 clusters of SPs, which works out to 80 TMUs. Texturing was the biggest difference between G92 and G80, and why G92 was able to beat it at lower resolutions, or get very close at high resolutions, with much lower memory bandwidth. If GT200 had kept G92's SP-to-texture ratio, it would have 120 TMUs instead of 80. GT200 is inferior in texturing ability when you compare the ratios. Let's just hope Nvidia's refresh fixes this.


Interesting thoughts, I'm sure it came down to the transistor budget. GT200 is big and expensive enough as is, imagine another 40 TMUs on that die?
June 17, 2008 7:48:22 PM

badgtx1969 said:
How about 2 4850s in CF?

I think that beats a GTX 280!


Well, no need to "think if"; that is 1000% sure to happen, because even today, in some benchmarks, 2x 3870 kicks the ass of "tomorrow's" GTX 280 :p , and in all the tests those 2x 3870 are only 1-10 fps behind the GTX 280!

[benchmark charts were posted here]

Only 9 fps behind, from a card that is 2 years old!

Some physics results too...

And don't forget the noise and temps. Don't get excited about the GTX 280's low temps; those are only because they downclock the card so much! The 2x 3870 produce just a little more heat without downclocking. Still, it's very clear that ATI runs more smoothly.

Now imagine what 2x 4850, or even better 2x 4870, can do! :) 

ATI's way is smart: two cards against one that costs double or more and doesn't offer much :) 
June 17, 2008 11:41:19 PM

badgtx1969 said:
Interesting thoughts, I'm sure it came down to the transistor budget. GT200 is big and expensive enough as is, imagine another 40 TMUs on that die?


We weren't that shader-limited to begin with. If I were part of Nvidia's engineering team, I would have gone for the get-performance-now approach instead of trying to change how developers program games. There's no doubt more games are using shaders than ever, but a full G92 was more than enough. A good 50% increase in SPs would have done the job, instead of squeezing all that processing power into GT200 and making the die the size of a silver dollar.

The lack of texture throughput is GT200's weakness showing. It has a lot of processing power and bandwidth, but today's games mostly go through textures and back to memory. When developers change direction, Nvidia can follow suit. Making people upgrade their video cards more often makes more money. Kind of deceptive toward the misinformed public, but that's how business runs.
June 18, 2008 2:34:54 AM

CSA_Myth said:
Johnnyxp64, yes, you are correct on most of your points; however, the difference in FPS does matter if you are talking about 24 fps versus 34 fps. That said, Nvidia does sometimes degrade images to increase fps, and I don't agree with that. As far as ATI goes, they are no better; if you remember correctly, they used to cheat on their AA and aniso quality. Both companies have, in the past, done whatever was needed to boost their numbers.

That said, I have been impressed with ATI/AMD and their 3870/3850 as far as the quality of the cards and the features present. The performance was good, and if the 8800GT and 9600GT hadn't dropped in price so fast, it would have been a killer price/performance card. So if AMD/ATI raises the bar in performance (within 15% of the high end) and keeps pace with their last generation's quality (i.e., price, features, lack of perceived cheats in FPS or quality), then they should be considered "overall" better.


Both ATI and Nvidia are supposedly fudging 3DMark Vantage right now, but overall, ATI hasn't fudged a driver for a demo in over 5 years. Nvidia fudged the Crysis demo water, which sold many cards once the game arrived. When they fixed the water, their framerates dropped closer to those of ATI's cards.

ATI has never done anything like coercing a company to get a patch out to hamper the performance of the other company's card. Shame on both Nvidia and Ubisoft for that. In fact, I'm boycotting Ubisoft right now because of it, though I love the HOMM and Might and Magic franchise they bought for a song when 3DO bit the dust.

Overall, I prefer ATI's image quality too, though Nvidia has caught up, and I admit that Nvidia does better with AA and AF turned on at higher resolutions. 34 fps vs. 24 makes a difference in a game like Crysis, but in most cases it's a matter of 70 fps vs. 60, or something similar. Sometimes it's even less.

AVIVO plus image quality decides it for ATI for me, and I can see Folding@Home deciding it for Nvidia for someone else, but overall, the GTX 280 is a hound dog that just doesn't hunt. It loses to the 9800GX2 in many cases, with the 3870X2 in third place. Maybe a die shrink will change things for Nvidia, but right now they seem to be in the same situation with G200 as AMD is with Phenom, with the caveat that they lose to their own old generation, barely beat ATI's old generation, and are in an iffy situation vis-a-vis ATI's next generation.

Next week should be fun. The fall even more so. I'll either go CrossFireX with a new board and CPU and two 3870X2s, one 4850 plus a 3870X2, or just go with a single 4870X2. I expect the 4870X2 to do well in games that support Crossfire, and to be neck and neck with the die-shrunk GTX 280 at the very least. All for hundreds less.

johnnyxp64 said:

I disagree with the move by AMD/ATI to work with Havok for their physics engine (the 38xx cards are compatible, as are the 88xx), because Havok is owned by Nvidia. They will NEVER give their best high-end technology to ATI cards the way they will to Nvidia's, I am afraid. BUT they are going to kick Intel's physics with their AMD Phenom CPUs... :D 


Havok is owned by Intel; Nvidia owns Ageia. Intel uses Crossfire and, hopefully, will do so with Larrabee too. We need a single standard for dual cards, and if both AMD and Intel support Crossfire, then that's a move towards a single standard.

Havok is used in more games than Ageia's engine, and though Nvidia has the market share in cards right now, with a 3-way race in a year or so (especially with Fusion down the line in notebooks and entry-level/hybrid Crossfire), I expect to see Havok supported more than Ageia.

Game companies need to realize that Nvidia's TWIMTBP marketing help (which amounts to cash freed up, even if none changes hands) just isn't worth it if they're forced to ditch DX10.1 or use Ageia instead of the standard supported by the two CPU companies.
June 18, 2008 7:09:21 AM

OK, for the fourth time: I said I made a mistake with the company names, Havok vs. Ageia. No need to repeat it again;
read the previous posts!

Still, Intel and ATI together on GPU physics may work pretty well, but Intel and AMD physics on CPUs, that's a huge chapter...

I am sure Intel is doing it so it can get more inside technology and later come up with a very competitive VGA line...

Imagine:

Nvidia
ATI
Intel
Matrox

with the first three fighting for the 3D gaming/performance market... That would be great for us, but for ATI/AMD, which will have to "help" Intel unintentionally, it will be a pain in the ass.

Time will tell...
June 18, 2008 7:50:52 AM

Havok is owned by Intel, not Nvidia. Nvidia owns Ageia.
June 18, 2008 7:58:09 AM

In order to compensate for the above, and in response to the original point raised, I'm not looking for huge blazing power from the new generation of cards. There simply aren't the games around to use the extra power, unless you count Crysis, which is frankly as good as big girls' grey pants anyway.

What I am looking for is a decent performance improvement at a decent price point (I don't mind paying for increased performance) and also smarter features. Power saving shouldn't be a nice-to-have; it should be mandatory. Features like CUDA are also important, although aside from Folding and physics, no practical applications have emerged yet for the home user (Photoshop CS4 may change this first).

I do not want a £500 hairdryer in my PC that gives me an extra 5 fps and puts my power bill up £200. I want smarter, quieter cards that get me decent gaming at 1920x1200.
June 18, 2008 2:39:49 PM

The_Abyss said:
In order to compensate for the above, and in response to the original point raised, I'm not looking for huge blazing power from the new generation of cards. There simply aren't the games around to use the extra power, unless you count Crysis, which is frankly as good as big girls' grey pants anyway.


The only thing going for the GTX 280 is a gig of GDDR3. While more RAM is great, and games can use it, I'm wondering if ATI's solution of 512 MB of faster GDDR5 on the 4870 and 4870X2 isn't a better route?

One of the first things entry-level cards do is provide more last-generation RAM than the high-end gaming cards have. It's marketing. Is it just marketing for Nvidia right now, or does the GTX 280 need it for CUDA but won't benefit from it in gaming?


The_Abyss said:

I do not want a £500 hairdryer in my PC that gives me an extra 5 fps and puts my power bill up £200. I want smarter, quieter cards that get me decent gaming at 1920x1200.


When I spent $450 for the 3870x2 last February, it was the most I'd ever spent on a graphics card. Prior to that the high price for me was $250 for an AIW Radeon 9800 Pro.


June 18, 2008 4:43:18 PM

yipsl said:
The only thing going for the GTX 280 is a gig of GDDR3. While more RAM is great, and games can use it, I'm wondering if ATI's solution of 512 MB of faster GDDR5 on the 4870 and 4870X2 isn't a better route?

One of the first things entry-level cards do is provide more last-generation RAM than the high-end gaming cards have. It's marketing. Is it just marketing for Nvidia right now, or does the GTX 280 need it for CUDA but won't benefit from it in gaming?




When I spent $450 for the 3870x2 last February, it was the most I'd ever spent on a graphics card. Prior to that the high price for me was $250 for an AIW Radeon 9800 Pro.



GDDR5 on a GTX 280 would be pretty much useless, because it already has a 512-bit memory bus. Think 2900XT. Maybe not entirely useless, since GT200 is actually 8 separate 64-bit memory controllers, but once you have enough bandwidth you really don't need any more; you would need more fillrate to take advantage of it.
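The bandwidth argument as arithmetic (the per-pin data rates below are illustrative round numbers, not exact card specs):

# Peak memory bandwidth in GB/s = (bus width in bits / 8 bits per
# byte) * effective per-pin data rate in Gb/s.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(512, 2.2))  # GTX 280-style: wide bus, GDDR3 -> 140.8
print(bandwidth_gb_s(256, 3.6))  # 4870-style: half the bus, GDDR5 -> 115.2
# A 512-bit bus plus GDDR5 would roughly double the first figure,
# far beyond what the chip's fillrate could actually use.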