
Nvidia vs ATI 2010

January 1, 2010 8:13:20 PM

As everyone knows, ATI dominated 2009, and the exciting battle continues into 2010. Many segments, such as the mobile products, are going to be epic this year. AMD released Evergreen on time, but Nvidia has officially pushed Fermi back to March. Most likely Fermi > Evergreen, but IMO the HD 6xxx will be EPIC.

I am holding my breath until the HD 6000 series. Which company do you think will dominate, and what conclusive reasons and arguments support your pick?


January 1, 2010 8:20:35 PM

I'd say ATI dominated the last quarter of 2009.

As for what's next, it's pure speculation and no one here has any real facts upon which to form a logical opinion.
January 1, 2010 8:32:17 PM

I'm still waiting for the OpenCL initiative to bear fruit. ATI's hardware-physics alternative is still nowhere to be seen (two years overdue); if it does come out, then bye-bye Nvidia. Eyefinity is still a tad too expensive, and a DX11 killer app is nowhere in sight.

Nvidia is trying to murder the Lucid chip, which was supposed to be exciting.

I'd say ATI already has near-perfect execution with its hardware releases, which gives them the upper hand.

I have a feeling that if Nvidia fails with Fermi, ATI will do to Nvidia what Nvidia did to 3dfx.
January 1, 2010 10:31:20 PM

How do you know Fermi will really be stronger than ATI's current cards?
I haven't really looked around, but what are the specs?
Maybe Nvidia pushed Fermi back because it wasn't stronger than the Evergreen cards.
January 1, 2010 10:34:38 PM

ATI will continue bludgeoning Nvidia with a spiked club until Fermi is released...
January 1, 2010 10:36:24 PM

Bluescreendeath said:
ATI will continue bludgeoning Nvidia with a spiked club until Fermi is released...


And maybe even after Fermi is released.
January 1, 2010 11:07:59 PM

That must explain how nVidia is two years ahead of ATI in adopting DX11, (G)DDR4, (G)DDR5, a tessellation engine, and DX10.1.

I smell nVidiot.
January 1, 2010 11:51:58 PM

Quote:
Nvidia has always been like 2 years ahead of ATI. Nvidia is far better. And will always be.

Facts please! That statement needs something to prove it! Please enlighten us!
January 2, 2010 12:23:02 AM

Quote:
Nvidia has always been like 2 years ahead of ATI. Nvidia is far better. And will always be.


And this is why I called you an idiot in another post. Right now ATI is ahead, and even HD 4xxx vs. G200(b) was close (4870 ~ GTX 260, 4890 ~ GTX 275, 4870X2 ~ GTX 295; the 4870X2 actually pulls ahead of the GTX 295 because of driver improvements).
January 2, 2010 12:27:25 AM

Quote:
Nvidia has always been like 2 years ahead of ATI. Nvidia is far better. And will always be.

Ahh, that explains how ATI is pummeling Nvidia in every market segment right now.


Oh...
Wait...
January 2, 2010 12:35:44 AM

ATI was out first with DX11, which will pay off for them, just like the 8800 GTX was the first DX10 card on the market... and we all know the kind of situation ATI was in with the HD 3xxx arriving around 8 months late...

IMO, Nvidia is now in the shoes ATI wore back in the HD 3xxx days, and ATI is where Nvidia was with the 8800 GTX.
January 2, 2010 6:53:17 AM

redgarl said:
ATI was out first with DX11, which will pay off for them, just like the 8800 GTX was the first DX10 card on the market... and we all know the kind of situation ATI was in with the HD 3xxx arriving around 8 months late...

IMO, Nvidia is now in the shoes ATI wore back in the HD 3xxx days, and ATI is where Nvidia was with the 8800 GTX.


My thoughts exactly. It is funny how some people can have very short memories.
January 2, 2010 7:11:56 AM

ATI is the crown holder. Nvidia will bring something out that costs $600, and then ATI will lower prices. That is when I will pick up a new ATI card.
January 2, 2010 7:17:23 AM

We must all remember that there are fewer scientists working in the graphics segment at ATI than at Nvidia. I still love Tegra, but it's not getting the attention it deserves.
January 2, 2010 7:26:15 AM

Right now ATI holds both the performance AND the price/performance crown. Normally they duke it out, but Fermi is delayed, which has given ATI a chance to pull ahead.

The way I see it, ATI has done something amazing with the 5 series cards. The performance jump between this and the last generation is absolutely huge, much more than most generational leaps.
Nvidia needs to come up with something really good to come out on top, and even when they do, it won't be cheap.
I really don't think Nvidia has it in them this generation, and I think they will be playing catch-up for the rest of the year.

Even if the extra time on Fermi allows it to rival the 5870 (I doubt they could surpass it, given the massive performance leap), it will be so much more expensive by the time it's released that only die-hard Nvidia fans (fanboys?) will buy them.

I've got nothing against Nvidia; my allegiance goes with whoever currently offers the best cards at the best price. Right now, that's ATI without a doubt, and I have a feeling it will stay that way.

But if Nvidia does come out with something more powerful at a reasonable cost, I'll surely be getting one instead of a 5870.
Right now, though, I'm expecting to purchase a 5870.
January 2, 2010 9:16:51 AM

If anything, I only see ATi's lead growing through 2010.


*Assuming* Fermi does come out in March/April... there are a couple of fundamental issues Nvidia has to overcome:

1. It is not competitive per mm^2 of chip compared with Evergreen.
2. Even the top-of-the-line Fermi is not going to be quicker than Hemlock.


Thus, ATI can set prices from the high end down and scale them appropriately to ensure that they (ATI) make a profit where Nvidia cannot.


Two other issues are:

1. With Nvidia's design team having spent so long on Fermi, how can they expect to catch up in one generation? Can Nvidia's finances withstand taking several generations to catch up?

2. Nvidia has to compromise graphics performance for compute performance. ATI can offload compute to their (AMD) x86 CPUs... indeed, the Bulldozer architecture is the first step toward designing CPUs and GPUs as one integrated system that load-balances depending on FP/int requirements. This is a fundamental problem with Nvidia's strategic direction that they need to sort out; otherwise they will never be competitive with ATI on a per-mm^2 basis.
January 2, 2010 11:53:12 AM

I think Nvidia will have the better card this year. One way or another, the green team seems to put out a better-performing card (for gaming) on a more regular basis than the red team... will I buy one... not sure yet. I don't have a need.

The 4870X2 faster than the 295, or better performing... I disagree with that personally. Nvidia plays more games better than the red team, with physics along for the ride... yes, Physics does matter. If you think it doesn't, you're not being honest with yourself, nor are you a dedicated gamer.

Prices... I think eventually ATI will lower its prices, but: 1. only when it comes time to replace the (by then old) 5000 series; 2. the low-price, higher-powered cards are a lost thing, and Nvidia's next card will keep prices up, since it will probably be more expensive than ATI's 5000 series was at release, so ATI has no incentive to lower prices; 3. you and I will be caught in another price-gouging scheme by both companies if we want a new card.
January 2, 2010 11:53:35 AM

In this market you'd better be able to make the cheapest product or the fastest product... and nothing I've read indicates Fermi will be cheap. So... it had BETTER be fast, for Nvidia's sake.
January 2, 2010 1:07:58 PM

Come on, Jack. If we keep this civil and back things up with links, this could be fun. I'll start as I always do: at the beginning.

AMD.

Seeing as they've released most of their lineup already, 2010 will be mostly quiet from their GPU division. They will release the last of the 5xxx cards around the time Nvidia brings out Fermi. If Fermi is a huge hit we might see a 5890, but I don't believe AMD is planning on making that card. The biggest problem I see for them is the lack of a card at $200. The 5770 is close, but it's really a $170 card. The 5850 is the next card up, but it's $300. They need a 5830 or a 5790 at $220.

Nvidia.

Fermi will come out, and I believe it will be around March/April like they say. What will be surprising to some is the performance. The top end will be faster than the 5870. It will also be close to the 5970. The high thermals/power draw will prevent it from turning into an X2 card (they could/will end up using the 448SP part for that, however). As they've already said, they will still use the G92 for the midrange. AMD should sell lots of 57xx cards to people looking for DX11 on the cheap. After that I'm not sure.

Big questions remain. Will Nvidia manage to scale Fermi down to the lower end? Will TSMC manage to keep both companies supplied with chips? Will Intel decide to enter the market once GlobalFoundries does? (OK, that's more like 2011-13.) Will Nvidia's software-based tessellator work as well as AMD's? Interesting times ahead for sure.

Guys, please remember that this is not a race to see how many fanboys can post per page.
January 2, 2010 1:20:12 PM

4745454b said:
The 5850 is the next card up, but it's $300.

You forget that the HD5850 was supposed to be a $260 card, but shortages, supply/demand, and no competition to balance it out have inflated the price. My guess is that it WILL be the $200 card. I suspect it will get down toward its release price by the time the new Nvidia cards come out, and then, if they are at all competitive, it will start heading down to $200. There is a big gap between the HD5770 and HD5850 though; something to fill it would be nice.
January 2, 2010 2:49:32 PM

I think Fermi will be competitive power-wise, but way out of line price-wise. ATI will slash prices on their cards; Nvidia will be forced to milk its loyalists with exorbitant pricing to stay afloat. By next Christmas we will be getting details (if not leaked benchmarks) of the 6xxx series, while Nvidia rebadges Fermi into the 4xx series on a smaller process. I guess I am not expecting much to change.
January 2, 2010 3:26:49 PM

Quote:
Oh, and regarding swifty's comment about physics. I was playing a game just earlier and didn't fall through the ground or anything like that; weapons still hit the AI as well. Physics seems to still work fine with ATI. Are you sure you didn't mean PhysX?


Dats wut I wuz tryin ta say..... fizziks wid a malcolm on da end....... worrrrrd homey.
January 2, 2010 3:49:29 PM

Quote:
I think Fermi will be competitive power-wise


Power or performance? It will be competitive in terms of performance; it will use more power/electricity, however.
January 2, 2010 3:59:53 PM

I was speaking of graphics power, as we are talking about performance, not electrical efficiency.
January 2, 2010 4:41:14 PM

welshmousepk said:
Right now ATI holds both the performance AND the price/performance crown. Normally they duke it out, but Fermi is delayed, which has given ATI a chance to pull ahead.

The way I see it, ATI has done something amazing with the 5 series cards. The performance jump between this and the last generation is absolutely huge, much more than most generational leaps.
Nvidia needs to come up with something really good to come out on top, and even when they do, it won't be cheap.
I really don't think Nvidia has it in them this generation, and I think they will be playing catch-up for the rest of the year.

Even if the extra time on Fermi allows it to rival the 5870 (I doubt they could surpass it, given the massive performance leap), it will be so much more expensive by the time it's released that only die-hard Nvidia fans (fanboys?) will buy them.

I've got nothing against Nvidia; my allegiance goes with whoever currently offers the best cards at the best price. Right now, that's ATI without a doubt, and I have a feeling it will stay that way.

But if Nvidia does come out with something more powerful at a reasonable cost, I'll surely be getting one instead of a 5870.
Right now, though, I'm expecting to purchase a 5870.


Well actually, the leap from the 7900 GTX to the 8800 GTX was probably bigger, as the 8800 GTX delivered the performance of two 7900 GTXs in SLI.
The leap from the 3870 to the 4870 was also quite big, as the 3870 ~ 8800 GT, and the GTX 260 was around two 8800 GTs in SLI.
January 2, 2010 4:50:05 PM

The longer nVidia waits, the worse its position will be. Sure, I expect the GT300 cards to be faster in DX10 titles, but what about DX11? Right now it doesn't matter much, but developers are currently using ATI's DX11 cards since they are the only game in town. As developers optimize games to run faster on ATI's cards, it will put Nvidia's offerings at a disadvantage.

If you recall the 2900 launch (I know, think back), it often sat between the 8800 GTX and the Ultra in DX9 benchmarks but was all but humiliated in the DX10 benchmarks. Early reviews often sided with the 2900, but in newer DX10 games, which were often developed on Nvidia hardware, the 2900 XT did not, and does not, fare as well. This effect lasted through the 3800 series, and only with the 4800 series did ATI fix enough issues and gain enough ground to be truly competitive. Of course, the 4800 series cards are arguably less efficient than their Nvidia counterparts simply because their arrangement of shaders requires more care from developers to be properly optimized.

So now here is the problem for Nvidia. The longer they delay, the more time developers have to optimize games for ATI's DX11 offerings. The more they optimize for ATI's groupings of five shaders, the more it erodes the advantage of Nvidia's independent shader arrangement. If Nvidia's more expensive (in terms of transistors, complexity, and of course dollars) GPUs fail to show a significant advantage in DX11 games, then their margins will be smaller not only for this generation but for the next, as it will probably have to be based on the G300. Admitting that your competitor's approach was right and redesigning a chip completely is expensive, after all.
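
To picture why the grouping matters, here's a loose CPU-side analogy in C++ with SSE intrinsics (purely illustrative; the function names are made up and this is not either vendor's actual shader ISA): a wide unit only pays off when the compiler or developer can fill every slot of each instruction, while independent scalar units lose nothing on dependent code.

```cpp
// Loose analogy only: a 4-wide SIMD unit stands in for ATI's grouped shader
// slots, plain scalar floats for Nvidia's independent shaders.
#include <immintrin.h>

// Well-scheduled case: four independent lanes per operation keep the whole
// 4-wide unit busy, like shader math that vectorizes cleanly.
__m128 shade_packed(__m128 color, __m128 light, __m128 ambient) {
    return _mm_add_ps(_mm_mul_ps(color, light), ambient); // all 4 lanes used
}

// Dependent-chain case: each step needs the previous result, so a wide unit
// idles most of its slots; a purely scalar architecture loses nothing here.
float shade_serial(float x) {
    float a = x * 1.5f;   // one slot busy, the rest idle
    float b = a + 0.25f;  // must wait for a
    return b * b;         // must wait for b
}
```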
January 2, 2010 7:13:40 PM

@sabot: yes, you're probably right. The 8800 GT was one of the most impressive GPUs ever released, but the 58xx series isn't far off in terms of the performance gap.

My point is that it's not typical, and if you remember the release of the 8 series from Nvidia, ATI didn't do a great job of keeping up at the time. Which is why I think Nvidia will be in the same boat.

Even if they do somehow release a card that outperforms it (which, to me, seems unlikely), they have no chance of offering it at a competitive price.

To the guy talking about 'physics': yes, physics is very important. PhysX, however, is not. It's a gimmick that does nothing but take pointless processing tasks that could easily be done on a CPU and lock them to an Nvidia GPU. IMO, it's insulting that Nvidia does something like that. They aren't doing anything revolutionary, just encouraging developers to add features for their platform only. Luckily, we haven't yet missed anything worth experiencing under PhysX.

But back on topic: if Nvidia can release a more powerful card for a competitive price, I'll buy it. But to me, that seems impossible, because when Fermi is released, a 5870 will likely cost $300, and there is no way they could start their high-end cards at anywhere near that price.
January 2, 2010 7:56:33 PM

yeah, stop orcing.

January 2, 2010 8:12:51 PM

This will most definitely be a peculiar competition.

The HD 5800s have seriously shoved a foot up Nvidia's money hole in terms of performance, not to mention price. So far, Fermi's fake benchmarks and fake pictures are definitely not adding to the appeal of the new generation of Nvidia cards.

It is highly unlikely that ATI will lower prices because of the GF 300 series, as those will most likely be priced exorbitantly high, meaning they won't compete with ATI's 5800 series, which means ATI will have no incentive to lower prices.
The only cards that may see lower prices, not because of competition but because of marketability, are the 5700 series, for three reasons:
1. Expanding market share.
2. Making Nvidia's older models comparable to the 4800 series completely irrelevant.
3. Pushing out a wave of low-cost DX11 cards, which will push DX11 game developers to optimize for the 5700/5800 series (as stated earlier in this thread).

In all honesty, I don't know how Nvidia is going to catch up.
@Intel joining the GPU industry: it's highly unlikely that they will be able to compete with Nvidia or ATI.
January 2, 2010 8:32:19 PM

The low end and midrange are a problem.


As seen here, the G92's density still isn't on par with the 3xxx on up, or the 5xxx series.
So margins favor ATI here, where scrimping is most important.

As for the higher end, it's all pretty much been said: Fermi is hot, and an X2 solution won't be doable until a newer node comes.
As for Fermi's high-end performance, it'll land around where we saw the 200b vs. the 4870 and the X2, in between. But this time it'll need more power, and overclocking may be stunted, so a less diverse product selection, something Nvidia's partners love to offer.

As for DX11, it'll take another hit if Nvidia has no mid/low-end parts, much like the DX10.1 situation.

Performance-wise, in the low and mid segments, if Nvidia is stuck with the G92 yet again, it'll lose against the real 800-SP-and-below versions of fully compliant DX11 5xxx cards from ATI. It's important that Nvidia gets its scaling down so that 3xx-series derivatives for the midrange and low end scale down too, for performance reasons as well as DX11. And thus far, not a word on any tape-outs or anything of that nature, while new info on ATI's solutions is leaking as we speak.
January 2, 2010 8:46:18 PM

Quote:
Nvidia has always been like 2 years ahead of ATI. Nvidia is far better. And will always be.



I agree with OKINI 55 :) 
January 2, 2010 9:15:39 PM

That's what we call fanboyism.
Anonymous
January 2, 2010 9:54:02 PM

The gap will widen. Everybody expects Fermi to beat the 5870, but it will be close, with some titles going either way. ATI will be ready to release a 5990 dual-GPU card with top-binned parts by then.
January 2, 2010 10:28:42 PM

Quote:
that does nothing but take pointless processing tasks that could easily be done on a CPU and lock them to an Nvidia GPU


Not from what I've seen.

http://www.hardocp.com/article/2009/10/19/batman_arkham...

Notice that once you go outside, performance drops horribly. There is no way that's playable. 0-6 FPS? Not a chance. I'll grant you they've done their best to lock it to only their cards, but it can't really run on a CPU.

I'd also grant you that at present it doesn't add much. It's not like you can punch down walls now, or do X in game like you should be able to do in real life and have Y happen. Reading that review of Batman AA, it adds more paper swirling in the wind! And more bricks! Wow! That is hardly game-changing. I would think Eyefinity is a better gimmick; too bad it costs so much. Newegg currently has one HDMI monitor in stock for $229 counting shipping. Add in another $150-ish for another DVI monitor, and, assuming you have a monitor already, you'd need $380 and up to use it.

JDJ, has that chart been updated? I see the 4770 up there, but not the new 5xxx cards. The number of shaders has been updated, so I would think they would be on the other side of the G92 now. If Nvidia fails to scale the G300 down like they failed with the G200, then I don't see any reason to buy an Nvidia mid/low-end card. What's worse is what will happen to the industry: if Nvidia doesn't release lower-end DX11 parts, I think it will hold back the release of DX11 games.
January 2, 2010 10:35:27 PM

It can't run on a CPU because it was programmed for a GPU. The point is that much better physics processing can be, and HAS been, done on CPUs (Red Faction, for instance).

If the PhysX features had been programmed into a game as a primary intention, and not an afterthought, all of it could easily be handled by any CPU out there. Havok does much more than PhysX ever has, and that's CPU-only.

To me, it seems Nvidia simply encourages devs to add some useless features that can only be used on an Nvidia card. I don't know what incentive they offer, but the devs could easily implement any of it within the standard game code.

And PhysX will never take off because of that. No dev wants to make a game that loses its appeal when played on an ATI card, and vice versa. Nvidia offers them the chance to try it, and they do. But Havok can do it all: better, faster, cheaper, and without alienating an audience.
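
To make the "any CPU out there" claim concrete, here's a minimal sketch (hypothetical code, not Havok's or PhysX's actual API) of the kind of integration step a CPU handles cheaply, thousands of times per frame:

```cpp
// Minimal CPU physics sketch (hypothetical; not Havok or PhysX code):
// semi-implicit Euler integration with gravity and a crude ground bounce.
#include <vector>

struct Particle {
    float x, y, z;    // position (m)
    float vx, vy, vz; // velocity (m/s)
};

void step(std::vector<Particle>& bodies, float dt) {
    const float g = -9.81f; // gravity along the y axis
    for (Particle& p : bodies) {
        p.vy += g * dt;    // accumulate acceleration into velocity
        p.x  += p.vx * dt; // then velocity into position
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        if (p.y < 0.0f) {  // hit the ground: reflect with damping
            p.y  = 0.0f;
            p.vy *= -0.5f;
        }
    }
}
```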

welshmousepk said:
It can't run on a CPU because it was programmed for a GPU. The point is that much better physics processing can be, and HAS been, done on CPUs (Red Faction, for instance).

If the PhysX features had been programmed into a game as a primary intention, and not an afterthought, all of it could easily be handled by any CPU out there. Havok does much more than PhysX ever has, and that's CPU-only.

To me, it seems Nvidia simply encourages devs to add some useless features that can only be used on an Nvidia card. I don't know what incentive they offer, but the devs could easily implement any of it within the standard game code.

And PhysX will never take off because of that. No dev wants to make a game that loses its appeal when played on an ATI card, and vice versa. Nvidia offers them the chance to try it, and they do. But Havok can do it all: better, faster, cheaper, and without alienating an audience.


Red Faction: Guerrilla and Crysis/Warhead/Wars really spit on PhysX.
January 2, 2010 10:40:35 PM

Plus maturing drivers, as they often lead to performance increases. And though Fermi will assuredly get those as well, especially with a new architecture, they'll come slower, and it will be behind from the start.
January 2, 2010 10:53:07 PM

I don't have a later chart, sorry, and though this can only be a rough outline, it's more to the point: the 92b is larger than the 5xxx series chips and underperforms as well. So it's a double loss, more expensive to make and with less performance, which makes for a huge disparity where margins are extremely tight.
Not to mention the features.

Unfortunately I can't remember the size comparison between the 5770 and the 4770. They're out there somewhere, but I was under the impression they're relatively the same, and even allowing for more area for the 5xxx series, it won't make up for the lack of performance seen in the 92b offerings.
January 3, 2010 3:33:25 AM

Quote:
the point is that much better physics processing can be, and HAS been, done on CPUs (Red Faction, for instance)


Ahhh, now I see your point.

Quote:
Unfortunately I can't remember the size comparison between the 5770 and the 4770. They're out there somewhere, but I was under the impression they're relatively the same


Unless I'm missing something again, how can that be? The only 40nm chip listed in that chart is the RV770, which is the 4770, is it not? If the 5850/70 has twice as many SPs and is also at 40nm, wouldn't it HAVE to be bigger? I would be very willing to bet that the 5850/70 is quite a bit larger than the G92b. My pure guess would be just north of the G92. I agree that the performance would be better than the G92b for the most part, and WAY more feature-rich. You can't milk G80/G92 for as long as Nvidia has and not expect to get passed on features.

January 3, 2010 3:58:45 AM

4745454b said:
What's worse is what will happen to the industry: if Nvidia doesn't release lower-end DX11 parts, I think it will hold back the release of DX11 games.


One of the neat features of DX11 is that if the card in the DX11 system (Win 7, or Vista with the platform update) is not DX11-compatible, it automatically scales back the new features without the devs having to change the code. At least, that's what I had read. This makes it easier on the devs creating DX11 games, so they may be more likely to write for it.
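
If I'm reading it right, that's the Direct3D 11 "feature level" system. A minimal sketch of device creation (error handling trimmed; the helper name is mine): the app hands the runtime a list of hardware tiers and gets back the highest one the installed card supports, so one DX11 code path can still create a device on DX10-class hardware.

```cpp
// Sketch of Direct3D 11 feature-level fallback (helper name is made up).
#include <d3d11.h>

bool create_device(ID3D11Device** device, ID3D11DeviceContext** context) {
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, // full DX11: tessellation, compute shader 5.0
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0, // DX10-class cards still get a device
        D3D_FEATURE_LEVEL_9_3,  // even older parts can run a reduced path
    };
    D3D_FEATURE_LEVEL granted; // the level the runtime actually gave us
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, device, &granted, context);
    // The app still has to branch on 'granted' before using DX11-only
    // features; the runtime does not emulate them on lesser hardware.
    return SUCCEEDED(hr);
}
```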
January 3, 2010 6:18:00 AM

Meh, rumors and hearsay. I'm not sure I believe this. For one, DX11 adds not only the tessellator but the compute shader as well. DX10.1 at least has the tessellator, but neither 10 nor 10.1 has the compute shader. I don't see how cards that aren't built to handle DX11 code could run it. If anyone has some concrete info on this I would love to see a link. Previous generations of DX could do this, so it's possible. But there were games that came out that REQUIRED DX9c, and if you didn't have it you couldn't play them.
January 3, 2010 6:49:20 AM

I'm referring to Nvidia's use of the G92 for the low end and midrange, where the 5xxx series has better density, like the 57xx series, which is close to the 4770 in size.
January 3, 2010 8:33:37 AM

4745454b said:
It's not like you can punch down walls now, or do X in game like you should be able to do in real life and have Y happen.


Not exactly Batman or action-oriented, but if a game were purposely designed to use PhysX, you'd actually get something interesting:

http://www.youtube.com/watch?v=zhZd3WU5l38

I guess PhysX needs its own genre to be effective.

As long as it lives within the confines of a game's graphics options, don't expect it to be game-changing in any way. That's like asking 8xAA to change the way you shoot those grenade launchers.
January 3, 2010 10:37:47 AM

How many games on the PhysX game list REALLY support PhysX? Everyone agrees that Batman AA is the best game out there that supports PhysX (OK, maybe not everyone, but most). If all PhysX does is make explosions bigger and add more things blowing in the wind, I will never be impressed. I want to be able to shoot out the roof above a hiding bad guy so that it crushes him. I want to blow a hole in the wall and get around the door I don't have the key to. If I run out of ammo, I want to be able to pick something up and throw it at someone. (Wait, isn't there a game that allows you to do that? I'm pretty sure Half-Life 2 uses Havok, however.)
January 3, 2010 11:01:22 AM

4745454b said:
I want to blow a hole in the wall and get around the door I don't have the key to.


Completely off-topic, but from a game-design point of view (at least from a very, very novice game-design point of view), if you allow players to wield such an ability, then there's no point incorporating a key and having the door locked. A designer can:

1. leave the door unlocked, or
2. allow your blow-a-hole solution above, or
3. distinguish breakable doors from unbreakable doors à la Dragon Age (obviously breakable doors won't have keys hiding somewhere, as that would be redundant).

Now, having said that, all PhysX would do in this circumstance is make the explosion bigger, so the game-changing argument is null.

4745454b said:
If I run out of ammo, I want to be able to pick something up and throw it at someone. (Wait, isn't there a game that allows you to do that? I'm pretty sure Half-Life 2 uses Havok, however.)


Crysis/Warhead as well; not PhysX, however.
January 3, 2010 11:07:55 AM

PhysX is doomed; my Ageia card is worthless now with an ATI video card. Nvidia can bite my whopper.
January 3, 2010 3:44:02 PM

I heard it was possible, but the whole point of the Ageia cards was that they were supposed to work with ANY video card.
January 3, 2010 5:47:53 PM

4745454b said:
The only 40nm chip listed in that chart is the RV770, which is the 4770, is it not? If the 5850/70 has twice as many SPs and is also at 40nm, wouldn't it HAVE to be bigger?


The RV770 is 55nm.


The 4770 was a development of the RV770 at 40nm... a trial run of the production process so ATI could do their homework for Evergreen.


Thus, while Cypress is bigger than RV770, it is not as much bigger as you might expect, since it is on 40nm as opposed to RV770's 55nm.


Help any?
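
As a rough sanity check on that (idealized scaling, which real layouts never quite hit), a 55nm-to-40nm shrink cuts area per transistor by about

$$\left(\frac{40}{55}\right)^{2} \approx 0.53,$$

so doubling the shader count, as Cypress does over RV770 (800 to 1600 SPs), would land near $2 \times 0.53 \approx 1.06$ times the old die area in the ideal case: bigger, but nowhere near twice the size.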