Solved

Nvidia, ATi and the PhysX: What's the deal?

Last response: in Graphics & Displays
November 3, 2009 5:53:43 PM

Hey there folks.

I was thinking about buying an ATi HD4870, when I realized that ATi cards don't support PhysX!... Or do they?

I read a couple of articles saying it was actually very easy to get PhysX working on ATi, and that it works perfectly on those cards. I even read some articles saying Nvidia had made some friendly moves toward ATi regarding PhysX, since it's supposed to become an open standard.

And then I read an article saying Nvidia hardcoded a check that disables PhysX when it detects an ATi card being used as the main graphics card. My hopes were crushed like a miserable insect, even though I didn't expect it to be any different. Business is business after all, in its many demented forms.

But I'm still wondering... what's the deal? With PhysX getting more and more popular, and at the same time less and less open, what's going to happen with ATi? Should I even still consider buying an ATi card? Aren't ATi users chronically crippled by the lack of PhysX support?

November 3, 2009 6:04:17 PM

I don't know... is PhysX getting more and more popular? How many people do you know who buy Nvidia only for PhysX support? Because I don't know any. In fact, I believe ATI is gaining market share on Nvidia right now... does that mean Havok is getting more popular? Of course not.
November 3, 2009 6:06:43 PM

Chronically crippled without PhysX... nope, can't say I ever noticed it, even when I had Nvidia cards capable of it. Oh yes, I realise why now - it's because hardly any games make use of it, and even fewer of those are games worth playing.
November 3, 2009 6:08:05 PM

@annisman:

That's the thing, I really don't know either! So much is unclear about this whole PhysX thing. Saying PhysX is getting more and more popular is just my observation: the UT3 engine can make use of PhysX, the new Batman game uses PhysX, Crysis: Warhead apparently makes use of it...

To me it seemed like ATi was indeed doing better as well. That's why I'm so confused, and it's all caused by this PhysX thing.

That's why I'm here :)  I'm hoping you people could clarify things for me ^^;

@jennyh:

It's true that most games don't make use of it. And rightly so. Also, I'm not talking down either party. I'm just really confused about this PhysX thing and whether it even really matters.

What I'm kind of scared of is that, if I bought an ATi card, I might end up unable to play a couple of games simply because they use PhysX. That would be a really stupid reason, but still quite a scary one.
November 3, 2009 6:12:40 PM

To paraphrase jennyh:

PhysX blows giant donkey chunks.

It adds literally nothing to a game that non-proprietary software engines can't do equally well, and it's supported by like... 13 games? I can't remember the number, but it's jack all.
November 3, 2009 6:15:33 PM

Joshiii-Kun said:
Saying PhysX is getting more and more popular is just my observation... Crysis: Warhead apparently makes use of it...


No it isn't, and no it doesn't.
Crysis uses its own physics engine.

Quote:
What I'm kind of scared of is that, if I bought an ATi card, I might end up unable to play a couple of games simply because they use PhysX. That would be a really stupid reason, but still quite a scary one.


GPU PhysX isn't required for any non-demo game out there, and it never will be; it's a questionable option, not a requirement. The naming is confusing, but the GPU effects are just optional - the game's PhysX engine itself runs on the CPU.
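To picture that split, here's a rough sketch - hypothetical engine code, not the actual PhysX SDK API - of how a game keeps the required physics on the CPU and treats GPU effects as optional extras:

#include <vector>

struct RigidBody { float pos[3]; float vel[3]; };

// Gameplay-critical physics: collision, movement, hit detection.
// Always runs on the CPU, so it works on any vendor's card.
void stepGameplayPhysics(std::vector<RigidBody>& bodies, float dt) {
    for (RigidBody& b : bodies)
        for (int i = 0; i < 3; ++i)
            b.pos[i] += b.vel[i] * dt;
}

// Optional eye candy: extra debris, cloth, fluid. Skipping it
// changes nothing the game actually needs.
void stepEffectsPhysics(bool gpuPhysXAvailable) {
    if (!gpuPhysXAvailable)
        return;  // no GPU PhysX: the game still runs, just fewer particles
    // ... simulate the extra effects on the GPU here ...
}

int main() {
    std::vector<RigidBody> world(100);
    stepGameplayPhysics(world, 1.0f / 60.0f);  // required path, any GPU
    stepEffectsPhysics(false);                 // optional path safely skipped
}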
November 3, 2009 6:15:43 PM

RealityRush said:
To paraphrase jennyh:

PhysX blows giant donkey chunks.

It adds literally nothing to a game that non-proprietary software engines can't do equally well, and it's supported by like... 13 games? I can't remember the number, but it's jack all.


My opinion as well. PhysX is a gimmick that has gone too far. However, it's here for now, and we're going to have to deal with it.

I'm just wondering if I can go for an ATi graphics card without unfairly compromising performance.
November 3, 2009 6:20:22 PM

@TheGreatGrapeApe:

Excuse me! Crysis indeed uses its own physics engine. I was visiting Nvidia's PhysX site and clicked a "Games" link, thinking it would show me PhysX games, but that was the nZone Game Library! Silly me!

I see! So PhysX is still CPU based? Hmm, that's interesting. So the GPU is actually only adding to the already-working PhysX engine?
November 3, 2009 6:25:07 PM

Joshiii-Kun said:
@TheGreatGrapeApe:

Excuse me! Crysis indeed uses its own physics engine. I was visiting Nvidia's PhysX site and clicked a "Games" link, thinking it would show me PhysX games, but that was the nZone Game Library! Silly me!

I see! So PhysX is still CPU based? Hmm, that's interesting. So the GPU is actually only adding to the already-working PhysX engine?


Physics is CPU based. nVidia's PhysX GPUs take that load off the CPU, for whatever stupid reason.

I think...
November 3, 2009 6:26:01 PM

The GPU will offload the physics away from the CPU. If you don't have a GPU dedicated to physics, the physics will be software rendered and the calculations will be carried out by the CPU itself. Also, even though you may have a physics-capable card, the game might not support offloading physics calculations to the GPU, in which case they would still be handled by the CPU (see the sketch after this post).

Since the GPU is a lot more powerful than the CPU, it allows for more elaborate physics, which really are just frills. Look at Nvidia's Mirror's Edge physics video on YouTube. The only difference with physics turned on is that you'll find lots of pieces of cloth, flags, cloth screens and so on around the game that are shootable and have ragdoll effects. Now honestly, without that piece of cloth you just shot for no purpose at all, what would you have lost?

GPU physics might have a lot of potential, but in its current stage it's just a marketing gimmick.

PS - That's also why the Ageia PhysX card came with a bang and left with a whimper.
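A rough sketch of the backend decision xbonez describes above, with made-up names purely for illustration (no real engine exposes exactly this API):

enum class PhysicsBackend { GpuDedicated, GpuShared, Cpu };

PhysicsBackend pickBackend(bool gameSupportsGpuPhysics,
                           bool hasDedicatedPhysicsGpu,
                           bool mainGpuSupportsPhysics) {
    if (!gameSupportsGpuPhysics)      // the title never offloads: CPU does it all
        return PhysicsBackend::Cpu;
    if (hasDedicatedPhysicsGpu)       // a second card used just for physics
        return PhysicsBackend::GpuDedicated;
    if (mainGpuSupportsPhysics)       // one card shares rendering and physics
        return PhysicsBackend::GpuShared;
    return PhysicsBackend::Cpu;       // software fallback: game still works
}

int main() {
    // A single PhysX-capable GeForce, like the one Joshiii-Kun describes,
    // lands on the shared path rather than being unable to run the game.
    PhysicsBackend b = pickBackend(true, false, true);
    return b == PhysicsBackend::GpuShared ? 0 : 1;
}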
November 3, 2009 6:28:13 PM

You will not find yourself unable to play a game because it supports PhysX and you have an ATI card. PhysX is just a graphical effect, as TGGA said. I have quite happily played the Batman demo on an ATI card and didn't feel the experience was any the less for not using PhysX.

Mactronix
November 3, 2009 6:29:35 PM

xbonez said:
The GPU will offload the physics away from the CPU. If you don't have a GPU dedicated to physics, the physics will be software rendered and the calculations will be carried out by the CPU itself. Also, even though you may have a physics-capable card, the game might not support offloading physics calculations to the GPU, in which case they would still be handled by the CPU.


Hmm! So you're saying that it won't even work if I don't have a dedicated card? I only have one GeForce card, and in the driver settings it has PhysX enabled :o 

@mactronix:

Thanks for your response :)  A "testimonial" was really what I was looking for ;) 
November 3, 2009 6:30:47 PM

Joshiii-Kun said:
Hmm! So you're saying that it won't even work if I don't have a dedicated card? I only have one GeForce card, and in the driver settings it has PhysX enabled :o 

He's saying that not all games support PhysX. In fact, only a handful of them do. If the game doesn't support PhysX, then it's just wasted money (on the card).
November 3, 2009 6:32:39 PM

shadow187 said:
He's saying that not all games support PhysX. In fact, only a handful of them do. If the game doesn't support PhysX, then it's just wasted money (on the card).


He was also saying "If you don't have a GPU dedicated to physics, the physics will be software rendered and the calculations will be carried out by the CPU itself", which was what I was referring to.
November 3, 2009 6:47:38 PM

Joshiii-Kun said:
@TheGreatGrapeApe:

Excuse me! Crysis indeed uses its own physics engine. I was visiting Nvidia's PhysX site and clicked a "Games" link, thinking it would show me PhysX games, but that was the nZone Game Library! Silly me!

I see! So PhysX is still CPU based? Hmm, that's interesting. So the GPU is actually only adding to the already-working PhysX engine?


Kinda.

If you think about it, physics is kind of an ideal CPU job: it's just number crunching for action/reaction mechanics; it isn't really a graphical thing. (I know graphics is also number crunching, but it's more brute-force mass crunching, not "physics"-type crunching, which is more complicated computation. Anyway...)
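As a concrete illustration of that action/reaction number crunching, here's a minimal sketch - illustrative names and constants only, not any real engine's code - of semi-implicit Euler integration plus an equal-and-opposite contact impulse:

#include <cstdio>

struct Body { double mass, vel, pos; };

// Newton's third law: the impulse on one body is mirrored on the other.
void applyImpulse(Body& a, Body& b, double j) {
    a.vel += j / a.mass;
    b.vel -= j / b.mass;
}

// Semi-implicit Euler: update velocity from force first, then position.
void integrate(Body& b, double force, double dt) {
    b.vel += (force / b.mass) * dt;
    b.pos += b.vel * dt;
}

int main() {
    Body crate{10.0, 0.0, 0.0}, ball{1.0, 0.0, 0.0};
    applyImpulse(crate, ball, 5.0);       // a 5 N*s contact impulse
    for (int i = 0; i < 60; ++i)          // one second at 60 Hz
        integrate(crate, 0.0, 1.0 / 60.0);
    std::printf("crate at %.2f m, ball vel %.2f m/s\n", crate.pos, ball.vel);
}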

This is why most video games employ the CPU for their physics. PhysX is just like a regular physics engine and was developed by a company that Nvidia bought out. At that point, instead of running it off the CPU, Nvidia decided it would be a brilliant idea to run it off their GPUs (even though OC'd CPUs can MORE than handle advanced physics these days without any extra help, and still complete the rest of their tasks in a timely manner) and to make it proprietary so it could ONLY run on their GPUs.

What followed was Nvidia marketing PhysX to certain companies, which used it in very few games - and those games then needed their own crappier physics engine AS WELL for anyone without an nVidia card. So in essence, the game designers had to maintain two physics engines instead of one, and alienate an entire segment of the gaming population that didn't own Nvidia cards, who apparently weren't good enough for "awesome" physics.

Most game companies realize, though, that it's just easier to make their own physics (one engine instead of theirs AND PhysX) which runs off the CPU and works with EVERYONE's graphics card, so they can do it in half the time and not piss off tons of consumers they want money from.

Which is why PhysX is massively retarded in every way.
November 3, 2009 6:57:04 PM

@Joshii: everywhere I said 'dedicated GPU for PhysX', I meant 'either a GPU solely for physics, or a GPU that supports physics (such as yours)'.

As you can see, the latter was considerably longer to type, and hence I took the convenient way out.
November 3, 2009 7:30:15 PM

Physics in games is number crunching with many complex formulas. The "ideal" hardware to run these simulations is the GPU. Why? Because only GPUs can crunch out near or beyond a teraflop of computing power (the HD 4800 series does 1 teraflop or higher - see the arithmetic after this post), something even a high-end CPU cannot do.

But why haven't GPUs been used for physics simulation in the past, or more widely now? Because most consumers didn't carry "high-end" graphics cards that could handle outputting graphics, anti-aliasing, anisotropic filtering, rendering, cropping or resizing, some components of physics engines such as impact models - the list goes on. GPUs are already very busy bees when it comes to gaming; adding physics only adds to the load.

@Present: Now we're getting to the point where GPUs are starting to compute teraflops of information, and GPU physics is a potential feature in the near future. PhysX is a great start, but by no means a deal breaker.

Now you may ask why we don't let GPUs do all of the computing instead of CPUs. We're already starting to implement cGPUs into mobile devices, but that's not the point. The reason we still use CPUs is to run most general-purpose software. Unlike physics engines, most software is written with blank variables that the CPU is "trained" to fill in - mostly easy, simple code. Easy on programmers, hard on CPUs. This of course leads to inefficiency and lowered performance.
GPUs (and most server processors, for that matter) are built to perform insane amounts of number crunching BUT must be fed intensive, carefully written code by the programmer. This is one reason some games take FOREVER to be released: even one missing line in code run by the GPU can produce glitches. Programmers must write each line, each variable, in; they cannot leave empty variables the way they can with consumer CPUs. That of course means more work for programmers and less work for the CPU. More work for programmers = more time = more hours worked = higher costs = higher costs for consumers and very tired programmers.
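As a sanity check on the teraflop figure cited above, the HD 4870's peak arithmetic rate follows straight from its published specs - 800 stream processors at 750 MHz, with a multiply-add counting as 2 floating-point ops per clock (a back-of-the-envelope sketch, not a measured benchmark):

#include <cstdio>

int main() {
    const double shaders   = 800;     // HD 4870 stream processors
    const double clock_hz  = 750e6;   // 750 MHz core clock
    const double flops_per = 2;       // one multiply-add = 2 FLOPs per clock
    std::printf("peak: %.2f TFLOP/s\n",
                shaders * clock_hz * flops_per / 1e12);  // prints ~1.20
}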

Best solution

November 3, 2009 8:24:01 PM

For hardware-accelerated physics, think of the GPU as a co-processor: it does not have the immediacy of the CPU to affect gameplay, but it is pretty good at doing the math of vectors, which is good for physics.

But the main thing is that it's an add-on feature, not something the game requires. The gameplay physics currently always runs on the CPU; just the debris physics, cloth physics and such are on the GPU. So it's essentially add-on glitter, not something that would limit the game if you didn't have an nV GPU dedicated to PhysX.

Future OpenCL implementations have more potential for being able to affect gameplay (the data-parallel style involved is sketched below), but even that is likely a year or two away.

Right now it's just a feature like AA: optional, nice for some people, but not required.
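To make the data-parallel idea concrete, here is a minimal, hedged OpenCL sketch - the kernel and buffer names are made up, and error handling and cleanup are omitted - in which a single kernel launch integrates every particle at once:

#include <CL/cl.h>
#include <cstdio>
#include <vector>

// One work-item per particle: the GPU runs this body for every index i
// in parallel, which is exactly the kind of job physics effects are.
static const char* kSrc = R"(
__kernel void integrate(__global float* pos,
                        __global float* vel,
                        const float dt) {
    size_t i = get_global_id(0);
    vel[i] += -9.81f * dt;   /* gravity on the vertical axis */
    pos[i] += vel[i] * dt;   /* advance position */
})";

int main() {
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, nullptr);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    const size_t n = 65536;                      // 64k particles in one launch
    std::vector<float> pos(n, 0.f), vel(n, 0.f);
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), vel.data(), nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    float dt = 1.f / 60.f;                       // one 60 Hz simulation step
    clSetKernelArg(k, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(k, 2, sizeof(float), &dt);
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clFinish(q);
    std::printf("stepped %zu particles on the GPU\n", n);
}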
November 3, 2009 8:51:43 PM

Don't worry about PhysX; it will die in a year or two. Open-standard physics will be implemented for all manufacturers.
November 3, 2009 8:52:38 PM

OpenCL has the potential to become a standard, but one cannot say it will be widely used in a few years, since few software companies have decided to support it. Hell, even CS4 only uses OpenGL, for zooming and panning. If companies use it for CPU/GPU programming, that would be great, but if not it remains utterly useless. I really don't see OpenCL becoming a standard in software - especially not in a capitalist market, not that I have anything against a capitalist market.

Anyway, most of its performance is oriented towards video encoding/rendering.
November 3, 2009 9:10:05 PM

PhysX's usual implementation hardly ever changes enough in any game to make you think you missed out on something. The only game I've played where there's a huge difference is Mirror's Edge, but it's just all debris, which seems more like a waste of my GPU power than a benefit to the look of the game.
November 3, 2009 9:37:18 PM

ATI fans are not entirely cut out of the PhysX deal.

http://www.tomshardware.com/news/nvidia-ATI-physx-patch...

The impact of PhysX, like any technology, will depend on the time, money, energy and effort that the game developer puts into it. If I judged PhysX by Darkest of Days or Mirror's Edge, I'd yawn and not think about it anymore. If I judged the impact of DX11 by Battleforge, I'd label DX11 another dud like DX10 and wait for history to render it to obscurity.

The problem is we can't judge a technology by poor implementations of it. The technology can only be judged by its potential. Looking at this review, we get a better idea of what PhysX is capable of:

http://www.firingsquad.com/hardware/batman_arkham_asylu...

It would be hard to deny that, all other things in the price / performance arena being equal, given a choice between:

a) being able to experience those features in a game
b) being shut out of experiencing those features in a game

every rational being would choose a)

The problem is "all things are never equal". Right now ATI's 5xxx offerings have pretty much snuffed out all nVidia's models except the 260 and 295 from serious consideration. Unless that price / performance target area is where you are headed, ATI is the only logical choice for your main card.

No doubt die-hard nVidia fans will overstate the importance and significance of PhysX, saying it's not worth gaming without it, and die-hard ATI fans will understate same, saying it's worthless. The truth, as always, lies somewhere in between. The features described in the Batman article are certainly things that add to the level of realism and depth of immersion you experience while playing the game. EVGA just came out with a dual-GPU card which has a 275 as the primary GPU and a 250 for dedicated PhysX support, indicating that they at least think there's a market for PhysX. Personally, I think the dual-GPU thing, where you don't have a choice in how the 2nd card is used, is a bad idea.

Until nVidia releases their new generation of cards, to my mind, unless you are considering the 260 or 295 price / performance targets, you should forget about nVidia as your primary GPU. If you keep GFX cards more than 2 years, I'd forget about those also, as DX11 should, if we're lucky :) , start to be important 2 or more years from now. The 4870 outperforms the 5770, but the 5770 has DX11 and is more power efficient, so you have a tough choice there also:

http://www.anandtech.com/video/showdoc.aspx?i=3658&p=14

I wouldn't buy a GFX card at this point in time, preferring to wait until the vendors proceed beyond the "reference board" stage and nVidia's new stuff is out, allowing comparisons. But if I did buy today, when setting aside my budget and buying say single or double 5850s / 5870s, I'd certainly be looking to grab someone's old 8800 / 9800 GTS to add to the mix alongside the ATI cards, using the patch described in the THG link above.

Curtains that move when you walk by, leaves and papers that get disturbed, real-looking smoke & steam, glass that breaks, well, like glass, and walls and furniture that get destroyed rather than "smudged" when ya hit them w/ a rocket launcher are things I'd pay $50-$100 for. Alternately, if you find PhysX compelling, the 260 and 295 remain viable options according to many reviewers in their price / performance categories. If you don't keep graphics cards more than 2 years, it's hard to find fault with that choice.

Only time will tell whether we see more games like Batman or more games like Mirror's Edge. Only time will tell if DX11 is a hit, or a dud like DX10. So the choice is still a gamble... then again, what isn't? One thing about PhysX is that it is supported by console games for Xbox, PS3 etc. How that will affect things, especially with PC gaming market share decreasing while console gaming is rising, is another question that will be answered in time. Talk about gambles... when ya think about the fact that for what it costs to buy this holiday season's top GFX card (5870x2) at $600 you could buy two PS3s, you have to wonder just how game developers will decide to allocate their future resources.

So, to answer your question of "what's the deal?", the only reasonably sure answer is "who knows?" For the time being, though, look at what PhysX brings to the table and make your decision accordingly, based upon how much you think you will enjoy those features.
November 3, 2009 11:57:59 PM

Okay a bit more educated post by JackNaylor with less bias and more ... wait it's still fail.

He first states that a technology cannot be judged by implementation but by "potential", but then goes on to speak about the GTX 295/260 holding their own against the new 5800 series - really, even if you look at potential only? Interesting how you get to choose what counts and what doesn't, regardless of contradicting yourself. Anyway, in response to his misinformation:

Potential-wise the HD 5800 series wins, and in implementation they still win. While I do agree products should be rated by things other than implementation, I do not agree with putting paper specs as the sole basis of comparison; there is still a margin between theoretical throughput and real-life implementation (however good the implementation, theoretical output is always vastly greater than realistic output). I prefer to rate a product by:
1. Implementation (how well it is implemented, not how widely)
2. Real-life study (benchmarks, gaming trials, AA, scaling, etc.)
3. Price/value
4. Market share / market targets (who is this product for, and how big is that market?)
5. Presentation (looks) <-- almost redundant for me, although I admit I will take a better-looking product and pay a little extra as long as it performs equally to its competitors.
***Potential is not in here because paper specs mean nothing to the end user.

Why? Because other factors such as handling of resources, physical hardware friction, electrical inefficiencies, other hardware components and so on all come into play when an end user receives a product.

There's a phrase I heard: "The greatest of ideas lie in a grave." What does it mean? Geniuses in every conceivable field have lived but never expressed their ideas or made the effort to make them known. Does that make them any less of a genius? Don't over-hype potential if it doesn't deliver.

Two: I would not mind paying $100 or more for realistic collisions/environments that respond to the user's input (explosions, physical contact, etc.), but is that realistic?
No. I'm afraid we are far from having truly perfected collisions/environments, not because of specs but because of inefficiencies across all hardware components, not to mention the real reason: the sheer resources, such as money, needed to program such vast physics engines. You're talking about extending game production times from around 1 year to perhaps 3, not to mention cutting profits due to long waits and technology advances making your game redundant.
Spoiler
Example: Crysis was revolutionary not because of its gameplay, plot or story, but because it set technical standards for games: superior character models and environments, heavy texture use, increased shader use, a better physics engine, and so on.


I agree that now is not a good time to jump on the bandwagon as a budget or smart buyer with the HD 5700/5800 series, but it is a viable option for those to whom money is of no concern.

@ the remark about the PC gaming / console market:

The PC gaming market share has always been small since the introduction of computers. Arcade machines actually started the video game world; consoles took it home, NOT PCs. However, even though PC gaming is a small market share, it is a very stable one, because consoles cannot provide the performance, versatility, or quality of a PC no matter how much money you throw at them - something above-average gamers want. I am not saying console gaming is obsolete, because I do see the value in an affordable gaming system compared to the baffling cost of a gaming PC, but PC gaming is an irreplaceable market.

As far as I am concerned, I have abandoned consoles altogether for PC gaming. While I do see this generation of consoles closing the gap on PC gaming (albeit still far from the quality of a gaming PC), it is still far from reaching the PC's true value as an entertainment, multimedia, news, business, everything device.

I can count on one hand the number of people I know who own a next-gen console and not a PC (Mac or Windows).
November 4, 2009 12:16:39 AM

Good post, but a lot of people who own consoles don't game on their PCs at all. That is the market that has to change, and I feel ATI is doing that.

Somebody did a price breakdown and figured out that you can now build a better gaming PC for less than an Xbox or whatever. By better I mean better graphically.
November 4, 2009 5:07:30 AM

AsAnAtheist said:
OpenCL has the potential to become a standard, but one cannot say it will be widely used in a few years, since few software companies have decided to support it.


More companies than have decided to go with GPU PhysX.

Quote:
Hell, even CS4 only uses OpenGL, for zooming and panning.


Yeah, because it is a year old; OpenCL wasn't anywhere near mature when CS4 was in development, and CS4 simply builds on previous OpenGL support in CS3 and previous work in After Effects.

Quote:
I really don't see OpenCL becoming a standard in software.


It already is a standard, like OpenGL, and one supported by AMD, Intel, nVidia, S3...

Quote:
Anyway, most of its performance is oriented towards video encoding/rendering.


No, it's not; those are simply its primary first implementations, just like they were for consumer CUDA.
November 4, 2009 2:54:15 PM

Quote:
More companies than have decided to go with GPU PhysX.

True, but how many companies out of the whole lot? I have yet to see OpenCL in any game I play.

Quote:
Yeah, because it is a year old; OpenCL wasn't anywhere near mature when CS4 was in development, and CS4 simply builds on previous OpenGL support in CS3 and previous work in After Effects.

We're talking about OpenGL, not OpenCL. OpenGL was mature enough to be implemented far more extensively in CS4.

Quote:
It already is a standard, like OpenGL, and one supported by AMD, Intel, nVidia, S3...

I didn't know AMD, Intel, Nvidia, or S3 made software for consumers. I thought they made hardware for consumers... It isn't a standard; it is just supported. OpenGL is only a standard on Macs, which have a bare 8%-10% market share depending on who you source (and most users don't even know what it does, only that it's advertised as the "best API").

Quote:
No, it's not; those are simply its primary first implementations, just like they were for consumer CUDA.

Okay, please show me proof of performance boosts in games and in applications other than photo/video editing (the Mac uses OpenCL for video/photo editing, giving faster rendering times than a pure CPU-based render). I have seen several displays of OpenGL + OpenCL Havok physics, and I am not impressed.

OpenCL may turn out to be another OpenGL: standard for professionals, non-standard for consumers.
November 4, 2009 3:17:47 PM

Anyone know where my tin hat is?

Mactronix
November 4, 2009 3:23:14 PM

It's far too early to claim any certainty about how this is going to go, so I'll just stick to opinion - as in, everybody has one.
November 5, 2009 4:14:48 AM

AsAnAtheist said:
Okay a bit more educated post by JackNaylor with less bias and more ... wait it's still fail.


My only bias is against dogma being presented as universally accepted fact. I get it: any suggestion that any other product is perhaps worthy of some consideration, and that the purchaser may have views differing from your own, is blasphemy. Get the stake in the ground, gather the kindling and burn the heretic!

As a little league coach, I never want to see the same team win every time, and as a consumer, I feel the market is better served when the competition is close - and the last few years it hasn't been. I actually am enjoying watching nVidia squirm at the moment. But my post was about PhysX, with only a side reference to the two cards.

Be that as it may, I will address your points. Making value judgments based upon YOUR values and declaring that everyone must drink the Kool-Aid and jump on board with you in unquestioned blind faith doesn't fly. I'm quoting published reviews from well-respected, unbiased sites which have stood up to peer review. Broad-brush accusations simply don't carry the same weight.

Quote:
But then he goes on to speak about the GTX 295/260 holding their own against the new 5800 series - really, even if you look at potential only? Interesting how you get to choose what counts and what doesn't, regardless of contradicting yourself. Anyway, in response to his misinformation:


"Potential" is a word that applies to PhysX and DX11. But forgetting the out of context quote for a moment, if your are going to point a finger at somebody for misinformation, at least make a cursory check of the"facts". It should be apparent that the 260 does not compete with the 58xx series but why let accuracy get in the way of a good rant ? The comparisons, at least all the ones I have read, is 295 to 5870 ..... 260 to 5770. Contradictions ???? I'm not alone here....or is anandtech full of misinformation and contradictions too ?

"AMD was shooting to beat the GTX 295 with the 5870, but in our benchmarks that’s not happening. The 295 and the 5870 are close, perhaps close enough that NVIDIA will need to reconsider their position, but it’s not enough to outright dethrone the GTX 295. NVIDIA still has the faster single-card solution, although the $100 price premium (now as low as $65) is well in excess of the <10% performance premium."

Gee, look at that... I've been saying 5-10%, he said <10%. Did I miss the contradiction?

Now let's look at the 260 vs the 5770.

http://www.anandtech.com/video/showdoc.aspx?i=3658&p=14

"The value of the 5770 in particular is clearly not going to be in its performance. Compared to AMD’s 4870, it loses well more than it wins, and if we throw out Far Cry 2, it’s around 10% slower overall. It also spends most of its time losing to NVIDIA’s GTX 260, which unfortunately the 4870 didn’t have so much trouble with. AMD clearly has put themselves in to a hole with memory bandwidth, and the 5770 doesn’t have enough of it to reach the performance it needs to be at. If you value solely performance in today’s games, we can’t recommend the 5770. Either the 4870 1GB or the GTX 260 would be the better buy."

Still no contradictions.

Quote:
Potential-wise the HD 5800 series wins, and in implementation they still win.


Before you can decide who wins, you have to define what winning is. I think "who goes to heaven and what ya get when ya get there" is defined differently when you are talking to people from different religions. Your gaming heaven isn't my gaming heaven, and my gaming heaven isn't someone else's. Looking at your handle, I'm a bit surprised at the blind faith.

I agree with anandtech's articles. No doubt we could find people who disagree, but disagreement doesn't mean either side is necessarily an idiot. IMO it's hard to make a case against ATI's lineup except in two specific instances... and those are the two I quoted. Apparently this partial unacceptance of ATI's overall, across-the-board, unquestionable superiority bothers you for some reason, but I didn't run the benches or write the reviews. I am just pointing them out to people.

I'm not saying my (er... anandtech's) position is absolutely right and the other is absolutely wrong; I am saying that both are valid points of view, and I'm comfortable enough with my position not to feel threatened if someone disagrees with me.

Different strokes for different folks. When you select a car, what "wins" is what meets your needs... a hubby with wifie, 3 kiddies and a spouse out of work has different judgment criteria than a single high-powered exec in his 50's. I got no beef with the hubby who buys the conversion van, as it fits his needs and his budget... I got a beef with the hubby if he rationalizes away that neither he nor anyone else would enjoy driving a Porsche. If it's me, though, I'm thinking: why not both? Van and Porsche... twin 5850s and a GTX something-or-other for PhysX.

Again, I am not saying the 295 is the best or only logical choice; I am not saying the 260 is the best or only logical choice, or that YOU should buy either one. I am arguing against the position that the 5870 and the 5770 are the only cards anyone should consider, and that anyone who makes a choice different from yours is an idiot.

The Yankees had a payroll last year of 209 million; the Phillies, 98 million. Which was the better "value"? The Yankees' owners paid twice as much for a team that won less than 10% more games. Do you think that tomorrow morning Yankees fans will be bowing their heads thinking they didn't get enough "value"? Seems to me if the Phillies wanna be the Yankees, the owners oughta reach into their pockets and pull out some more dough. If ATI wants to be the Yankees, they oughta put out the 5870x2 and knock nVidia's crown off. Of course, when that happens and ATI has the most expensive card on the market, I expect the value argument will lose its luster. But to my mind, if the 5870x2 beats the 295 by 5-10% and increases system cost (not GFX cost) accordingly, this hardware whore is gonna put it on top of his favorites list, where it will sit until I see what nVidia counters with.

Decision making and rationalization are two different things. Making a value choice to fit your budget is a sound decision-making process. You want to buy a 5870 because it comes very close to matching the top dog for a significantly lower cost? Great - your reasoning is sound and I have no issue with it. Rationalizing that anyone who chooses a product that better fits their particular needs and wants must be an idiot is not a road I am going to take or give other folks directions to.

EDIT: Seems THG doesn't see the contradiction either and wholly supports the viability of the 260 / 295:

http://www.tomshardware.com/reviews/best-graphics-card,...
Winner Best Graphic Card for the Money ($140-$200)
Three way tie:
ATI 4870
ATI 5770
nVidia 260

http://www.tomshardware.com/reviews/best-graphics-card,...
Winner Best Graphic Card for the Money ($300-$350)
Three way tie:
2 ATI 4870 in XFire
2 ATI 5770 in XFire
2 nVidia 260 in SLI

http://www.tomshardware.com/reviews/best-graphics-card,...

"Despite ATI's new Radeon HD 5800-series, Nvidia's GeForce GTX 295 (with SLI-on-a-board) is the most powerful single graphics card on the planet. Essentially two conjoined GeForce GTX 275s, the GeForce GTX 295 offers very notable gains over the Radeon HD 5870 in the great majority of game titles, although the Radeon will use far less power doing so.

To get more performance than what Nvidia's GeForce GTX 295 brings to the table, you'd have to look at more expensive solutions costing over $500, say a couple of Radeon HD 5850s in CrossFire. But unless you have a 30" monitor, that would be a gratuitous waste of cash considering the small performance gains you'd get for spending a whole lot more money."


The big winner in the roundup is the ATI 4xxx series, which won 6 categories to just 2 each for the ATI 5xxx and the nVidia 2xx.
November 5, 2009 4:43:33 AM

Remember, this is a new approach, using different shaders, a few other changes, etc.
These cards were also rushed out the door, with many bugs and low performance from their drivers. I'm of the opinion they'll be doing better, and end up where ATI wants them; it all depends on how you bench the 5870 vs the 295, and the 5870 will only get better.
Like I've been saying, recheck these results when they rerun these benches after a few driver updates, when the X2 products release, and consider the new games - not the DX11 games, which of course the 5xxx series will own, but the newer ones. It all depends on how and what's being benched.
November 5, 2009 6:50:31 AM

AsAnAtheist said:

True, but how many companies out of the whole lot? I have yet to see OpenCL in any game I play.


Funny, I've yet to see PhysX implemented in any game I like. So far the CPU does all the best game physics, because it's the only thing currently doing game physics.

Quote:

We're talking about OpenGL, not OpenCL. OpenGL was mature enough to be implemented far more extensively in CS4.


Actually, no, programming in OpenGL was not that mature, and the nature of OpenGL, with limited data parallelism, makes it much harder to work with than OpenCL. What they did with OpenGL was far beyond anyone else at the time, and only recently have other NLE programs and photo editors caught up. And that's the point: PhysX is similarly limited, whereas OpenCL and DirectCompute offer a future of game physics that PhysX just can't hope to offer.

Quote:
I didn't know AMD, Intel, Nvidia, or S3 made software for consumers. I thought they made hardware for consumers...


Really? Guess you didn't read the title of the thread to help you with one of the 4? :sarcastic: 
You can always research the other three on your own, but either you need to get better informed or else be a little less obtuse. :pfff: 

Quote:
It isn't a standard; it is just supported. OpenGL is only a standard on Macs,


Yes, it is a standard, promoted by a group with "standards" in its motto, and also referred to as a standard by the IHVs and ISVs. So once again you really need to research harder, Homer, including looking into this other thing called Linux. :kaola: 
And while those two rely heavily on this open-standard API, Windows also supports it, even if M$ directly competes with it.

Quote:
Okay, please show me proof of performance boosts in games and in applications other than photo/video editing (the Mac uses OpenCL for video/photo editing, giving faster rendering times than a pure CPU-based render).


So now you want to change the rules from my original statement, "Future OpenCL implementations have more potential for being able to affect gameplay, but even that is likely a year or two away." But I'll give you a demo now that isn't a game, but also isn't photo/video editing: fire and ice, baby:
http://www.youtube.com/watch?v=7PAiCinmP9Y

As for performance boosts, you can also look at DirectCompute in a similar fashion and its speed-up of SSAO in games. So if you're so anti-OGL/OCL, then try DirectCompute if it makes you feel more warm and fuzzy as a 'standard'.

Speaking of which, how about a video that discusses the OpenCL industry standard? They say "standard" I think 5 times, which may or may not be standard for such videos. :p 
http://www.youtube.com/watch?v=5-4EKSd3kYQ

Quote:
I have seen several displays of OpenGL + OpenCL Havok physics, and I am not impressed.


I'm not impressed by PhysX. I showed what you asked for; now it's your turn. Show me a GOOD game that uses GPU PhysX for its gameplay physics, after years of promising it. Unlike OpenCL, there's no 'it just got launched' excuse for PhysX, so get me that gameplay proof that Ageia promised about half a decade ago.

Quote:
OpenCL may turn out to be another OpenGL: standard for professionals, non-standard for consumers.


It's a standard for consumers too, whether you understand that or not; more consumers use OpenGL than professionals do, and both are already accepted standards, even if they don't qualify for you and whatever body you claim to represent by saying they aren't.