
Am I "silly" to feel more comfortable sticking w/Nvidia?

August 20, 2008 9:46:05 PM

I want to get a new GPU for all the new games coming out in the coming months. I know the recommended card right now is the ATI series, so I was looking at the 4870. For the same price I could also choose the GTX 260.

I have *always* had an nvidia card, and I know that in the past it's always worked great for me; I've appreciated their good driver support, etc. Something about switching to an ATI/AMD card makes me nervous...

Is that just ridiculous? Is there any reason why I'd want to stick with nvidia and get the GTX 260 next month instead of the ATI 4870?

My only other thought was the eVGA 90-day Step-Up program and the possibility of nvidia releasing a new card in Q4 2008... then I could upgrade to the DX10.1/GDDR5/die-shrink update...?
August 20, 2008 10:05:47 PM

Well, I feel kinda the same way for ATI...

I can't blame you; both cards have good strong points which are important for some.

I can tell you my point of view: I LOVE AVIVO and its features. The easy TV-out and dual displays ATI has in their cards. The image quality and everything related to playback on them. I know nVidia has PureVideo HD but I don't like it. My GF has an nVidia card (7800GS) and I don't like the "feeling" of it when I use it, and neither does she (she went from a Radeon 9600SE to the 7800GS, AGP).

I know it has weak points also: drivers and Linux support (which is actually changing), performance (not price/perf, good old FPS) and some game compatibility issues (CF headaches anyone?).

Well, try to inform yourself about what you're gonna get and those worries should minimize (not disappear imo) and let you have a happy affair with ATI. Focus on a secondary aspect of the card (complex calculation, image quality, scalability, DX support, etc...) and you should have an answer for yourself.

Esop!
August 20, 2008 10:09:40 PM

Performance is definitely close enough that you really can't go wrong, especially with Step-Up.
August 20, 2008 10:15:11 PM

Not at all. Both are great companies, each with their pros and cons. Stick with what you are most comfortable with. The performance differences tend to lean towards ATI at the moment, but that alone shouldn't push anyone out of their comfort zone. The same would be true if Nvidia were on top as well.
August 20, 2008 10:30:11 PM

Whatever blows your hair back, but brand loyalty is for fools.
August 20, 2008 10:40:30 PM

If you really like the whole nvidia feeling, their software support and so on, and it doesn't bother you that much that you could get somewhat more performance for a bit less money, then you should go for nvidia. I'm not being sarcastic here or anything.

A few months ago I thought I was going to get nvidia; I thought it would have the better performance/price ratio (going by the 8800 GT). I too was feeling kinda weird about the idea of getting an nvidia card :p Good for me, I won't have to change brands.

It's good old ati for me :) 
August 20, 2008 10:48:23 PM

Ogdin said:
brand loyalty is for fools.


+1 +1 +1

The only people who lose are those that are fanboys and blindly buy based on a brand or "feeling".

In your case both cards are good, but personally I think the ATI card is a little better, so I would get that one.

I have owned both brands over the years. I buy the best card for the money, period. I don't waste dollars on feelings or good vibes from a company.
August 20, 2008 10:58:57 PM

I have always driven Ford pickups because they have always come through for me. If I went out next year and bought a Chevy because it was a little cheaper and could go a little faster, I would be a fool. I'm in the same boat as you and I'm waiting for a GTX 260+. Just remember, if you buy from eVGA now and the new cards are not released until December, you're screwed. Until we hear something directly from Nvidia, nothing is written in stone about a Q4 release.
August 20, 2008 11:00:46 PM

Operative word being "little". In the case of nV-ATI some loyalty doesn't matter.

If it is like Intel-AMD, though, brand loyalty would royally hold you back.
August 20, 2008 11:04:46 PM

Choose the card by its usefulness, not by what ppl say.

If the card plays COD4 better than other cards but sux at Crysis, and you like COD4... get the COD4 card. Don't just buy a card because it can do a game like Crysis.

Don't get top of the line if you barely play anything lol. And don't buy just because omg Nvidia or ATI said it's the fastest. Research Research Research ppl.

Buying by brand over results is like putting your hand in a bag and just randomly taking stuff out. :D 

The 9800 XT looks and sounds like an awesome card! But it plays Doom 3 at just about 30 fps :) (which was awesome for its time though), and I don't see it doing COD4 the way it's meant to be seen :D 
August 20, 2008 11:20:08 PM

It's not about brand loyalty; I've switched between AMD/Intel etc. But all I know is, nvidia has historically had greater market dominance and excellent drivers and support all across the board. ATI is only slightly up on nvidia right now, and in the long run, I wonder if Nvidia would be the better choice because of its overall market share/dominance, compatibility/drivers, CUDA, etc. I usually stick with a card for a long time (hence I am just upgrading from a 7800GT that I bought nearly 3 years ago).

I also do a lot of video editing on my system...

As for the eVGA Step-Up program, I would be buying the card mid-September... about a month from today... when new games start showing up... so I'd have until mid-December if nvidia refreshed the lineup with a die shrink, GDDR5, and DX10.1... If they didn't, no biggie, I'd probably end up with a Larrabee chip in a year or so anyways, but if they did refresh, it would be nice to spend a little extra and step up at year's end.
August 20, 2008 11:55:23 PM

Sticking with "Nvidia" is foolish...

However, buying an Nvidia GPU based on the fact that it's an EVGA product is not foolish - they offer excellent service.
August 20, 2008 11:55:25 PM

nVidia driver support for their older cards is poor, and their Vista drivers have been quite bad. It got to the point where, when my 7900GTX crapped out a little while ago, I didn't even bother to do the RMA yet (it's a BFG). It worked fine under XP, but not under Vista. Also keep in mind that to get full HTPC functionality out of nVidia you have to pay extra for the PureVideo HD driver or whatever they are calling it this week; ATI doesn't gouge you for extra money to use the full functionality of their cards.
August 21, 2008 1:17:59 AM

dannyaa said:
It's not about brand loyalty


Sure seems like it.

Quote:
But all I know is, nvidia has historically had greater market dominance


Even when they had poorer cards, because people like you are stuck on brand, not performance/utility.

Quote:
and excellent drivers and support all across the board.


That was then, this is now, and recently there's little difference between the two overall in that area.

Quote:
I wonder if Nvidia would be the better choice because its overall market share/dominance,


Sure, because selling a ton of GF9300 & 9500 cards matters to your segment choice, right?
It would matter if ATi's install base were as small as S3's, but it's not, and it's not like the FX and GF6800 series where developers had to code for the differences more than the similarities. If developers just code to the default, they'll be fine.

Quote:
compatibility/drivers, CUDA, etc.


Drivers are a non-issue, and CUDA is something few people need, and those that do know specifically why they do. To most other people it's a buzzword they heard from someone else, and they mistake the end products for the tool.

Quote:
I usually stick with a card for a long time (hence I am just upgrading from a 7800gt that I bought nearly 3 years ago).


So you think for the games launched in 2010 and 2011 that the GTX260 will keep pace with the HD4870? Interesting theory.

Quote:
I also do a lot of video editing on my system...


Which winds up a moot point, both do that very well.

Quote:
As for the evga step up program, I would be buying the card mid September... about a month from today... when new games start showing up...


What, like FarCry 2, Warhead, and Fallout 3? Unknowns, where the last two's base engines favour the HD4870 right now when you crank the settings.
The Step-Up program is a good idea, but if the refreshes are as successful as you're hoping, then they will also cost more, so it won't be a free upgrade in that case, although it is nice to have the flexibility.

Quote:
so I'd have until mid December if nvidia refreshed the lineup with a die shrink, gddr5, and dx10.1 ....


First of all, not DX10.1; that just isn't going to happen. It's a significant redesign for nV; it's not like ATi's tweak to their architecture in the HD2K-HD3K jump.
GDDR5 is possible, but it likely wouldn't be riding a 448-bit memory interface in that case. As for the die shrink, it's going to be about yield improvement on the GTX260 replacement, not a huge speed bump; the GTX280's replacement will get the large speed bump.

Quote:
if they didn't, no biggee, I'd probably end up with a Larabee chip in a year or so anyways,


Interesting how that goes against your "I'm in it for the long haul" statement a few lines up.
Going to Larrabee isn't a limitation for either, but your affinity for this unknown versus your resistance to a known quantity makes me wonder what the point of this thread was in the first place. :heink: 
August 21, 2008 1:23:39 AM

No, you're ridiculously retarded, not silly though.
August 21, 2008 1:37:10 AM

dude chillax!!! lol
August 21, 2008 1:53:30 AM

I was in the same boat last week. I had always used Nvidia in the past. I was looking to upgrade and decided to take the leap to ATI. I purchased a 4870 last week and have NOT been disappointed. No issues with drivers or installation. COD4, Mass Effect, and Bioshock all look awesome. I can play Crysis at 1680x1050 with all settings on Very High (except shadow detail on High) and average 25 fps. I was impressed! I know that isn't perfect, but 25 fps is very playable and it looks great! The lowest the rate dropped was 18, and that only happened once!

There are several great cards available right now. Whether you choose a 260 or a 4870, I'm sure it will serve you well! Good luck!
August 21, 2008 1:59:43 AM

I've used Nvidia and ATI cards and have had positive experiences with both. You're crazy if you decide to buy ATI or Nvidia for some crazy notion of "loyalty".
August 21, 2008 2:01:40 AM

Nah, don't be. It's all about preferences. Nothing wrong with sticking to what works for you. Some people love Asus; other people hate them and prefer Gigabyte.

dannyaa said:
I want to get a new GPU for all the new games coming out in the coming months. I know the recommended card right now is the ATI series, so I was looking at the 4870. For the same price I could also choose the GTX 260.

I have *always* had an nvidia card, and I know that in the past it's always worked great for me; I've appreciated their good driver support, etc. Something about switching to an ATI/AMD card makes me nervous...

Is that just ridiculous? Is there any reason why I'd want to stick with nvidia and get the GTX 260 next month instead of the ATI 4870?

My only other thought was the eVGA 90-day Step-Up program and the possibility of nvidia releasing a new card in Q4 2008... then I could upgrade to the DX10.1/GDDR5/die-shrink update...?

August 21, 2008 2:01:51 AM

the 4870 is awesome:D  I really want to see what DX 10.1 brings to the table:D 
August 21, 2008 2:05:04 AM

It's obvious, it brings a 0.1

As we've seen in the Olympics this can be a very big deal, or nothing. :sol: 
August 21, 2008 2:10:25 AM

lol, very funny... always with the smart comments :p haha.

TGGA, what's your opinion of quad SLI, just out of curiosity?
August 21, 2008 2:21:15 AM

Quad SLI is nice, if you can afford it and have a large enough panel/resolution to truly exploit the benefits. But the diminishing rate of return on adding that 3rd and 4th card makes it not so attractive if you look at it with even a semi-critical eye. Quadfire is similar, but the scaling and pricing are different.

All depends on the intended use, I guess.
August 21, 2008 2:27:57 AM

fair response.

I was always afraid of Quad sli, but seeing it for myself....DANGGGGGG!

I thought it would have the problems every1 said it would have, but it didn't lol:D 
August 21, 2008 2:43:37 AM

All I have to say is: forget all brand loyalty when it comes to computer components NOW and thank me later. I had never owned an ATI card until recently, when my HIS 4870 X2 came in the mail. Since then I have not noticed any differences, except that all of ATI's software actually works on Vista 64-bit, unlike nVidia's. My first video card was an nVidia FX 5200, then a 6600 GT, then a 6600 Ultra, then a 7600 GT, then an 8800 GT, and now finally my ATI 4870 X2. Brand loyalty is totally useless, believe me. I have built a good few computers with both ATI and nVidia cards in them and I have never seen ANYTHING that indicates that either company is better than the other.

Also, the 4xxx series will very likely perform better than any current series from either company in future games. This is not just because of DX10.1, though that helps; the 4870 is simply stronger in certain areas that are becoming popular with game designers. Look at GRID: it uses massive amounts of HDR, most new games seem to be following in GRID's wake, and the 4xxx series dominates in GRID. That said, this could all change very suddenly, which makes it nearly impossible to really predict what will perform better in the future, though the 4xxx series is showing some real potential.

It all depends on what you end up wanting; the Step-Up program IS a good reason to go with nVidia, if only for EVGA. I say price both of them online and buy whichever is cheapest. It is as simple as that.
August 21, 2008 2:53:47 AM

L1qu1d said:
fair response.

I was always afraid of Quad sli, but seeing it for myself....DANGGGGGG!

I thought it would have the problems every1 said it would have, but it didn't lol:D 



^+1

Problems? What problems? ;) 
August 21, 2008 2:55:30 AM

Here is my opinion on this as a "newbie". It is OK to like Nvidia or ATI. If you want to build the latest, greatest, fastest system, however, you must put aside brand preference and go with the hard facts to achieve your goal. In your situation, it sounds like you lean towards Nvidia and you have used a 7800GT for 3 years; the good news is, you could just buy an 8800GT now. The card came out around $300+, and now you can grab a new one for right around $100. I say stick with Nvidia if it makes you happy, but don't buy the latest Nvidia (G200)... because ATI is beating them right now. I like Nvidia myself, and run 8800GTs in SLI, but I won't buy the newest Nvidia cards...
maybe late next year when a game comes out that will need more than the 8800s. jmho & $0.02 (... I use a 19" monitor....... one day I shall get a 24 incher!!!)

August 21, 2008 2:59:48 AM

How sweet would it be to have a gig of GDDR5 on a 448-bit bus? Can anybody calculate the bandwidth of that? I'm kinda lazy. If NV made a 260 like that I would totally buy it.

Oh, and I like NV because I'm using one right now. When I use my other PC I'm a huge ATi fan. I have schizophrenia.

I also really enjoyed TGGA's first response.
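Quick back-of-the-envelope answer to that 448-bit GDDR5 question: theoretical memory bandwidth is just the effective data rate times the bus width in bytes. The little sketch below (plain C, also builds as a .cu file) plugs in the HD 4870's commonly quoted 3.6 Gbps effective GDDR5 rate as a stand-in for the hypothetical card; what clocks NV would actually ship on such a card is pure guesswork.

/* bandwidth.cu - hypothetical numbers, not a real product spec */
#include <stdio.h>

int main(void) {
    double gddr5_rate = 3.6e9;  /* HD 4870-class GDDR5: ~3.6 billion transfers/sec effective */
    double gddr3_rate = 2.0e9;  /* GTX 260-class GDDR3: ~2.0 billion transfers/sec effective */

    /* bandwidth (GB/s) = transfers/sec * (bus width in bits / 8) / 1e9 */
    printf("HD 4870, 256-bit GDDR5:          %.1f GB/s\n", gddr5_rate * 256 / 8 / 1e9);
    printf("GTX 260, 448-bit GDDR3:          %.1f GB/s\n", gddr3_rate * 448 / 8 / 1e9);
    printf("Hypothetical 448-bit GDDR5 card: %.1f GB/s\n", gddr5_rate * 448 / 8 / 1e9);
    return 0;
}

So a 448-bit bus with 4870-class GDDR5 would land around 200 GB/s, versus roughly 112 GB/s for the stock GTX 260 and about 115 GB/s for the 4870.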
August 21, 2008 3:33:35 AM

dmplus said:
+1 +1 +1

The only people who lose are those that are fanboy's and blindly buy based on a brand or "feeling".

In your case both cards are good, but personally I think the ATI card is a little better so i would get that one.

I have owned both brands over the years. I buy the best card for the money period, I don't waste dollars on feelings or good vibes from a company.


Let me rephrase that "feeling" thing with PureHD: I don't like the way the video is de-interlaced on playback. I don't like the way the nVidia Control Panel shows stuff to me. I don't like the way nVidia lays out its options, trying to be "advanced" when it's the same as in "basic". I don't like the way the image/video is decoded to the TV screen (image quality itself). I feel very displeased with the overall interaction with nVidia software (not drivers).

Don't be so literal with the words, but I can't blame you though.

Anyway, the state of "being on top" for most gamers is "having the best performance", but remember that video cards are not "gaming cards" alone. They also offer a wide range of "things" to choose from (and that's not even counting the "pro" lines - Fire and Quadro). Software is part of the deal, always remember that. Useful/meaningful or not, it's up to the buyer.

Cheers!
August 21, 2008 3:38:55 AM

It is true nVidia software is useless...
August 21, 2008 8:23:05 AM

In regards to CUDA, does anyone know what Snow Leopard is being coded to take advantage of? Would an NVIDIA CUDA card benefit performance in Snow Leopard next year?
August 21, 2008 8:23:43 AM

Well, I'm not a fanboy of ATI, although my last 3 cards were ATI based... because of a bad experience with Nvidia's FX >.< I know things have changed; both have advantages and disadvantages... One of ATI's is that their drivers are young, but they will keep getting updated and will get better... Nvidia's drivers are not up to standard, or so I hear?... Nvidia cards are too expensive for what they really are and the performance gains... ATI apparently has better image quality, particularly where AA is concerned; I was reading something about that but I can't remember where it was from... However, the main disadvantage of the ATI HD 4870/50 is its heat issue, although it's not really an issue for the card itself since it was tested to extreme heat. This may be the reason why ATI cards don't overclock too well, unlike Nvidia cards, where you can overclock the hell out of them.
Hope that helped
August 21, 2008 12:16:27 PM

Once quad SLI and Quadfire start looking better on a PC screen than they look as components plugged into a motherboard, I'll get interested.
August 21, 2008 12:33:38 PM

I've owned both, used both extensively for various things, and found ATI to be superior in various things, but gaming? Depends on the time. I find it actually relieving and nice to switch from one to the other. It's like going home and remembering when, heheh. Get the best you can afford, whether it's quad SLI or Quadfire, or even a measly GT. I prefer ATI, but have loved my nVidia cards' performance in gaming too. Open it up a little, then you can experience both. It's like forming an opinion of food without eating it; you may not know what you're missing.
August 21, 2008 12:35:48 PM

I just upgraded from the original eVGA 8800GTS 640MB to a Visiontek HD4850 and I love it. If nVidia comes out with a better card next year... I may buy that, but right now, ATI seems to be top of the heap. (Or if not officially top... it's a damn close tie.)
August 21, 2008 1:23:50 PM

I'm another enthusiast here who's made pretty extensive use of both brands of card. And yes, "silly" could be a term used for having a preference for a particular chipmaker.

It's fine, though, to hold a preference for a particular PRODUCT over another, or a feature, as long as you're consistent about that product or feature, since those are tangible things that actually MATTER. But on its own, there is nothing that makes nVidia's GPUs inherently different from AMD's. As such, because there is no inherent difference, there is no truly rational reason to favor one. While there may be some things they have that are company-wide, those still are not constants; they change with time, and one clear example is drivers. Both ATi and nVidia have, at different points, made some particularly horrendous drivers; nVidia's earlier DX10 performance comes to mind, for example... But right now, they're about even in that department.

However, this is not the case when you switch from the GPU's designer (which is all nVidia does; TSMC fabs the chips, and board partners make the actual cards) to favoring a specific board partner; those tend to have inherent traits. Mainly, what policies they hold, such as their long-standing warranty policy, which varies depending upon the board's manufacturer alone. A real big one here is eVGA's "step-up" program. Yes, this can be something that would be a rational factor to include in one's decision.

Aside from that, it's a matter of finding a SPECIFIC model of card that does the job well; just because nVidia's 9500 GT cards are good doesn't mean that they're going to be the right maker for high-end setups for you, and likewise, just because the 4850 is the best bang for your buck out there right now (possibly contested by the 8800 GT) doesn't mean that AMD would make the best choice for a sub-$100US gaming card... In neither case, does the performance of a particular card matter for how good a card in a different segment is, since you're still comparing two separate models of card.

Since we seem to be discussing the fairly high-end cards here, right now, there is little justification for nVidia's existence in this sector. Potentially, if someone plays at high resolutions but UNLIKE most high-end gamers, prefers AA off, then logically the GeForce 9800 GX2 would make a good choice, since while its performance with AA leaves something to be desired for a cost a bit above a 4870, without it the card scales rather well to higher resolutions. Aside from that, though, most users looking to spend more than $170US or so are best suited by a 4800 series card. (or two, depending on their budget)

As far as nVidia's future cards go, don't bet that they'll really make major tech advancements for GT 200b. As far as can be told, it will offer nothing more than just a smaller die, with nVidia suggesting that it won't offer DX 10.1, (which they still propose is pointless) nor would it offer GDDR5 support, since in most games it appears that the GTX 280's performance is limited by the stream processors, and not by memory bandwidth at all. (as also hinted at how it appears that the 4850 with only 64GB/sec of bandwidth is not limited by it, as the 4870's performance improvement matches the core power increase)
August 21, 2008 3:29:31 PM

dannyaa said:
In regards to CUDA, does anyone know what Snow Leopard is being coded to take advantage of? Would an NVIDIA CUDA card benefit performance in Snow Leopard next year?


Knowing Apple, probably OpenCL.
Anywhoo, it's not Apple that codes for CUDA so much as CUDA being adapted to run under the OS X environment.
Likely very similar to making it work under Linux.
August 21, 2008 4:40:54 PM

For future games it's all DX10.1 // required to go to DX11. For right now it's older DX9 and most new ones are DX10.1. Not to mention, omg, ya gotta have Vista? Are they gonna do DX10 for XP? I wish.

Nvidia doesn't do DX10.1; they do DX10.0. And the physics thing = pass.

The 4870 is new, fast, fat, does anything, some things really well
- like anti-aliasing -
Early driver reports were the usual new-release chaos; it's been updated several times since.

Look into what you get and what's useful to you, now or later. The various manufacturers can build little differences into the reference designs. Look at e.g. PowerColor - they and others might release cards with extra memory or overclocks or cooler options.
ALSO options re Hi-Def, HDMI, DVI, etc. / hardware and software / connectivity and capability and performance.

I am looking at buying an ATI HD 4850 and reserving mobo real estate for the outside chance of a 2nd 4850. /ref: AMD 790GX / SB750 chipset.

And as for brand loyalty = I am an AMD fanboy = period.

BUT - cos they are not spintel, I feel sorry for nvid's problems right now w/ the recall, cos they are partners in the spintel wars - and nvid and ati have been hard competitors for years - ati has regained 25ish%+ market share since the 4800 series - I wanna see what nvid & ati do to Larrabee - spintel's latest market hype re onboard vid.
August 27, 2008 4:41:22 PM

OK, so it seems to make no real difference between ATI/Nvidia *overall* - each has give and takes, some things better, other things worse. And right now, ATI's card is superior in performance and features. SO, I am leaning towards the 4870 right now.

My only remaining questions are:

- Games that say "best on Nvidia" - how true is this really? What makes a game play better on "nvidia" other than that Nvidia paid them to put that logo on?

- OpenCL and harnessing GPU for video: While I am upgrading my card with gaming in mind, my responsible side knows I use my system more for video editing than anything else. And the future in video stuff is GPU acceleration for video encoding. My main concern is that Nvidia will have the upper hand here due to CUDA. But then someone mentioned OpenCL... so...

----- Does ATI support OpenCL in their current 48xx series?
----- If so, is OpenCL likely to gain wide support? Just as wide as CUDA, or wider?
----- Am I likely to miss out on GPU acceleration for video stuff by not going with Nvidia due to CUDA?

August 28, 2008 5:09:12 PM

dannyaa said:

- Games that say "best on Nvidia" - how true is this really? What makes a game play better on "nvidia" other than that Nvidia paid them to put that logo on?


As true as Coke or Pepsi's slogans.
Oblivion is an nVidia title, yet at launch, and shortly after with the launch of the next round of cards, Oblivion favoured ATi's design.
The same works the other way around for the titles that have ATi's logo; it does not guarantee that that is the best performing card for the game.

Quote:
- OpenCL and harnessing GPU for video: While I am upgrading my card with gaming in mind, my responsible side knows I use my system more for video editing than anything else. And the future in video stuff is GPU acceleration for video encoding. My main concern is that Nvidia will have the upper hand here due to CUDA. But then someone mentioned OpenCL... so...


Cause BadaBoom was such a success? ;) 
It has little/nothing to do with CUDA; that's like saying a game is better because it was compiled on an Intel or XP platform rather than an AMD or Vista platform. It's the end product that matters, not CUDA. All CUDA is is a developer front-end interface to access the raw GPU processing power; ATi also uses Brook+ and CTM/CAL to do the same thing, and the difference is on the developer side. OpenCL and DX11 will give people a standard platform, like OpenGL and HLSL/DX do right now.
Anywhoo, both will be doing mainstream video acceleration, and both currently do accelerated encoding, but the tools are just starting to arrive and the best ones are pro/paid-for tools (even BadaBoom's really useful version is a pay version).
The plug-ins for apps like Adobe and Power Producer are really the area most people will be interested in.
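To make the "developer front end" point concrete, here is a minimal, hypothetical CUDA C sketch (the kernel name and file name are made up; the API calls - cudaMalloc, cudaMemcpy, the <<<blocks, threads>>> launch - are standard CUDA): a trivial kernel that adds two arrays plus the host code that runs it. It builds only with NVIDIA's nvcc and runs only on NVIDIA GPUs, which is exactly why an app written directly against CUDA is NVIDIA-only, while the same logic expressed through OpenCL or DX11 compute would run on any vendor's hardware.

// vecadd.cu - hypothetical CUDA sketch, not taken from any real plug-in
#include <cstdio>
#include <cuda_runtime.h>

// One GPU thread per array element.
__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // ~1 million floats
    const size_t bytes = n * sizeof(float);

    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;                    // buffers in GPU memory
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);   // 256 threads per block

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("hc[0] = %.1f\n", hc[0]);        // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}

An OpenCL or DX11 compute version would look much the same on the device side; the difference is that the host code would enumerate whatever GPU (or CPU) the standard runtime exposes instead of assuming an NVIDIA device.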

Quote:
----- Does ATI support OpenCL in their current 48xx series?


Yes, and also the HD3800, 2900 and X1900 series, and nV's GF8, 9 and GTX series (not sure about the GF7, because it is very different and very weak [a GF8400 would usually outclass it]).
However, it will depend on what you're doing with it and the level of support; like OpenGL, you can support various features under it. It will be a platform to build upon, like OGL and DX currently are.

Quote:
----- If so, is OpenCL likely to gain wide support? Just as wide as CUDA, or wider?


It'll most likely gain wide support. How much support compared to CUDA depends on adoption by developers, but right now AMD, Apple and Intel all support OpenCL, nV supports CUDA of course, and Microsoft supports DX11 of course, with AMD and Intel supportive of the idea of DX11's compute shaders.

Quote:
----- Am I likely to miss out on GPU acceleration for video stuff by not going with Nvidia due to CUDA?


No; these things have been available on both and will continue to be available on both in the future. Regardless of whether apps use one or the other (or both), companies like Adobe aren't about to cut off their market share by supporting just one; they will offer consumer plug-ins for both. Some companies, like Sony, are waiting for the dust to settle, so you won't know their stance until later.

The thing is that this is just like the HD-DVD / BluRay debate/debacle: it had little to do with technical merit or utility, but pretty much everything to do with marketing and consumer adoption rates (people declaring one the winner and the other dead and switching next quarter/season right up until the end).

Right now GPGPU is early for consumers, but it's going through its 3rd-gen growing pains for those of us using it professionally (mine's boring number crunching for mapping software [we're split, having just added a ton of HD2400s to everyone's standard generic platforms for a few of the tools we use, and a bunch of Quadros for dedicated workstations for another tool we use]).

The thing is, for most consumers: focus on what you use it for near term; long term you can always upgrade or even just add a second-hand GF8600GT or HD2600Pro card when you want to use one of those tools (easier under XP than Vista due to driver issues).

Personally I would say that you shouldn't worry about it as much; the truly big app houses like Adobe will use both, and for the smaller apps, pick the one that's proven its utility in what you want to do. Right now it's like throwing darts at a rotated dartboard with the numbers taped over, hoping that you hit a good number once the tape is revealed. It's still way too early to know the best pick for sure.

Both have advantages and tradeoffs in the present and future; the best thing you can do is research the features they currently offer you and buy for that, while hoping things are good for you in the future too.
August 28, 2008 6:33:01 PM

Great Ape, thank you for the excellent and thorough response(s).

Just one point of clarification:

If developers code/compile on CUDA, from what I've read, it will only work on Nvidia hardware. Correct?

However, if developers code/compile for OpenCL or DX11, it will work on all GPUs?

So basically, if CUDA gains wide support and developers code on that, Nvidia cards will be needed to accelerate those apps? And, since Nvidia-exclusive apps are less desirable to developers who do not want to cannibalize sales, most apps will most likely not go this route?
August 28, 2008 8:42:47 PM

dannyaa said:
- Games that say "best on Nvidia" - how true is this really? What makes a game play better on "nvidia" other than that Nvidia paid them to put that logo on?

Not true one bit. As TGGA mentioned, such games often actually perform better on ATi's cards rather than nVidia's. Two good examples are Unreal Tournament 2004 and ESPECIALLY Oblivion, at least on the Radeon X800 cards common at the time, where an old X800XT could typically best the on-paper much-superior GeForce 7800 GTX in that game.

What makes a game an "nVidia game" is simply that nVidia paid the developer (or more often the publisher) money. In other words, it's an advertisement, nothing more. Ditto for similar ads for Intel's CPUs on games (especially in the era when, in spite of "plays best on Intel," Intel's Pentium 4s were getting whomped by Athlon 64s), and for the few rare cases where ATi/AMD has actually advertised their hardware on a game.

dannyaa said:
----- Does ATI support OpenCL in their current 48xx series?
----- If so, is OpenCL likely to gain wide support? Just as wide as CUDA, or wider?
----- Am I likely to miss out on GPU acceleration for video stuff by not going with Nvidia due to CUDA?

Yes, the 4800s have OpenCL support. Personally, given that OpenCL has the wider backing of major companies while CUDA has pretty much just the support of nVidia, I'd expect OpenCL's support to end up at least as wide.

Even in the unlikely scenario that CUDA winds up dominating the GPGPU scene (rather than going the way of physics cards), it'll be a number of years before that dominance actually means missing out on anything.

dannyaa said:
If developers code/compile on CUDA, from what I've read, it will only work on Nvidia hardware. Correct?

However, if developers code/compile for OpenCL or DX11, it will work on all GPUs?

So basically, if CUDA gains wide support and developers code on that, Nvidia cards will be needed to accelerate those apps? And, since Nvidia-exclusive apps is less desireable to developers who do not want to cannibilize sales, most apps will most likely not go this route?

Correct, CUDA only works on nVidia's hardware. And your conclusion about developers opting not to take the nVidia-exclusive route is also logically correct; that's the main reason CUDA might fail. OpenCL has the support of AMD, as well as Intel and Apple, which means that Larrabee should be compatible with it.

And of course, DirectX 11 should be compatible with all cards, since DirectX already has an effective monopoly on games.
August 28, 2008 8:53:15 PM

dannyaa said:

If developers code/compile on CUDA, from what I've read, it will only work on Nvidia hardware. Correct?


Yes, currently that is the case, but as it is a front end for accessing the hardware, they could change that (although it's unlikely now). If you made it into generic libraries that were not architecture-specific, you could write something in CUDA that would work on ATi, Intel and VIA/S3; it just depends on what you target.
But it wouldn't matter, because you wouldn't be writing an Adobe plug-in so much as an nVidia plug-in for Adobe, and then you would use something else, like CTM or CAL, to write an ATi plug-in for Adobe. Think of it as the programming language for the hardware, because what CAL/CTM and CUDA essentially do is let you easily program the components that make up the hardware. This function will be taken over by OpenCL and DX11.

Quote:
However, if developers code/compile for OpenCL or DX11, it will work on all GPUs?


Yes, because they will work toward making it a standardized unit and function. Hardware will also be meant to work towards a unified input/output design, similar to DX10's stream processor standardization (which is still very limited to data that is very vertex-related in nature).

Quote:
So basically, if CUDA gains wide support and developers code on that, Nvidia cards will be needed to accelerate those apps?


Somewhat, but not as you're thinking. It also still wouldn't lock the app into one method; other methods could still be used, and it would just depend on whether or not it was easier to add support for nV hardware or another IHV's. For the major applications, I know first-hand that ALL 3 of the major vendors are helping developers integrate their options into the apps.

Quote:
And, since Nvidia-exclusive apps is less desireable to developers who do not want to cannibilize sales, most apps will most likely not go this route?


Yes, exactly. IMO it would be the smaller niche apps that would benefit from a simple integration assisted by one IHV, if they got a lot of help early and before other apps. However, I also think that it would be those very apps that would prefer a unified standard as the long-term approach, since if you're only selling 100,000 copies a year, do you really want to make people choose based on their hardware? Just as easily they could choose another app from the big player in that market (like Adobe) instead of choosing another piece of hardware to run the niche app, and thus you lose sales.
I think it would've been easier to sell a specific solution a year or two ago (had AVIVO actually been hardware-specific other than as a device_ID check), when you could've leveraged that benefit while one solution had multiple times more power and a distinct advantage. Now you have 2 very capable platforms with a 3rd coming, and the benefits of balkanizing your own app would be much smaller and more likely to hurt than promote sales.
If Ulead or someone had come out with a GF7-only or X1K-only encoder that was high quality and high speed 2 years ago, I wouldn't be surprised if they had doubled or tripled their sales (or Adobe would've solidified theirs), but now just about everyone is looking at it, and everyone is proposing it, so narrowing your market would just give the competition the ability to market their solution, especially if they offer not only A, but A+B+C. The time for exclusivity IMO has passed.

Take a look at the HD-DVD / BluRay issue as an example: how much do PowerDVD and WinDVD really care who wins, as long as it doesn't cost them too much and it means more sales, not less? These companies are not here to help the IHVs; they're here to sell software, and if they aren't selling it for the competition's hardware someone else will, and if that someone sells to both, then you risk losing all your customers.

No one really likes being locked into a single solution.
August 28, 2008 8:55:18 PM

Haha, we had a fire alarm here at work, where I'm a Fire Warden, and I stopped mid-post; came back to complete the post and NTK had already replied. :o 
August 29, 2008 6:48:17 AM

Wow, great replies. Thanks all.

Looks like the 4870 it is. Any specific recommendations on a vendor? I know it all comes down to warranty/bundle, but many ppl prefer eVGA on the nvidia side. Any strong recommendations for an ATI/AMD brand?

Also, I've read the 48xx series runs a bit hot on stock cooling. I'm thinking of picking up a 3rd party cooler for it, any recommendations on what to look for? Something that will keep the heat down significantly but not increase the overall cost too much?
August 29, 2008 7:01:24 AM

HIS & Visiontek are my two preferred ATi makers (Visiontek has a great warranty). I'm interested in seeing what Gainward does with ATi; they were great with nVidia.
August 29, 2008 7:50:23 AM

Proud owner of an XFX GTX 260 XXX edition.
Specs are very close to the GTX 280.
I have had this card for about 1 month and I am very, very happy.
I suggest you buy this card. It's about 15-20% better than the 4870, which is also a good card.

P.S. Sorry for my poor English
August 29, 2008 8:09:16 AM

Not if you overclock the 4870.
August 29, 2008 3:34:59 PM

True, but some people who buy $2000 worth of computer parts and $300 for a GPU don't want to overclock.
; )

not many really.
August 29, 2008 11:25:52 PM
!