GTX 280 vs Radeon HD 4870 x2

darktravesty

Distinguished
Jan 7, 2008
35
0
18,530
Well, I saw the news about the Radeon HD 4870 supposedly beating out the GTX 280 in benchmarks today, so here's my question: at this point, money being no object (well, within reason; $1,000 is too much), which card should I go with? Up until now I had been betting on the GTX 280, but I'd really hate to fork over the cash and get a $650 card that underperforms. Another problem is that I'm under a bit of a time constraint, so I'll probably need to get a card soon, meaning the Radeon HD 4870 may be out of the question anyway.

Thoughts?
 


Why not wait until the actual cards appear? If you can hold out until the HD4870 shows up, then wait; if you can't, then get the best card available at the time you MUST buy.

'Cause right now I'd say go with the S3 Excalibur, it's gonna be KickA$$ Yo, like the name is so P1MP.... errr.... whatever. [:mousemonkey]
 

3Ball

Distinguished
Mar 1, 2006
1,736
0
19,790


I still believe the GTX280 is going to be the faster card. By how much, I cannot say, but if you are under a time constraint then I would just get a GTX280 as soon as you can. Even so, I would wait for some benchmarks of the card to come out before you make your decision. When they start appearing, come back here and ask again if you are still unsure; it will be easier for us to give you our opinion that way.

Best,

3Ball
 
Patience.
Until the cards are available, you can't do anything. When they show up, there will be real benchmarks in games, not speculation. Once both cards are on sale, price/performance will be similar, or the underperforming card won't sell. If you need the fastest, wait until you can compare them both.
 

3Ball

Distinguished
Mar 1, 2006
1,736
0
19,790


This guy's a thinker...I like it! I say...run with it! ;) lol

Best,

3Ball
 

pcgamer12

Distinguished
May 1, 2008
1,089
0
19,280
Wait for both releases. Read real benchmarks, ideally from a site like HardOCP.com, where they do worst-case-scenario testing so you know you'll get good frame rates throughout the whole game. For example, with an 8800GTX you don't get 34 FPS at 1680x1050 throughout Crysis; it sometimes dips into single digits.
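
To see why those worst-case numbers matter more than averages, here's a rough sketch (in Python, with made-up frame times, not data from any actual benchmark) of how an average FPS figure can hide single-digit dips:

# Hypothetical frame times in milliseconds: mostly smooth ~34 FPS frames,
# plus a handful of 150 ms stutters like the Crysis dips described above.
frame_times_ms = [29.4] * 950 + [150.0] * 50

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds   # average over the whole run
worst_fps = 1000.0 / max(frame_times_ms)        # slowest single frame

print(f"average: {avg_fps:.1f} FPS, worst case: {worst_fps:.1f} FPS")
# Prints roughly "average: 28.2 FPS, worst case: 6.7 FPS" -- the average
# looks playable while the worst case clearly is not.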
 

area61

Distinguished
Jan 8, 2008
280
0
18,790


What's an S3 Excalibur???? :heink:
 
I wouldn't trust HardOCP, there's a little too much anti-Red pro-Green in Kyle's playhouse*.

IMO, check Beyond3D (they don't compare Red vs Green in the usual fashion; very technical reviews), The Tech Report, Xbit Labs, ComputerBase.de, Digit-Life, and FiringSquad for a good spread of tests (resolutions, settings, and games) for any new card... oh, and of course Tom's. :D

Anywhoo, the more tests/benchies the better, IMO. Don't limit your information; even look at the notoriously biased sites, just be aware of which ones have a history of things like omissions or 'errors'.

*See Kyle's statements about the R600's power consumption long prior to launch and see if that holds up for the GTX280.
 

roadrunner197069

Splendid
Sep 3, 2007
4,416
0
22,780
LoL, I can't wait to see the whiners who buy 280s for $600 and then cry a week later when the much cheaper 4870s whoop them.

Nvidia should never have pissed in Intel's cheerios. If they had licensed SLI to Intel, SLI would be a lot better than it is. CrossFireX is about to be the next best thing since sliced bread.
 

sciggy

Distinguished
Apr 15, 2008
318
0
18,780
If the 4870 is as good as some people are saying, then I'm glad I got an X48 board to CrossFire those babies :) Just a few more weeks till I can upgrade this friggin' 7900GT *twiddles thumbs*
 

darktravesty

Distinguished
Jan 7, 2008
35
0
18,530
I'm just finding it difficult to believe that the 4870 is -that- much better than the GTX 280, let alone better at all, but I'm not terribly educated on the matter.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
Unless game designers get sponsorship from ATI, the status quo of Nvidia cards outperforming ATI cards in games is likely to stay as is.

But if the ATI cards do outperform the G200 series, then it's great news for everyone, because they are pretty inexpensive in comparison.
 

Ogdin

Distinguished
Jun 14, 2007
284
0
18,780



Until some real benchmarks come out, it's all guesswork and smoke and mirrors as to who is faster.
 
G

Guest

Guest
First off, we are talking about the 4870 X2, right? The 4870 itself is supposed to compete around the 9800 GTX level...

Anyway, if it performs better than the GTX 280, that's amazing... most games are sponsored by Nvidia, meaning they are more optimized for Nvidia, so if ATI performs better in a game it didn't sponsor, just imagine how well it will perform in an ATI-sponsored game.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


Outperforming is relative. It depends on the game, the resolution and the CPU. Many benchies are done with Intel quads, especially Extreme Editions. Since CPU limitation is a factor, people with midrange CPUs don't always get the exact same results as the benchmarks.

What they do get from benchmarks is an idea of which card is best for which game. Though the game I'm playing most right now is an Nvidia TWIMTBP title, that doesn't mean all that much in terms of performance.

Whether Nvidia gave the developers free green cards, or simply paid up front for access to code so they could optimize drivers for release day, it doesn't mean much once a game's been out for a month or so. Nearly every game needs post-release driver optimization from both Nvidia and ATI.

IMHO, ATI is better because they have the best price/performance in games and the better visual quality in video. AVIVO beats out PureVideo in price and in video performance.

But if the benchmarks at HC or any other site show a 3, 5, or 10 fps difference in the game you like at the resolution you prefer, then by all means go Nvidia. Just remember that if you don't have an equivalent CPU, you may not get the exact same results.

I'm sure there will be some optimizations that give the GTX280 the edge over the 4870, though the 4870 X2 will probably counter Nvidia's high end at a much better price this fall.

I'll go 4870 X2 when I get a better CPU and motherboard; I'm looking forward to it. To those who go GTX280, I simply say: enjoy.
 
Speaking of CPUs and GPUs, consider this: if some truly demanding games arrive (other than Crysis) and you need that X2 or SLI level of performance, remember that SLI is currently an AMD CPU or Nvidia chipset scenario, while the X2 can run on anything. There won't be a GTX280 X2, as we all know, so anyone thinking of getting the best should keep this in mind before buying.
 

dtq

Distinguished
Dec 21, 2006
515
0
18,990


My final decision won't be made till I see the release benchmarks from both manufacturers. But it's looking quite likely to be a GTX280, because the game I play most often gains nothing from CrossFire or SLI :fou:. That's just going by rumoured performance, though, and the expectation that a single GTX280 will heavily outperform a single 4870.

The motherboard I have is CrossFire compatible, not SLI, so I'd heartily welcome a killer single-GPU ATI card. I would like a good 50%+ performance increase over my 8800GTX from a single GPU; whoever can provide that will get my money. I would also welcome the chance to see the ATI video difference first hand, as every movie I watch is on an 8-foot-wide screen powered by Nvidia. I have no complaints about image quality (it's enough to drop most people's jaws when they see something like Star Wars on it), but I'd like to see the difference for myself. I suspect a new projector would do more for me than a new graphics card, though. I quite fancy going Blu-ray and native 1080p on the projector, but I spend more time gaming than watching movies, so new graphics cards come first.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


Well, PureVideo has improved a bit since Tom's and Anandtech did their reviews from 2005 through 2007. Right now, Nvidia handles picture-in-picture HD content better, which matters for those of us who watch on our monitors.

Still, ATI has better image quality:

Media Enthusiast Observations:
If you expected both images to be similar, then this is where the surprises start. There are clearly major differences in image quality during this scene. In the ATI image we can see far more detail in Eva Green's hair, whereas in the Nvidia image it is too dark to make out much of the texture. The same can be said of Daniel Craig's jacket; on ATI the collar is visible, whereas the Nvidia image shows no definition between jacket sections. In terms of skin tone, the Nvidia image is clearly darker with more reddish tones, whereas the ATI one is lighter with more of a grey/green cast...

...Overall we can gather the following from the above information. While Nvidia's image looks more saturated, in fact it is not. ATI are using higher levels of cyan content to create the illusion of a more natural image and, to be fair, it works very well. Nvidia are using a redder hue; however, rather cleverly, it is not affecting black or neutral tones and is clearly a decision on their behalf to present a softer, warmer image.

Personally I prefer ATI's image, especially for detail in the mid-tone to black range; however, I have a feeling that Nvidia's image would look more vibrant and full of life on a poorer screen.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=6

A few pages earlier, in regards to CPU usage and picture in picture:

Not much has changed with video acceleration performance since we last compared ATI to Nvidia. The 3870 still has the edge in CPU usage, especially in VC-1 encoded titles, but Nvidia still have the feature advantage with the ability to play back high-definition content with Aero enabled...

...Nvidia also have a great feature called Dual Stream Decoding. Using Resident Evil: Extinction (AVC codec) as an example, the 9600 will accelerate the movie and ensure that Aero is still being used. The cards will also accelerate a secondary stream; a good example is the picture-in-picture special feature which can be enabled when watching the movie. On ATI hardware, using picture-in-picture takes average CPU use from around 3% up to 14%, whereas Nvidia hardware stays below 8%. These figures really do make us question ATI's apparent dominance in video playback.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=3

More on image quality:

Media Enthusiast Observations:
With this outdoor scene we see a significant difference in overall brightness between the two images, and the result is that once again Nvidia lose much of the finer detail in areas such as dark hair. There is less of an impact in the rest of the image, and Eva Green's skin and dress appear to have a much warmer tone on Nvidia, although it is hard to say which version of her skin is most true to life...

...There is no doubt in my mind that ATI are displaying a technically better image in this scene; however, on the screens I tested, both look rather good.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=7

More on image quality:

Media Enthusiast Observations:
In our final Casino Royale comparison we see that once again there is much more detail in the ATI image, with the right side of Daniel Craig's face being significantly more visible. On the Nvidia side, the overall image is more alive, and the skin tone gives the impression that the actor has really been through an action-packed adventure in a hot and humid country. It is just more real.

So which is better? Well, again, first looks can be deceiving. The ATI frame immediately looks more neutral, but this is because there is a good 10% higher cast of cyan throughout, which, while it works well in some instances, does not when a face dominates the scene. Natural skin colour has almost no blue content, especially in a warm jungle setting, and this ties in with Stuart's comments above that the overall Nvidia image is 'more alive'. That said, after analysing Daniel's face, we found that the ATI solution had only a 2-3% higher content of cyan tones, so clearly there is some clever coding behind the scenes, analysing the skin tone and reducing the cyan saturation. Still, overall, the technical results are in favour of Nvidia.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=8

Background environment image quality:

Media Enthusiast Observations:
When we move away from Casino Royale's more human-based content to the more scenic BBC documentary Planet Earth, we see that the differences in image quality occur regardless of the content used. In this particular image there are three main areas we highlighted as important. First, the sky: on ATI hardware it is quite washed out, whereas on Nvidia it seems more real, a better shade of blue. Next, looking at the darker area at the base of the waterfall, there is clearly more detail in the ATI image. Finally, the foliage throughout the scene is more lush and green on Nvidia than on ATI...

...Overall, both images are excellent; however, from a technical standpoint I would give Nvidia another point for this test, as their image is purer, although the sky tones are slightly flat.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=9

More background image comparisons:

Media Enthusiast Observations:
When watching the Grand Canyon section we see that the cliff face on the left of the screen is more impressive on ATI; in particular, the small cave area has much more detail visible. We also feel that the colour of the rock on the left is slightly more realistic on ATI. On the Nvidia side, we feel that the lower sections of the valley have far better definition, especially in the distance. It is also noticeable that the sky differs between the two images; if we had to choose a preference, it would probably be ATI's version, as the Nvidia one seems slightly too blue.

Expert Observations:
Taking a section of the sky and analysing the colour components shows again that ATI are using more magenta to give the impression of richer colours (Cyan 27%, Magenta 14%), while Nvidia have 27% cyan and only 1-2% magenta. This ties in with Stuart's comments about the Nvidia sky being 'slightly too blue', as without the magenta content the sky appears lighter and much colder. Glancing over the scene, the Nvidia image is slightly more 'blasted out' (a term signifying less detail in the highlight areas), as you can see from the stream meandering down the centre of the image.

Section 2, a shadow area on the rock face, shows a failing in Nvidia's rendering, with a much more saturated image; in direct comparison to the ATI image there is 15% higher black content and moderately higher yellow content. Adjusting the black curve in Photoshop allows some of the detail to return; however, ATI win this test on almost every level.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=10
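
If you're curious how they get those channel percentages, here's a rough sketch of the same kind of analysis in Python using the Pillow imaging library (the screenshot file names and sample rectangle are made up, and Pillow's naive RGB-to-CMYK conversion won't exactly match Photoshop's numbers):

# Convert a screenshot region to CMYK and report the average content of
# each channel as a percentage, roughly as the reviewers describe doing
# in Photoshop. "ati_sky.png" / "nvidia_sky.png" are hypothetical files.
from PIL import Image

def channel_percentages(path, box):
    region = Image.open(path).convert("CMYK").crop(box)
    # Transpose the pixel data into four per-channel sequences and average.
    c, m, y, k = (sum(ch) / len(ch) for ch in zip(*region.getdata()))
    # Pillow stores CMYK channels as 0-255; scale to 0-100%.
    return {"cyan": c / 2.55, "magenta": m / 2.55,
            "yellow": y / 2.55, "black": k / 2.55}

sky_area = (0, 0, 400, 120)  # sample rectangle from the sky region
for name in ("ati_sky.png", "nvidia_sky.png"):
    pct = channel_percentages(name, sky_area)
    print(name, {ch: round(v, 1) for ch, v in pct.items()})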

Underwater scenes and fish:

Media Enthusiast Observations:
Our final image looks at an underwater scene, and there are two areas we feel are worth singling out. The first is the darker fish towards the front left of the image: on ATI its scales and markings are much clearer, whereas on Nvidia we again see detail suffer. Looking at the bright yellow fish near the centre of the image, we find that the Nvidia image actually shows it to be richer, with the stripes better defined; the ATI image of the fish is a little too washed out for our liking.

Looking across the rest of the image, we can see that the overall cast of the Nvidia solution is magenta-biased, apart from the blue ocean background. The two manufacturers are displaying the image slightly differently, and in the end it will be up to the viewer to decide which is the 'better' image. Technically they are both very good, just slightly different.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=11

Nvidia's PureVideo adds new features to match ATI, but they aren't enabled by default:

With the 9600 and 9800 series of graphics cards, Nvidia have added two new video features to PureVideo HD. Dynamic Contrast Enhancement takes each frame and calculates enhancement values for that frame, a change from older methods of contrast enhancement which apply the same changes to every scene regardless of whether enhancement is required. Colour Enhancement analyses the tone of an image and dynamically adjusts it for each frame; the example Nvidia use to describe this is "skin tones changing from flat to vibrant".

We decided to test these options, and the results follow. It should be noted that we also enabled the noise reduction feature in the Nvidia drivers and set it to a level similar to ATI's (ATI enable noise reduction by default). In theory, this should represent the best image quality Nvidia has to offer, without changing edge enhancement values. When we asked ATI what settings we should use for testing, 'default' was the response, so we can assume that the screenshots produced so far represent their best image quality...

...Unfortunately, with this enhanced algorithm there are also some rendering issues, as you can see in the zoomed analysis above. We are seeing stepped bands of pixel discoloration within certain boundaries. This was noticed by Stuart above, and I feel it is caused by the changing of the black content in the attempt to brighten the overall impression of the image.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=12
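
To make the "per-frame enhancement values" idea concrete, here's a rough sketch of generic per-frame contrast stretching in Python; this is a textbook histogram stretch under my own assumptions, not Nvidia's actual algorithm:

# Each frame gets its own black and white points from its own histogram,
# unlike a fixed contrast curve applied identically to every scene.
# Generic illustration only -- not Nvidia's Dynamic Contrast Enhancement.
def stretch_frame(pixels, clip=0.01):
    # pixels: flat list of 0-255 luma values for one frame
    ordered = sorted(pixels)
    lo = ordered[int(clip * (len(ordered) - 1))]        # per-frame black point
    hi = ordered[int((1 - clip) * (len(ordered) - 1))]  # per-frame white point
    span = max(hi - lo, 1)
    return [min(255, max(0, (p - lo) * 255 // span)) for p in pixels]

dark_frame = [30, 40, 50, 60, 70]    # murky, low-contrast scene
bright_frame = [180, 200, 220, 240]  # already-bright scene
print(stretch_frame(dark_frame))     # stretched hard: [0, 85, 170, 255, 255]
print(stretch_frame(bright_frame))   # gets its own, different mapping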

ATI renders more images slightly better than Nvidia:

Media Enthusiast Observations:
In this scene the changes to the Nvidia image are beneficial in terms of brightness, but ATI still wins on hair detail. We were also slightly bothered by background colours again in the Nvidia frame. The brightness has a real impact, and specific sections (for example, the blue/green hills across the water on the left) end up very unnaturally coloured in Nvidia's enhanced image...

...Pictured above is the selected area from the red-circled analysis in the prior image (bottom left of screen). This time, however, the image is converted to CMYK and the yellow channel is isolated from the other colours. The noise within this yellow channel is quite easy to see, and it is a clear indication that the Nvidia algorithm is somewhat struggling to deal with this image.

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=13

They conclude:

We were also quite surprised by the results of Nvidia's colour/tone enhancements; these features have the ability to greatly increase detail levels, but most of the time this comes at a cost, with other areas of the screen becoming overpowering and unnatural. We firmly believe that features like these can only be considered a success if they can be left enabled at all times; after all, the last thing you want to think about when sitting down to watch a movie is "Do I want to chance the image quality being impaired?". Therefore we cannot recommend any end user enable these options until they have evolved a little more.

So this leaves us with Nvidia default quality against ATI default quality, and the performance and features that both manufacturers offer. In terms of performance and features, we have to say that Nvidia is the clear winner, as they currently allow playback with Aero enabled and good CPU usage figures regardless of whether the image includes one stream or two.

IMHO, ATI wins their test, even though they prefer Nvidia. CPU usage isn't all that great with ATI when Aero is enabled (who watches picture-in-picture anyway?), and they are comparing the latest Nvidia card with a prior-generation 3870. We will see if this holds up when the 4850 and 4870 arrive.

ATI's image quality, except in one face scene in their review, is better than Nvidia's at default, and Nvidia's fix for image quality actually causes more problems than it solves. Both cards' settings probably need tweaking, but I've used PureVideo with a prior-generation 7600GS and AVIVO from the old All-in-Wonder cards through the 3870 generation, and found AVIVO much better.

Plus, last time I checked, PureVideo costs an additional $50 or so as a direct download from Nvidia, whereas AVIVO is free and included in Catalyst releases. ATI began as the second-best gamer GPU company, but was always oriented towards video, with gaming quality not far behind.

Considering the "tweaks" Nvidia has made to their drivers just prior to a game's release (the Crysis demo's water fudging), I'm not sure Nvidia wins hands down nowadays as the best gamer card. They are ahead in some titles but behind in others, ahead with some cards but behind with others. Overall it's neck and neck, and I don't think Nvidia deserves its rep, as they've played fast and loose more often than ATI (e.g. the TWIMTBP program, plus behind-the-scenes "tweaks" like boosting the PCIe x16 bus on certain Nvidia boards for 9600 benchies, which actually led to instability overall).

Buy the GTX280 and enjoy; it won't be a bad card. But ATI's drivers get things balanced once games are released, and though PureVideo has improved over the past few years, IMHO AVIVO still beats it hands down where image quality is concerned.

Anyway, I hope this information helps you compare AVIVO and PureVideo. Hopefully, more sites like Tom's and Anandtech will update their comparisons once the new cards arrive.

 
^ Looks like about a 50/50 deal, and the picture quality thing can be pure preference. My newest Nvidia card has a picture far superior to anything I have ever owned, including ATI cards. There are differences between ATI's picture and Nvidia's, but I don't think anyone can really say that ATI has better quality. They both have their stronger points and weaker points; it just depends on which points matter more to you.

 
