AMD vs Nvidia image differences

Shadow777

Distinguished
Jan 13, 2014
Hi everyone, I'm currently in a nightmare of a struggle choosing between a GTX 750 Ti and an R7 250. Obviously the GTX 750 Ti is much better, and I found a very good offer on it: in my country's prices it's $2400, while the R7 250 is $2100. The thing is, I'd have no problem putting the $10 more into the 750 Ti, but I've heard that Nvidia video cards produce washed-out colors and that AMD graphics cards make better, more vivid colors? Is this true to some extent, and if it is, is there any way to correct the colors on the Nvidia cards? Thanks everyone.
 

rush21hit

Honorable
Mar 5, 2012
Trust me, there is absolutely no problem whatsoever with the GTX 750 Ti, or Nvidia's cards in general.

I suspect the hearsay you're repeating is some scheme by your local shop to inflate the price of the AMD parts, because that's not the price range an R7 250 should be at.
 

Shadow777

Distinguished
Jan 13, 2014


[video="https://www.youtube.com/watch?v=fKy5ctjixaY"][/video]

You can see the difference in this video regardless of the monitor you have.
 

rush21hit

Honorable
Mar 5, 2012
Though I see your point, I must remind you that the two have entirely different architectures. Depending on the engine used and its optimizations, the results will vary from game to game. Let's get back to basics: what matters most for a GPU in PC gaming are these questions. Does the game run smoothly? Can it sustain that smoothness most of the time? Can it max settings out while maintaining little to no performance impact? The power bill would be the least of your concerns.
Regardless, the 750 Ti still looks more promising.

Also, your question was about the R7 250, which is a lower tier than the 260X. Obviously the 750 Ti will net better results. Heck, even the non-Ti will too.
 


I question the validity of that video. Just think: if Nvidia had washed-out colors, why would they sell so well? Who would buy their $5000 Quadros if they were no good for editing?
As has been said, the more I think about it the more I know it is blatantly false. I have built plenty of systems, and I cannot ever remember a color difference. The last one, in fact, had a 980 Ti Hybrid. We put it up against my friend's R9 280 just to see how much better it was, on the same monitor and all, and saw nothing other than greater performance. We were not testing for a color shift, but there was not one, or I would have noticed.
Have you considered that the color difference in that video could just be one system running marginally behind the other?
Also, that is an FPS test video; it was never made to test color, nor does it claim to.
 
I watched the video and did not notice any color difference between the two.

Anyway, this stuff has been debated a lot, and there have even been scientific tests to validate the claim. In the end, the conclusion is that both actually have the same image quality (color). So why do people claim that ATI (now AMD) has more vivid color than Nvidia? The most likely explanation is that AMD cards used a more saturated color setting by default. So what about Nvidia cards? The Nvidia Control Panel also has settings for adjusting your color. If you want more vivid color like the AMD cards, all you need to do is make changes in the Nvidia Control Panel.
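If the "vivid AMD" look really is just a default saturation setting, the effect is easy to reproduce on any frame. Here's a minimal Python sketch (assuming Pillow is installed; the file names are made up for illustration) of what a control-panel saturation or "digital vibrance" style tweak does to an image:

[code]
# A minimal sketch, not any vendor's actual driver code: a uniform
# saturation boost, similar in spirit to a control-panel vibrance slider.
# File names are hypothetical.
from PIL import Image, ImageEnhance

def boost_saturation(path_in, path_out, factor=1.25):
    # factor 1.0 leaves the image unchanged; values above 1.0 make
    # colors more vivid, values below 1.0 wash them out.
    img = Image.open(path_in).convert("RGB")
    ImageEnhance.Color(img).enhance(factor).save(path_out)

boost_saturation("frame_default.png", "frame_vivid.png", 1.25)
[/code]

A frame run through this with a factor above 1.0 gets the same kind of "more vivid" look those comparison videos show, which is the point: it's a setting, not the silicon.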
 

rush21hit

Honorable
Mar 5, 2012
I'll say it again: fundamentally, the two sides apply their rendering methods differently. So much so that you can't even compare them apples to apples anymore.

Use different samples from the myriad of engines out there and you'd get different results. It has been that way for a long time.

But to answer the question in your OP again: the R7 250 is way behind the GTX 750 Ti in everything, regardless of what the higher tiers from both sides are capable of.
 

Shadow777

Distinguished
Jan 13, 2014


Is there any way of configuring the colors in Nvidia's control panel to make them look like the 960 in this video? Because now I'm even more confused, haha. The 960 gave better colors than all the other cards in the vid, including the R9 290.

[video="https://www.youtube.com/watch?v=WrSkRC7QvIM"][/video]

 
Go to the Nvidia Control Panel, then go to the desktop color settings. From there you can directly control how the colors look on your monitor. And in the conclusion of the Titan vs. 290X comparison, they mention this:

Do you see any differences at all in the compared screenshots? We do not at all; not sure if you guys do, please leave a comment if so. We conclude that this experiment is busted: both cards render a good, nice image, and we did not see any difference between the samples from the ASUS GTX Titan and ASUS R9 290X.
 
I think this has been debated since the early 2000s. In the past there might have been a difference, but as time went on the difference disappeared once everything went digital. Testing it wouldn't be hard: just run AMD and Nvidia cards side by side on the same monitors and see whether Nvidia has more washed-out colors than AMD, as the claim goes.
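If anyone actually wants to run that side-by-side test, a rough way to compare captures of the same frame is to check their channel statistics instead of eyeballing them. A minimal Python sketch, assuming Pillow and two hypothetical screenshot files:

[code]
# A minimal sketch: compare mean saturation and brightness between two
# captures of the same frame. File names are hypothetical.
from PIL import Image, ImageStat

def color_stats(path):
    hsv = Image.open(path).convert("RGB").convert("HSV")
    mean = ImageStat.Stat(hsv).mean  # channel order: hue, saturation, value
    return {"saturation": round(mean[1], 1), "brightness": round(mean[2], 1)}

print("AMD:   ", color_stats("amd_frame.png"))
print("Nvidia:", color_stats("nvidia_frame.png"))
[/code]

If the washed-out claim were true, you'd expect a consistently lower saturation number on the Nvidia capture; the tests mentioned above found no such gap.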
 


I just thought this up: it could be the screen capture. In fact, I bet that is what it is. It makes sense; there is no way to know that the same capture software was used for each video, and I've noticed that my R7 250X certainly appears dark and over-contrasted in screencasts. MSI has their own capture software, Asus does as well I believe, as do a few others.

Edit: There is also the fact that the colors might have shifted when the video was rendered, transcoded, and stitched together. Sometimes you get color fade.
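To illustrate how little it takes: a small gamma shift introduced by a capture tool or encoder is enough to make two identical source frames look different. A toy Python sketch, with illustrative values only:

[code]
# A minimal sketch of how a small gamma shift, of the kind capture or
# transcoding software can introduce, darkens midtones. Illustrative only.

def apply_gamma(value, gamma):
    # Map an 8-bit channel value through a gamma curve.
    return round(255 * (value / 255) ** gamma)

for v in (64, 128, 192):
    print(v, "->", apply_gamma(v, 1.1))  # e.g. 128 -> 119: visibly darker
[/code]

A shift that small is hard to spot in isolation but obvious in a side-by-side video, which would neatly explain the "washed-out" impression without either GPU rendering anything differently.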
 

Yes, the fact that they never called it out implies it could be a post-processing thing. Or they never saw it, and therefore did not test it and have no evidence on the matter. I'll reiterate my lines above: if Nvidia were known to produce inferior colors, they would be long gone.
 
By the way, you can't compare colors/contrast/saturation and so on on the same display even if you swap graphics cards. You have to record both and then look at the recordings side by side; your brain will compensate for what you previously saw and pretty much blend it together. Not to mention that if the room temperature changes, it won't look the same either. So can we all agree that this video is inaccurate in terms of picture quality? I still find it odd to come to that conclusion, because this was mentioned earlier. I'm by no means an expert on a graphics card's ability to render pixels, or on how much that changes over time.



 

I don't know about you, but I certainly notice when a monitor is off-hue (I get really touchy about that sort of stuff), and if this were as drastic as these misleading video-evidence-based claims suggest, it would be very noticeable. For instance, that first still frame implies the contrast was off by at least 7-10%. Also, if there were a color shift due to temperature changes, the video editing crowd would have thrown in the towel ages ago. When a video card overheats the colors might go haywire, but that happens at around 100C.
I feel monitor and cable variances alone would override any color reproduction differences.

Where are all those spec-touting video card hype-maker engineers when you need them? They could settle this, I hope, or just blab about performance per watt...
 
But Robert, NVIDIA is superior across the board; AMD makes bad graphics cards. :p

Nope, I probably wouldn't notice it. I definitely notice skin tones, though, but things like grass and buildings? No.

After getting used to a DLP projector I can't even look at my IPS monitor, so yeah, I'd say I notice the contrast difference too. Nonetheless, where's the NVIDIA staff at...
 

jeffredo

Distinguished


I'm sorry, I'm just not that pedantic/picky/anal/whatever. As I said, I'm very pleased with how both of my cards look. The only time I ever noticed one display adapter looking better than another was when I went from my old Voodoo 5 to a Radeon 9600 Pro back in the day; the Voodoo had noticeably better image quality. Other than that - nope, no discernible difference going forward.
 

rush21hit

Honorable
Mar 5, 2012
I was trying to say that something at some point on screen will look different. Be it a particle rendered differently, or fog in the line of sight appearing slightly different, or textures that seem crisper on one side, or foliage! Look at that foliage! The right screen seems better, I swear! Things like that. It has always been so, since the two have used entirely different architectures over the years. Not to mention the fundamental differences between their drivers: CUDA, PhysX, stream processors... I barely got the hang of it all, and then I stopped caring. I only care about results now.
People stopped debating stuff like this some time ago, since everyone seems to agree to disagree.
Now we only care about FPS and how well the GPU is able to sustain it, aside from its pricing.

Robert Cook does make a valid point about the screen capture software and the inferior-color argument.