HD 6950 to HD 6970

exhail

Distinguished
Oct 17, 2010
So I've just read that Tom's article, and it turns out you can basically change your HD 6950 into an HD 6970 with a BIOS flash. This is really death to Nvidia. The whole point of the GTX 570 was to be a good single card (since CrossFire is better than SLI now): it was for someone who wants HD 6970 single-card performance for a bit less money, plus all the benefits of an Nvidia card, namely PhysX, CUDA and so on. But now that you can turn an HD 6950 into an HD 6970 with a simple BIOS flash, and the 6950 is cheaper than the GTX 570 and scales better in CrossFire than the GTX 570 does in SLI, remind me why anyone would buy a GTX 570?
You would only do it if you were some stupid fanboy. Hopefully Nvidia will step up with the GTX 560, as competition is always a good thing.

 

Digital Dissent

Distinguished
Feb 24, 2010

Well, it's death to the 570 at the enthusiast level, for those of us who know about this and are willing to do it (discounting fantards). Realistically, though, the vast majority of consumers will never find out about this and/or won't be willing to try it, so Nvidia should be safe for now... still, the competition is great. I almost bought a GTX 570 and didn't in the end, because it ate too much power for me to SLI it in the future, and because it lacks the advanced anti-aliasing ATI has brought to the table... oh, and the 6950 can run three monitors off one card. CUDA is of no use to me, and PhysX just isn't important enough. ATI is bringing some very serious competition here; Nvidia's response will either be very interesting or a complete failure.
 
Most people are fanboys to some extent. People are pretty loyal to what works for them, and the difference, even after the flash, isn't huge.

I wouldn't worry about Nvidia. They still have a lot of people convinced that their PhysX-enabled cards help them in all their games.

What disappoints me is that no one is shining a light on ATI's superior anti-aliasing. When I was using 470s, I was jonesing badly for supersampling AA on a few of my games that looked horrible without it. Or maybe it just became a big deal to me because I had played them with ATI's supersampling AA before.
 

exhail

Distinguished
Oct 17, 2010



facepalm

OK, you could buy two HD 6950s and BIOS-flash them; they would perform on par with two GTX 580s, since CrossFire scales better than SLI.
 

adreen

Distinguished
Jan 25, 2006
The 6950/6970 are pretty much enthusiast cards, so most people buying them "should" already know about this BIOS flash.
The only thing is we don't know whether doing this causes any long-term problems.
Although it appears the risks are pretty low, considering people are already pushing the clock speeds quite a bit higher than stock 6970 speeds and experiencing no immediate ill effects.
And the flashing procedure is incredibly simple, with those very well written guides already out there (the sketch just below shows roughly what the steps boil down to). From what I've been reading so far, every person attempting this has succeeded to some extent; the only difference is how much they were able to overclock afterwards. The worst case so far has been only unlocking the extra shaders, which is already an upgrade in its own right.
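Roughly, those guides boil down to three atiflash commands. Here is a quick sketch wrapped in Python purely for illustration: it assumes the standard atiflash command-line tool is on your PATH, that the card's dual-BIOS switch is set the way your guide says (so the stock BIOS stays recoverable), and that the .rom filenames are just placeholders for your own backup and whatever unlocked/6970 image the guide points you to. Follow an actual guide before flashing anything.

import subprocess

ADAPTER = "0"  # adapter index as reported by "atiflash -i"

def atiflash(*args):
    # Run atiflash with the given arguments and echo whatever it prints.
    result = subprocess.run(["atiflash", *args], capture_output=True, text=True)
    print(result.stdout or result.stderr)

atiflash("-i")                                    # 1. list adapters, confirm which index is the 6950
atiflash("-s", ADAPTER, "6950_backup.rom")        # 2. back up the stock BIOS first (placeholder filename)
atiflash("-p", ADAPTER, "6970_unlock.rom", "-f")  # 3. flash the unlocked image; "-f" forces it past the device-ID check

The dual-BIOS switch is the real safety net here: if the flash goes wrong, flip it back and the card boots off the untouched BIOS.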
I would definitely not say this is the death of Nvidia, seeing how big a powerhouse they are in the overall market. But this absolutely was a win that AMD needed, especially since the GTX 570/580 launched much earlier and the 570 was cheaper than the 6970 while performing on par with it.
I feel as if this was intentional on AMD's part, but either way, it's great for the consumer for once!
I definitely don't consider myself a fanboy, as I've been switching between ATI and Nvidia for years now. You really can't go wrong with either company, but right now the 6950 is top dog in price/performance by a good margin, and it's going to be in short supply leading up to the inevitable PCB change.
I'm already contemplating a second unit, even though there's no justifiable reason to do so right now other than the risk of not being able to pick up a second reference card later. This card already rips through anything I play.
 

Nvidia has 8x Supersampling as an option in the Control Panel, and it makes Dragon Age look great!

AMD does not have better dual card scaling:

http://www.guru3d.com/article/radeon-hd-6950-crossfirex-review/
http://www.guru3d.com/article/geforce-gtx-570-sli-review/
http://www.guru3d.com/article/geforce-gtx-580-sli-review/

Average dual card scaling across all benchmarks at 1920x1200:
6950: 1.67x
570: 1.7x
580: 1.58x

(Note that the GTX580 is much more affected by CPU bottlenecking. The 6970 would logically also be bound by CPU bottlenecking and would not scale as well as the 6950 and 570. Unfortunately, the 6970 in Crossfire was not tested.)
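In case anyone wonders how those averages are calculated: for each game, divide the dual-card frame rate by the single-card frame rate, then average the ratios across the benchmark suite. A quick sketch with made-up numbers (the real per-game figures are in the Guru3D links above):

# Toy numbers only -- they just show how the per-game ratios get averaged.
single_fps = {"Game A": 60.0, "Game B": 50.0, "Game C": 80.0}    # one card
dual_fps   = {"Game A": 105.0, "Game B": 83.0, "Game C": 128.0}  # two cards in CrossFire/SLI

ratios = [dual_fps[game] / single_fps[game] for game in single_fps]
avg_scaling = sum(ratios) / len(ratios)
print(f"average scaling: {avg_scaling:.2f}x")  # 1.67x with these toy numbers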

Noise levels vs. single card
6950: 49 dBA (+8 dBA)
570: 42 dBA (+2 dBA)
580: 46 dBA (+5 dBA)

GPU Temps vs. single card
6950: 89°C (+12°C)
570: 81°C (+4°C)
580: 90°C (+3°C)
 


Do you mind showing me a screenshot of this? There's no info about it being added, and it wasn't available with the 470s. Maybe you could post a review that lists it.

The closest thing they have is a third-party program called Nvidia Inspector, which lets you use it. However, speaking from first-hand experience with that program, it's inferior in quality and performance.

If you're looking at transparency supersampling, that is not the same thing, and it's also inferior.
 

Digital Dissent

Distinguished
Feb 24, 2010

This depends heavily on the games tested and the reviews (i.e., benchmarks) you look at. At this point ATI does seem to have better scaling, by a small margin, but again it depends on which reviews you look at.
 

Digital Dissent

Distinguished
Feb 24, 2010
+1, tested successfully here. Fully unlocked and working great: +2% TDP, 900 MHz on the core clock, and 1390 MHz on the memory. Pretty incredible if you ask me, since that's actually beyond stock 6970 clocks.
 

Tamz_msc

Distinguished
I admit that it's difficult to spot the difference. But the German websites that first brought up this issue stated that the effect is very noticeable in some games, like Crysis.

However, even if the optimizations are not visible, they do affect performance, at least in the games tested by Guru3D.

Here's your proof:
http://benchmarkreviews.com/index.php?option=com_content&task=view&id=12845&Itemid=47

Just follow the links for the videos
 
It sounds to me like AMD is doing us a favor. If it's so difficult to notice the vast majority of the time, and when you can see a difference it's barely noticeable but gives you a good return in performance, then that sounds like a good thing to me.

And you do have the ability to turn Catalyst AI up to the highest quality setting; it's just not set that way by default.
 
Stop with your Nvidia bias... it's not like Nvidia doesn't do the same thing.
Find me a game where these kinds of "cheats" are noticeable in a video or screenshot without zooming in 1000%, and make it a game that will still be considered modern in the next two years.
The AMD driver optimizations are accepted as fact throughout the tech community. AMD fans may not consider them significant, but if you understand the issue, it's very troublesome. If Nvidia feels the need to do the same thing to keep pace, we all lose through lower-quality graphics.

" For the time being, we're going to have to leave everything at default and point out the observed and confirmed image quality issue currently affecting Radeon HD 6800-series cards. This may or may not become a factor in your buying decision, but right now, the bottom line is that Nvidia offers better texture filtering, regardless of whether you’re one of the folks who can appreciate it."
http://www.tomshardware.com/reviews/geforce-gtx-570-gf110-performance,2806-5.html

"We urge and recommend AMD/ATI to disable the optimization at default in future driver releases and deal with the performance loss, as in the end everything is about objectivity and when you lose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm then good. The drop in 3-4 FPS on average is much more acceptable then getting a reputation of being a company that compromises on image quality. And sure it raises other questions, does ATI compromise on other things as well ? See, the cost already outweigh the benefits.

So the moral right thing to do for AMD/ATI is to make the High Quality setting the standard default. But again here we have to acknowledge that it remains a hard to recognize and detect series of optimizations, but it is there and it can be detected."
http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/
 
So with default settings Nvidia has slightly better texture filtering, but with a single slider they are the same, and ATI has better AA. OK, now we pick a card based on what's important to us.

I played a few months with Nvidia recently, then a bit over a month with ATI. I find the image quality (factoring in AA) much higher on ATI, because aliasing is very distracting to me. The difference in texture filtering is so small it's hard to tell side by side, much less on its own.

It's hard to call me a fanboy given that I recommend both brands depending on a person's needs and have been using both. So please don't bother going there.

EDIT: It seems you're misleading us here. They have the exact same image filtering and they look identical, but by default ATI chooses to use a slightly lower quality setting that's right there in the CCC. I always go straight to those settings on either card, so the reason I can't tell a difference is that there wasn't one.