PowerTune
Judging by the amount of time we and fellow journalists spent asking questions about this new feature, we’ll take our time with this one and aim for clarity. PowerTune is pitched by ATI as a way to deliver more performance in typical applications by managing the maximum power draw of the GPU, and therefore the card. That’s true… from a certain point of view.
Just as Nvidia did with the GeForce GTX 570 1.3GB and the GeForce GTX 580 1.5GB, ATI has set an upper power draw limit for the two cards of the HD 6900-series. If at any time the GPU starts to consume more than this, it automatically clocks itself down to reduce the power draw. ATI says this allows it to set higher frequencies for its GPUs, as it no longer has to design for worst-case, thermal-virus-like applications.
For example, the HD 6970 2GB has a default frequency of 880MHz, and it should run at this frequency whether it’s churning through World of Warcraft, Arma II or Bad Company 2. However, should the GPU encounter something a little more taxing (and no, Crysis doesn’t count), it might drop to a frequency of 800MHz. Previously, ATI would have had to set the stock frequency at 790MHz to cover that worst case; now it can ship the card 90MHz faster, hence the claim that PowerTune helps to deliver more performance.
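The behaviour described above can be sketched as a simple feedback loop: each monitoring interval, the GPU compares its estimated power draw against a fixed cap and steps the clock down when it's over, or back up toward the default when it's under. Everything below is illustrative only (the cap, step size, floor and function names are our assumptions, not AMD's actual firmware logic):

```python
# Hypothetical sketch of PowerTune-style clock management.
# All constants below are illustrative, not AMD's real values
# (apart from the HD 6970's 880MHz stock frequency).

DEFAULT_CLOCK_MHZ = 880   # HD 6970 stock frequency
MIN_CLOCK_MHZ = 500       # arbitrary floor for this sketch
POWER_CAP_W = 250         # assumed board power limit
STEP_MHZ = 10             # assumed clock adjustment granularity

def next_clock(current_mhz: int, estimated_power_w: float) -> int:
    """Return the clock for the next interval given estimated power draw."""
    if estimated_power_w > POWER_CAP_W:
        # Over the cap: step the clock down to cut power.
        return max(current_mhz - STEP_MHZ, MIN_CLOCK_MHZ)
    # Under the cap: recover toward the default frequency.
    return min(current_mhz + STEP_MHZ, DEFAULT_CLOCK_MHZ)
```

The point of the design is visible in the two branches: a typical game never trips the cap, so the card sits at 880MHz indefinitely, while a thermal-virus workload is walked down step by step until power draw falls back under the limit.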