Multi GPU Tuning (quick guide)

The_Tester

Nov 22, 2014
This information is intended for those new to multi GPU setups but already somewhat familiar with GPU overclocking and/or overclocking in general. It also assumes a system that has no underlying problems and is generally working the way it should. For reference, the guide flows in an “information then picture” format for better readability.

Pertinent specs

OS: WIN10 Pro
Motherboard: ASRock Z97 Extreme9
Graphics: Dual Gigabyte GTX970 G1 Gaming (GPU1 Top card/GPU2 Bottom card)
CPU: Intel i7 4770k (@4.2 OC)
RAM: GSkill Sniper 4x4GB (1866)

Programs of choice

Assassin's Creed Unity by Ubisoft (1080p windowed mode)
Why? It’s a relatively hardware-heavy game that will test changes realistically. Also, windowed mode allows me to view the results of on-the-fly tweaking (Multi GPU)

NVidia Control Panel
Why? It’s a necessary control software/tool that is needed to properly configure NVidia SLI for the given hardware used

MSI Afterburner
Why? It’s a popular and useful tool for both tweaking and monitoring statistics

Performance Test by PassMark (free/trial version)
Why? It’s a popular, useful and consistent tool for benchmarking with a variety of tests to run (Single GPU)

3DMark Advanced by Futuremark (Purchased)
Why? It’s a popular graphics benchmarking program which provides impressive stress tests that can thoroughly tax all but the most serious gaming rigs (Single or Multi GPU)

***********************************************************************************************************************
So, have you been monitoring your games while in SLI/Crossfire and seen values that don’t make sense or don’t match up? Most setups will show this unless you do a bit of tweaking. The purpose of this tutorial is to get your graphics cards working in a harmonious way for the best (multi GPU) benchmarks, fps and stability: specifically, matching clock speed, GPU Vcore and, to a lesser extent, temperature.

Use a single monitor so you can test the individual cards by simply plugging the monitor into the card you want to test. Assuming that both/all of your graphics cards are installed and working correctly, you will need to disable SLI/Crossfire. Please refer to your core tech's process (Nvidia or AMD) for accomplishing this.

AMD Crossfire


NVidia SLI

Base operation numbers for the individual cards.

To start, disable GPU sync in your tweaking program of choice (the actual name may vary by program). This will allow all the GPUs to operate independently.
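If you would rather log your baseline numbers to the console than read them off screenshots, Nvidia's `nvidia-smi` command-line tool can report per-GPU stats as CSV (AMD users would substitute their own tooling). A minimal parsing sketch follows; the sample string stands in for real output from `nvidia-smi --query-gpu=index,clocks.gr,temperature.gpu,power.draw --format=csv,noheader,nounits`, and the numbers in it are illustrative, not measurements from this setup:

```python
def parse_gpu_baseline(csv_text):
    """Turn nvidia-smi CSV output into a list of per-GPU dicts."""
    gpus = []
    for line in csv_text.strip().splitlines():
        index, clock, temp, power = [field.strip() for field in line.split(",")]
        gpus.append({
            "index": int(index),              # GPU number (0 = top card here)
            "core_clock_mhz": int(clock),     # current graphics clock
            "temp_c": int(temp),              # core temperature
            "power_w": float(power),          # board power draw
        })
    return gpus

# Illustrative sample output; run the nvidia-smi command above for live numbers.
sample = """0, 1329, 64, 180.5
1, 1342, 58, 175.2"""

for gpu in parse_gpu_baseline(sample):
    print(gpu)
```

Logging a snapshot like this per card, before touching anything, gives you the "base operation numbers" to compare against after tuning.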


Start tuning the cards.

Passmark DirectX 9 test shown to left

Do this for all cards until you have successfully tuned each card up to its top stable performance.

Passmark DirectX 9 test shown to left

Once all your cards have been adjusted, take note of the settings or save profiles for them. Quit the programs and turn SLI/Crossfire back on. Please refer to your core tech's process for accomplishing this.

AMD Crossfire


Nvidia SLI

GPU Sync on...
It is now important to state that you need to start tweaking based on the lowest performing card (I will use GPU1 as the example “weak” card). This is essential when creating a robust SLI/Crossfire setup: attempting to force a lower performing card to keep up with a higher performing card can result in instability and driver crashes. Initially you will want to adjust the cards within GPU1’s performance specs. You can later start easing all the cards up to see whether the multi GPU configuration helps or hinders individual card performance.

Have a look at the core clock, GPU Vcore, power and temperature. With the cards in a “stock” state, run a program that will utilize all cards and allow you to view settings on the fly (such as a game in windowed mode). We can see from this example that everything seems to be OK with the exception of the #2 GPU Vcore (shown as GPU2 Voltage): GPU1 reads 1.143V while GPU2 reads 1.206V. This may be fine, but it can cause stability issues if left alone.

Assassin's Creed Unity (windowed mode) shown to left
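The "tune to the lowest performer" rule above can be captured in a couple of lines. The per-card benchmark scores here are hypothetical placeholders (run each card alone in PassMark or 3DMark to get your own):

```python
def weakest_card(benchmark_scores):
    """Given {gpu_index: single-card benchmark score}, return the index of the
    lowest performer -- the card whose stable limits should bound the whole
    SLI/Crossfire profile."""
    return min(benchmark_scores, key=benchmark_scores.get)

# Hypothetical scores from benchmarking each card individually:
scores = {0: 11250, 1: 10980}
print(weakest_card(scores))  # prints 1: use that card as the tuning baseline
```

Whichever card this identifies plays the role GPU1 plays in this tutorial.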

Normalize the cards

GPU Sync off...
With sync on, the cards are not quite operating the same (as shown above). Now it’s time to turn sync off and fine tune things. With GPU sync off, adjust the clock speed of GPU1 up/down until the Vcores begin to match each other. Repeat this for all other GPUs until all the cards show approximately the same readings. In this example, GPU1 has been adjusted to a +15 core clock offset, which better equalizes the GPU Vcore voltages: the GPU Vcore of GPU1 increases to 1.212V (with no adjustment to GPU2). Also notice that all GPUs are now working at a higher frequency as well.

Assassin's Creed Unity (windowed mode) shown to left
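The nudge-and-recheck loop above can be sketched as pure logic. This is only an estimate of how the search behaves, not a model of any real boost table: the step size, volts-per-step and tolerance below are assumptions, and actual cards move their voltage in discrete bins that vary by VBIOS:

```python
def clock_offset_to_match(vcore_weak, vcore_target, step_mhz=5,
                          volts_per_step=0.004, tolerance=0.01, max_offset=100):
    """Estimate the core clock offset (MHz) needed on the weaker card so its
    reported Vcore rises to roughly match the other card's.  volts_per_step is
    an assumed stand-in for one voltage bin per clock step."""
    offset = 0
    vcore = vcore_weak
    # Keep stepping the clock up until the voltages agree within tolerance,
    # or we hit a sanity cap on the offset.
    while vcore_target - vcore > tolerance and offset < max_offset:
        offset += step_mhz
        vcore += volts_per_step
    return offset

# Using the readings from the example above: GPU1 at 1.143V, GPU2 at 1.206V.
print(clock_offset_to_match(1.143, 1.206))
```

In practice you do exactly this by hand in Afterburner: bump the offset, watch the voltage readout in-game, and stop when the cards agree.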

Continue adjusting until all of the GPUs are brought up to the highest performance settings that were obtained with GPU1 (GPU Vcore, RAM, core clock, etc.). Be sure to check that clock speed, GPU Vcore, memory and, to a lesser extent, GPU temperature closely match each other. Once you have all the cards in a harmonious state, run benchmark software such as 3DMark to verify performance and stability. This lets you see what you have accomplished and confirms that your setup will be rock solid. You can always try increasing the GPU Vcore voltages to better match each other, if the VBIOS allows it. In the example below you can see that the two cards are nearly identical, which is exactly what you want.

GPU1 left | GPU2 right
(Sample of 3DMark Advanced "FireStrike" recorded via MSI Afterburner for the above setup)
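The "closely matching" check described above can also be made explicit. The tolerances below are assumptions (roughly one clock bin, one voltage bin and the top-card/bottom-card temperature gap), and the sample readings are illustrative values loosely based on this tutorial's end state:

```python
def cards_in_sync(readings, clock_tol_mhz=15, vcore_tol=0.015, temp_tol_c=10):
    """Check that every card's readings sit within tolerance of the first
    card's.  readings: one dict per GPU with core_clock_mhz, vcore_v, temp_c."""
    ref = readings[0]
    for card in readings[1:]:
        if abs(card["core_clock_mhz"] - ref["core_clock_mhz"]) > clock_tol_mhz:
            return False
        if abs(card["vcore_v"] - ref["vcore_v"]) > vcore_tol:
            return False
        if abs(card["temp_c"] - ref["temp_c"]) > temp_tol_c:
            return False
    return True

# Illustrative end-state readings for the two cards:
readings = [
    {"core_clock_mhz": 1392, "vcore_v": 1.212, "temp_c": 71},  # GPU1 (top)
    {"core_clock_mhz": 1392, "vcore_v": 1.206, "temp_c": 64},  # GPU2 (bottom)
]
print(cards_in_sync(readings))  # prints True
```

If this check fails after a benchmark run, go back to the sync-off step and keep equalizing before trusting the setup for long sessions.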


This concludes a brief look at the surface basics of multi GPU tuning. The main reason that two cards of the exact same make will not sync automatically is small differences in the manufacturing process of each GPU. VBIOS versions and card revisions can also greatly affect a card's overall reported statistics and operation, for example if you bought the cards with a large time gap between them. Naturally, the card with the better cooling will run cooler. In this tutorial you may have noticed that the top card (GPU1) generally runs hotter than the bottom card (GPU2). This is expected, as GPU2 essentially acts like a heater for GPU1: the hotter local environment raises GPU1's operating temperature (GPU1 sucks in the hot air that GPU2 creates).