
Crossfire HD 7850s user review

February 26, 2013 5:54:38 PM

Hello all, just here to give my personal opinion and review of the HD 7850s I have in Crossfire.
First off I have:
--One PowerColor HD 7850 2GB at the typical 860 MHz core clock and 1200 MHz memory clock
--One HIS HD 7850 2GB, also at 860 MHz/1200 MHz
--Catalyst 13.2 beta drivers

Currently I am playing at 1920 x 1080 resolution (however, I'm going to bring out the Eyefinity setup in a couple of days).
--All games I play are first-person shooters at absolute max settings, with anti-aliasing adjusted between 2x and 8x (I don't really care about anti-aliasing since I personally don't notice much of a difference).
=======================================================================================

With that being said, here are the games and the experiences I've had.

--Black Ops 2 multiplayer: 124-178 FPS
--Far Cry 3: 78-150 FPS
--Borderlands 2: 175-245 FPS, PhysX on Low
--Dishonored: 128-130 FPS
--Crysis 2: 65-102 FPS, DX11 with MaLDo high-resolution textures
--Just Cause 2: 122-280 FPS, 8x anti-aliasing
--Metro 2033: 49-82 FPS, DX11
--Crysis 3: 39-65 FPS, anti-aliasing disabled

Dolphin Emulator
--Super Smash Bros. Brawl with a 1080p texture pack: 60 FPS (100% emulation speed) almost the entire time. The majority of the time I could achieve this with a single 7850, the exception being four players on screen.
(For those familiar with this, 100% emulation speed is hard to achieve because the emulation code is not well optimized.)
=======================================================================================

Things to Keep in Mind:
--I am currently using a VIZIO 32" LCD TV with a 60Hz refresh rate, so anything over 60 FPS is not actually displayed.
--Yes, I do in fact own a Wii and an actual copy of the SSBB game.
--For the games noted above as DX11: those titles also offer lower DirectX render paths (DX10/DX9 settings) that improve performance at the cost of visual quality.
--Crossfire is NOT for everyone; there IS a bit of tweaking involved in getting what you want out of your Crossfire setup.
--My HD 7850s are connected using a single Crossfire bridge, as that's the only option my cards have; I believe all 7850s have just one Crossfire finger, but I'm not positive.
--Crossfire DOES generate a lot of heat if you are not using proper cooling and airflow/fan profiles.
+My first GPU is the PowerColor; other than in Crysis 3 its core temperature never exceeds 56C (61C for Crysis 3).
+My second GPU (HIS) never exceeds 53C in any game.
+Both cards use the stock single-fan coolers, but both have Antec Formula 6 TIM (thermal paste) applied.
+Both cards are REFERENCE design, meaning the fan pulls air in, pushes it across the PCB, and exhausts it out the back.
=======================================================================================

What about microstutter?!?
--I have yet to run into ANY microstuttering of any sort, even in games where there were some dips in FPS. I attribute this to the fact that I'm getting very high frame rates, and microstutter is said mostly to plague low-frame-rate situations, but nobody is actually certain about what causes it.
=======================================================================================

Tweaks/Issues?
--The only issue I have run into is that on the 13.2 beta drivers the "AMD HD Audio" driver is no longer present in CCC or on my computer at all. I'm using speakers, so this is not an issue for me; it would only affect those using the built-in speakers of their monitor or television.
--I use MSI Afterburner to monitor my temperatures. The only annoyance is that I must start the game, drop to the desktop, and then launch Afterburner in order for both my GPU1 and GPU2 temperatures to be displayed. It's a minor grievance, but a kink in the fluidity of gameplay nonetheless.
=======================================================================================
Verdict:
After reading countless reviews and talking with people in person about the horrors of Crossfire, I decided to dive in against the majority opinion, and I am glad I did.
While Crossfire may have its ups and downs, I've had an excellent experience thus far. I understand that Crossfire problems vary from rig to rig, and that's definitely understandable.
--Don't go the Crossfire route if you are not: |patient| technologically sound| or willing to experiment|
--DO go Crossfire if you already have a decent card and are not willing to sell it and buy a bigger one, but still want a performance boost; this is what I did.
Even though my Crossfire setup has been smooth, there are always going to be bumps with drivers and the like, so use caution.

Thanks for reading this short novel. If you have any questions about my setup I'll be happy to help!
February 26, 2013 6:07:24 PM

Thank you for sharing
February 26, 2013 6:08:46 PM

Absolutely. I saw there was a real lack of 7850 Crossfire reviews other than benchmarks, so I decided it might be useful.
March 7, 2013 3:49:28 PM

Sweet, not a dead thread...

Have you set up Eyefinity yet? I'm looking to get a 7950 but am now considering CF 7850s. I have three screens and am looking for something that can handle them. I also want to set up my HDTV over HDMI as well. You said 13.2 doesn't handle audio; do any other drivers still work? Any Eyefinity or 3+1 info would be great.
March 20, 2013 11:41:40 PM

Wow, thank you for sharing. I needed this kind of thread. I'll be Crossfiring my 7850.
March 23, 2013 2:56:54 PM

4745454b said:
Sweet, not a dead thread...

Have you set up Eyefinity yet? I'm looking to get a 7950 but am now considering CF 7850s. I have three screens and am looking for something that can handle them. I also want to set up my HDTV over HDMI as well. You said 13.2 doesn't handle audio; do any other drivers still work? Any Eyefinity or 3+1 info would be great.


Sorry, I have been busy doing a complete mod overhaul for a friend of mine. Yes, I have Eyefinity set up currently, but I only activate it for gaming. The games I've tested with it are:
Deus Ex: Human Revolution: absolute max settings, I get a minimum of 87 FPS
Just Cause 2: max settings, minimum of 90 FPS
Dishonored: max settings, minimum of 101 FPS or something close to that
Crysis 3: I think it was a minimum of 26 FPS, but as mentioned above the game still has some way to go as far as optimization
Tomb Raider (2013) in Eyefinity was a bit curious, since it has an option for dual-screen optimization but not exactly for three screens; when I got it to work there were instances (on stock clocks) where I would dip to the high 20s in FPS. Not worth it in my opinion, as in Eyefinity the game's detail is almost too much for the eyes, in a nauseating manner.

I'll have to go back through some of my other games in Eyefinity to check them out. Currently I have three monitors set up as an Eyefinity group when activated, with my Vizio TV mounted just above them, so when I retexture games I'll have Photoshop on the TV, different folders on the satellite computers, and the game itself on the screen just below the TV. It's great for the utility aspect; however, as many know, it's difficult if not impossible to do a 3+1 Eyefinity setup in which you have three monitors grouped as Eyefinity and a fourth monitor as a separate desktop.

As far as drivers go, the 13.2 betas are the best for performance by far. The lack of audio via HDMI, like I said, is really no big deal for me because I use computer speakers through the audio jacks for surround sound instead of the speakers on my TV. The 13.1 drivers, I believe, did support the audio, but we're talking a serious 30-frame drop not only in Eyefinity but in regular 1080p gameplay for games that need solid driver support from the get-go, i.e. Far Cry 3, Crysis 3, Tomb Raider, and recent games like that. Metro 2033 also gained a few frames from the move to the newest beta drivers.

You'll need an active Mini DisplayPort to HDMI adapter, a Mini DisplayPort to DVI adapter, and HDMI and DVI cables for the setup I'm currently running. If you have any more questions I'll be happy to answer; as you can read, I'm extremely pleased with my Crossfire setup.
March 23, 2013 7:57:08 PM

I bought a 7950, 2 mDP to DVI adapters, etc. I got it working. Sorta. I had all my screens up, but I couldn't find a way to get the 3 LCDs on my desk into one group with my 40" TV as an extended screen; the drivers want the TV grouped with the monitors. I said screw it, let's go play some games. NO games would load. TF2, L4D2, Audiosurf, and Blur all failed to load: they would start, then kick me back to the desktop (13.1 drivers). I removed the drivers and could play games again, but now of course I can't run Eyefinity. Going to be a long weekend over here.
March 24, 2013 3:44:43 AM

Few Oranges said:
[full review quoted above; snipped]


Disable ULPS to fix that issue. I'm a former 7850 Crossfire user myself; no microstuttering, but lots of heat :/
March 26, 2013 2:48:33 PM

JJ, what do you mean by "disable ULPS"? Is that related to my audio problem?
Yeah, no microstutter, and oddly enough little to no heat difference between the Crossfire and single-GPU setups, for my system anyway.
What GPU setup are you currently running?
April 9, 2013 5:37:24 PM

Do you have any updates on this? You seem to enjoy your setup, and I'm looking to buy another 7850 soon. Crossfire seems to be getting better with every new driver.
April 9, 2013 8:38:47 PM

easyfame said:
Do you have any updates on this? You seem to enjoy your setup, and I'm looking to buy another 7850 soon. Crossfire seems to be getting better with every new driver.


Yeah easyfame, I continually use my Crossfire setup and am fully in love with it, haha. I just played BioShock Infinite, of course at ultra settings, and never dipped below 75 FPS. Also, if you're a BF3 guy: on ultra I never dip below 59 FPS. Both of these are on the 13.3 beta drivers now. Skyrim with HD textures also plays really well; I don't remember the exact frame count, I'll have to get back to you on that.
If you're interested in 7850 Crossfire I'd say go for it; they scale almost 100% linearly as of the 13.3 beta drivers.
I'm also here to answer any questions you might have regarding the Crossfire setup!
April 11, 2013 8:14:16 PM

Few Oranges said:
[reply quoted above; snipped]


Million dollar question would be: is there micro-stutter, and how bad is it? Is it just so mild that it isn't even a bother to you, or do you have to tweak some settings? Really close to pulling the trigger, thanks!
April 11, 2013 11:44:54 PM

easyfame said:
[exchange quoted above; snipped]


Man, there is literally no microstutter; at least if there is, I haven't noticed it whatsoever, and I'm pretty picky. Even with V-sync off I notice very little if any screen tearing. Let us know what you end up going with, sir.
April 17, 2013 9:43:11 PM

Few Oranges said:
[exchange quoted above; snipped]


Well, just as an update: I decided to go for the Crossfire 7850s and so far no regrets. With RadeonPro using the dynamic frame limiter at 60, EVERY one of my games runs like butter. I would recommend this.
May 9, 2013 9:25:27 PM

easyfame said:
[exchange quoted above; snipped]


Sorry for the late reply; I have been busy with finals and whatnot. But I am very glad to hear that you are also satisfied with your Crossfire setup. It's a shame that Crossfire and SLI have gotten such negative reputations over the years for being fairly new technology. I'm currently anticipating the release of Metro: Last Light quite a bit. It's going to be a lot of fun, considering I played through all of Metro 2033 with a single 7850 and never really had too many problems. Although the majority of the time I would hover in the 40s for FPS (and I am kind of picky about anything lower than the standard 60), I didn't even notice. However, in games such as Far Cry 3 and Borderlands 2 (on very rare occasions) I couldn't stand anything lower than 60 FPS.
May 14, 2013 4:25:50 AM

Hey, just wondering how the heat is with both cards under load.
Thinking about getting another 7850 to add to my system, so your review/input is very useful.
May 14, 2013 8:41:27 AM

Bisudasu said:
Hey, just wondering how the heat is with both cards under load.
Thinking about getting another 7850 to add to my system, so your review/input is very useful.


I like to use Crysis 3 as a "benchmark" for heat because it heats up my cards more than any other game I've come across. On the master card (top card), during intense gameplay in Crysis I'll see temps reach about 60C, and my slave card reaches about 56-57C at most. This is with both GPU fans set manually at 70% and no overclocking. These are also absolute max settings in Crysis, so it does eat up a lot of the GPUs' resources.
Games like BioShock Infinite at the highest settings have my cards hovering around a max of 50C, also with the fans at 70%; even in intense gameplay situations the average temps in BioShock are 47C for the master card and 44C for the slave card.
Both my cards have Antec Nano Diamond thermal paste, which brought idle and load temps down by 5-7 degrees after the paste's burn-in. And both of my cards are the reference design, meaning they pull in ambient air from the case and exhaust the heat out of the back of the case. Unfortunately (not really unfortunate), I have yet to need to overclock the cards past 905 MHz on the core clock, so I don't have a good temperature log for overclocking these cards, but from my experience the 7850's temperatures rise only a little when overclocked compared to its stock load temps. Hope this helped; if you have any more questions I would be glad to answer!
May 14, 2013 5:10:42 PM

Few Oranges said:
[temperature details quoted above; snipped]

Wow, so it's technically the same temps as one card by itself? A lot of people made it out to be extremely hot. Are they really loud together, or about the same as one card? I'm really excited to upgrade to another 7850; I was very disheartened by all the forums saying the micro-stutter and heat were unbearable, but your experience seems exactly how I was hoping mine would be.

May 14, 2013 6:39:54 PM

One thing to keep in mind is that MS supposedly affects people differently. Not everyone can see/feel it. Do you remember CRT monitors? No one else I knew was bothered by a refresh rate of 60Hz, but I'd sit at my mom's or a friend's computer and INSTANTLY get a headache. (I swear I could hear it as well. The sound went away as soon as 72 or 75Hz kicked in.) Just because one guy online doesn't have a problem, or even most guys online don't have a problem, doesn't mean you won't. If possible, try before you buy.
May 14, 2013 9:53:43 PM

Bisudasu said:
[follow-up questions quoted above; snipped]

Yeah, the heat is really not the problem in Crossfire that it's made out to be with most cards. Keep in mind that the 28nm process has really helped with temperatures on the 7xxx series cards. The temps are higher than with a single card, though; that will always be the case with more surface area to cool and more sources generating heat inside the case. The noise can be a bit ridiculous when both cards are set to 70% or higher on the fan profile, but I generally play with the sound high enough for it not to matter, and whenever I'm done gaming I turn the fan profiles back to auto immediately. The master card in particular generates quite a bit of noise at higher fan speeds due to the slim clearance it has, sitting right above the second card. Overall, as you yourself stated, I have optimal results for a Crossfire setup.
May 14, 2013 10:00:59 PM

4745454b said:
[post quoted above; snipped]


Yeah, that is some good advice, as there are always situations where you can have opinions at two extremes. I consider myself picky when it comes to minute details, and oddly enough the only thing I consistently get micro-stuttering in is an N64 emulator with high-resolution texture packs... (sigh) haha. Trying out Crossfire before buying would be a bit tricky, but it would ultimately be a great option if it were actually available.
For what it's worth, my brother is also running an HD 7850 2GB Crossfire setup with completely different specs and is also seeing great results. I believe his setup is an 8350 Black Edition with two Gigabyte 7850 2GB cards (the ones with two fans), so not a reference PCB. His temps are even lower than mine, but he also has a much larger case, the Cooler Master HAF 922 I believe. But then again, there are some atrocious reviews of 7850 Crossfire setups all over the web. Guess it's part of the gamble.
May 14, 2013 10:32:29 PM

Few Oranges, another thing to keep in mind isn't so much the great 28nm process, but that the 7850 isn't a high-wattage card. People tend to think that because it trades blows with the GTX 480/570 it must use a lot of power, but I think it's only around 130W. That gives it a power draw similar to the 6850, and less than the GTX 460. If you were using the higher-wattage cards like the 7950 or 7970, you'd probably see higher temps (depending on the model you use).
May 15, 2013 7:04:04 AM

4745454b said:
[post quoted above; snipped]


Well, actually, the architecture of the GCN chips is exactly why the wattage is lower on the 7xxx series cards. The more efficient the chip is (the more transistors they can cram into a smaller space), the lower the power needed to run advanced workloads. This is why you see newer cards with much higher core clocks than their predecessors, coupled with lower temps. You are correct that the 79xx cards run hotter, but that is also because they essentially double the number of stream processors (referencing the 7970) and therefore need much more power than the 78xx series cards and below.
May 16, 2013 3:25:37 PM

Few Oranges said:
[review quoted above; snipped]

About 2 months ago I purchased a Gigabyte HD 7850 (the one with 2 fans) on the assumption I would get another one if the performance wasn't good enough. I probably spent about a month doing research before I bought it. I was looking for a single card that would equal or better the performance of my SLI'd GTX 460 768 MB cards, which I bought in Jan 2011. As it turned out, the HD 7850 is only about 80-90% of the performance of my SLI'd cards at 1920 x 1080 on most of the current FPS games I like to play. Unfortunately the limitation of my GTX 460 cards is the 768 MB frame buffer (I like eye candy!). They were a lot cheaper than the 1 GB version at the time!

Most of the reviews I've read on the HD 7850 say to expect scaling of up to 70-80% from adding a second card. But I recently read an article saying that a second card should roughly double the frame rates, and that scaling is more a driver issue (assuming your system can handle both cards, of course). I'm sure improving drivers is part of AMD's Gaming Evolved program, and lately I have seen driver updates showing improvements on specific titles I like to play.
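
(A trivial bit of arithmetic, sketched in Python, to show what those scaling figures would mean in practice; the single-card number here is made up purely for illustration:)
```python
# Illustrative only: what 70%, 80%, and perfect scaling mean for frame rates.
single_card_fps = 40.0                     # assumed single HD 7850 result in some title
for scaling in (0.70, 0.80, 1.00):
    print(f"{scaling:.0%} scaling -> {single_card_fps * (1 + scaling):.0f} FPS")
# 70% -> 68 FPS, 80% -> 72 FPS, 100% -> 80 FPS
```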

Up until now, I've been umming and ahhing about purchasing another HD 7850 to Crossfire. It was your mini review that made up my mind about getting that second card! I've been a long-time reader of Tom's Hardware, and your little review prompted me to join so I could put in my 2 cents!

As for heat and power consumption, there are a couple of things you can do. Obviously a big case with many fans in the right places should be able to expel the warm air and keep temps as low as possible, but having 6 or more fans inside the case does increase the noise level somewhat. Add to that another 2-6 fans on 2 video cards, and the extra airflow noise becomes noticeable when some serious gameplay is on hand.

For me, heat and noise are more of a concern than power consumption, even though extra power means more heat. Adding a second video card will use more power than a single high-end card with similar performance, but it will also be cheaper; for this reason alone I think more people go the SLI/Xfire route, to save the dollars over a more expensive upgrade. Besides, an extra 50-100 watts of power for a couple of hours a day won't cost you more than $10-$20 for a year.
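
(Quick back-of-the-envelope check of that figure in Python; the electricity rate and hours per day are assumptions, so plug in your own:)
```python
# Rough yearly cost of the extra draw from a second card.
extra_watts = 100          # upper end of the 50-100 W estimate above
hours_per_day = 2
rate_per_kwh = 0.15        # assumed $/kWh; use your local rate

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * rate_per_kwh:.2f}/year")
# ~73 kWh/year -> about $11/year; roughly $22 at 100 W, 3 h/day, and $0.20/kWh
```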

I know whenever I get a new video card, I like to see how far I can OC it without it exceeding about 75 degrees C. I use FurMark and only test for about 3 minutes, as the temp usually stabilizes by then. FurMark pushes video cards harder than playing a full-on FPS game like Crysis, so I know the temps won't get that high when gaming. As it stands, a lot of the new video cards coming out today are already factory overclocked, and one thing I've noticed is that even a small increase from there can cause the power consumption and temps to go up substantially. With my SLI'd GTX 460s, the highest clock speed was 775 MHz before artifacting was a problem, but the temps were too high for my liking. So then I thought I would return to stock speed and undervolt! Yes, undervolt! I was amazed! I went from a core voltage of 1.0 V down to 0.875 V, which brought my system's power consumption down from 400 watts to 320 watts (40 watts for each card). And to top that off, temps went down by 6-8 degrees C! Less heat, slower-running fans. Ah, that's better! So if your card supports undervolting and your temps are too high, give it a go.
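
(For anyone curious, those numbers roughly line up with the common rule of thumb that power scales with the square of voltage; this is a simplification, and the per-card wattage below is an assumption near the GTX 460's TDP:)
```python
# Sanity check of the undervolting result using the rough "power ~ V^2" rule of thumb
# (leakage and current draw also change, so treat this as an estimate only).
v_stock, v_under = 1.000, 0.875
card_load_watts = 150                      # assumed per-card draw under load
scale = (v_under / v_stock) ** 2           # ~0.77
print(f"Predicted per-card drop: {card_load_watts * (1 - scale):.0f} W")
# ~35 W predicted vs ~40 W observed per card, so the reported savings are plausible
```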

Thanks for your thread, Few Oranges. A second 7850 coming real soon!
May 16, 2013 7:56:44 PM

Crash Course said:
[post quoted above; snipped]


Excellent! I am very glad I could help; this is precisely why I started the thread to begin with! When I was in the market for a possible second card for Crossfire, I could not find any true reviews other than theoretical benchmarks and, in one case, a specialized setup. To say the least, there was very little information, and the stuff I could find was almost always negative, or it was a reply to someone asking whether Crossfire is worth it where, of course, the majority opinion shut it down. The undervolting info is also a great find, thank you for sharing! My case is a Cooler Master HAF 912 (modded here and there with a paint job, etc.), and as you may have read, heat has never been an issue for me. Oddly enough, that was something I was petrified of when considering Crossfire. Rest assured there are no real problems (for me at least) with heat.
Please let us know how it goes with your Crossfire setup; I am curious about the two dual-fan coolers on the 7850s! As mentioned above, with Crossfire I have barely breached 60C with a slight overclock to 925 MHz on the core clock, and that is with the reference coolers, which have just a single intake fan over the card's heatsink.
May 17, 2013 12:05:05 AM

Few Oranges said:
[reply quoted above; snipped]


Few Oranges, I was about to ask you what your main rig is, but I see in your signature that you're using an i7 2600. I myself have about 10 PCs! Four of them are quads and are still quite current (about 2-3 years old, apart from some recent upgrades like video cards, CPUs, and mobos). Three of them are Phenoms: an X4 B99 ITX build in a Cooler Master ITX case (the 7850 was bought specifically for this as my portable gaming rig) and two 555 BEs (which both unlock to stable quads @ 4.0 GHz), and my beast is an i5 2500K on a Z68 Extreme4 Gen3 board. My i5 is also in a HAF 912 case, powered by an Antec 750 watt modular PSU, with an Evo 212 with 2 fans on the chip! Oh yeah, it overclocks to 4.4 GHz without breaking a sweat!

Since the Z68 Extreme4 is the only board I have that supports both SLI and Xfire (and is PCI Express 3.0 compatible), I plan to bench both the 7850s and the GTX 460s on it and get my own unbiased results. Unfortunately the i5 2500K (Sandy Bridge) does not support the newer PCI Express 3.0, and since the HD 7850 is a PCI Express 3.0 card, there might be a slight penalty with that combination! I'll also bench the 460s on one of my unlocked 555 BEs; they're both installed on an Asus EM4N98TD EVO mobo, which only supports SLI. This will give me a chance to see if there is a noticeable difference between the two CPU platforms. And while I'm at it, I might as well bench the pair of 9800GTs with Zalman coolers I picked up recently for dirt cheap!

One thing I forgot to mention was how poorly some older games (such as Far Cry and Crysis) play on an SLI/Xfire setup. Occasionally while playing one of these games, GPU usage drops to below 30%, and that's with V-sync off. I'm not sure if it's a CPU limitation (older games apparently don't use all 4 cores, so a faster dual core is better), if the game is poorly coded, or if it's a driver issue again. I've only noticed this in SLI (I can't vouch for Xfire until I get the second card). Currently my HD 7850 is in one of my unlocked Phenom II 555 BE rigs with the Asus SLI board. Just to test how well Crysis 2 is coded, I ran the game on the same level at two different CPU speeds, noting the CPU usage on a single graph. At 3.2 GHz, the CPU was consistently between 50-60%. I then dropped the speed right down to 2.0 GHz (and the core voltage down to 1.0 V on the CPU, talk about cool running) and CPU usage hovered between 70-80%! The HD 7850 was run on a 22-inch screen with a native res of 1680 x 1050 (which is 85% of the pixel count of 1920 x 1080). The second-highest quality setting was used and V-sync was on at 60 FPS. I did this multiple times and could not notice any difference in gameplay at all! So, do we really need to OC the CPU to see any difference if the games are coded really well?

As for getting a second 7850, that will definitely be in about 2-4 weeks now. I just bought myself an Align T-Rex 500E helicopter (my other ones are getting lonely!). But that's another hobby for another forum!
May 17, 2013 2:53:28 AM

If you're using V-sync and hitting 60 FPS most of the time, you will not notice microstutter. You will notice it if you push the cards too hard and the FPS stays below the V-sync cap for long enough periods. If you don't plan on using V-sync, then you will notice microstutter/frame-time variations. Having a slow CPU paired with a Crossfire setup will let the FPS dip more often, which doesn't help the situation. These are things to note for people looking at Crossfire setups.
May 17, 2013 8:06:41 AM

iam2thecrowe said:
[advice on V-sync quoted above; snipped]


Yes, I am aware of micro-stutter, but I have not experienced it myself. For a while I thought my 460s were stuttering, but it turned out to be too low a polling rate on my mouse! In Crysis 2, movement with the keyboard was silky smooth but movement with the mouse was very jerky. I don't know why it took me so long to figure it out: my old Logitech gaming mouse was only polling at 125 Hz. Once I bumped it up to 500 Hz, the problem went away.

To make sure my 460s were not stuttering, I logged frame times with FRAPS and saw no problem there. As for micro-stutter on Xfire 7850s, I have seen that the newest drivers have made a big improvement in frame times on some of the latest titles, bringing them down closer to the equivalently rated Nvidia cards and thereby reducing noticeable stutter, if any. Once I get the second 7850, I will be testing for the slowest CPU speed that does not impact game performance at full HD!
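
(If anyone wants to eyeball their own logs, here is a rough sketch of the kind of check I mean, in Python. It assumes a two-column CSV of frame number and cumulative time in milliseconds, which is what FRAPS-style frametime logs typically look like; adjust the parsing and the file name, which is hypothetical, to whatever your tool actually writes.)
```python
import csv

def frame_times(path):
    """Return per-frame times (ms) from a log of cumulative frame timestamps."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]            # skip the header row
    stamps = [float(r[1]) for r in rows]          # column 1: cumulative time in ms
    return [b - a for a, b in zip(stamps, stamps[1:])]

def stutter_report(times):
    ordered = sorted(times)
    avg = sum(times) / len(times)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]
    jumps = sum(1 for a, b in zip(times, times[1:]) if abs(b - a) > avg)
    print(f"avg {avg:.1f} ms, 99th percentile {p99:.1f} ms, large frame-to-frame jumps: {jumps}")

# stutter_report(frame_times("crysis2_frametimes.csv"))   # hypothetical log file name
```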

The only reason I use V-sync is that I notice tearing on the screen quite easily. I know the FPS goes up to well over a hundred at times with V-sync off, but even tearing at high FPS is annoying to my eyes; it makes the motion seem jerky to me!
May 17, 2013 10:54:44 AM

One of several issues with Crossfire is "runt" frames.... While a FPS monitor may be showing a high FPS number, what it is actually counting as a whole single frame is often a very small part of the screen, just a fraction of a frame (in the Frames Per Second measure). This inflates FPS readings lending the illusion that performance is higher than it actually is. When a reviewer uses proper FCAT testing techniques, what they often find is that the real or "Observed" FPS is actually no higher than a single card.

Filtering out the runt frames leads to "Practical FPS" results that are half the "Hardware FPS" (what would be measured by FRAPS, Afterburner, etc.). You can see that AMD is working on a "Prototype" driver to address the issue. Nvidia SLI does not experience the problem in any way:
http://www.tomshardware.com/reviews/radeon-hd-7990-revi...
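
(To make the runt-frame idea concrete, here is a rough sketch of how that filtering might work on a list of frame times. This is not FCAT itself; FCAT classifies runts by scanlines in captured output, and the millisecond cutoff and synthetic numbers below are just assumptions for illustration.)
```python
# Rough illustration of "runt frames": recompute FPS after ignoring frames that
# were on screen for only a tiny sliver of time.
RUNT_CUTOFF_MS = 3.0   # arbitrary assumed cutoff

def practical_fps(frame_times_ms):
    useful = [t for t in frame_times_ms if t >= RUNT_CUTOFF_MS]
    total_seconds = sum(frame_times_ms) / 1000.0
    hardware_fps = len(frame_times_ms) / total_seconds   # what a naive counter reports
    observed_fps = len(useful) / total_seconds            # runts filtered out
    return hardware_fps, observed_fps

# Example: alternating full frames and runts makes the raw counter look twice as fast.
hw, obs = practical_fps([16.0, 1.0] * 30)
print(f"hardware FPS ~{hw:.0f}, practical FPS ~{obs:.0f}")
```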
May 18, 2013 12:33:00 AM

17seconds said:
[runt-frame explanation quoted above; snipped]


You raise an interesting point here, matto17secs! It seems this is an important issue for AMD to fix since the release of their dual-GPU 7990, as it should be similar in performance to Xfire 7970s. I wonder if the thread starter is aware of this, and whether this is really an issue with the high-end cards only or a problem for the entire 7000 series.

Are you sure Nvidia cards are not affected by this to some degree? I seem to recall quite often in my SLI'd 460 setup (with V-sync on) the screen showing a constant 60 FPS, yet panning around would seem to jitter even though FRAPS showed no dip below 60 FPS! Maybe FRAPS is incorrect here? Once I get my second 7850 I will certainly be testing for this! There is nothing more annoying than jittery gameplay when the hardware should be capable of high FPS.
May 18, 2013 12:39:44 AM

Crash Course said:
[reply quoted above; snipped]


The 460s are two generations old and may have had stuttering problems back then; the newer 6xx cards and drivers don't exhibit this. Plus, panning around even with a single card can exhibit jittering if high texture detail and draw distances are used and you're filling up VRAM, if new textures are streaming in, if the game engine is poorly optimized, etc. If you're running V-sync on, which most people do, and you're hitting the V-sync limit, you have nothing to worry about anyway. Just get your detail settings right so you're hitting 60 FPS most of the time.
May 18, 2013 12:45:22 AM

My 670 SLI was both cheaper and faster than my old 7950 Xfire, which was littered with problems. From my experience, SLI > Xfire. The only disadvantage was giving up the 3GB frame buffer, which would have been useful for the 1440p I play at, but then again even in Crysis 3 I still don't run out of frame buffer at 1440p maxed out.
May 18, 2013 1:41:30 AM

iam2thecrowe said:
[reply quoted above; snipped]


That's something I will have to look into further. I have to admit, for the amount of time I've had my 460s, they haven't really had that much use! Together they outperform my single 7850, but not by much. They will end up going into my second main rig once I get the second 7850. And as I said in an earlier post, I'll do some extensive benching on three dual-card setups and determine what works best!
May 18, 2013 8:29:43 AM

Crash Course said:
[reply quoted above; snipped]

PC Perspective has done an exhaustive series on these issues. They found that the problems increase as you go down the list into the mid-range Crossfire setups. They also did not find any hint of the problem on Nvidia SLI.
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...
May 18, 2013 10:36:02 AM

17seconds said:
One of several issues with Crossfire is "runt" frames. While an FPS monitor may show a high FPS number, what it actually counts as a whole frame is often only a tiny sliver of the screen, just a fraction of a frame in the frames-per-second measure. This inflates FPS readings and gives the illusion that performance is higher than it actually is. When a reviewer uses proper FCAT testing techniques, what they often find is that the real or "Observed" FPS is actually no higher than with a single card.

Filtering out the runt frames leads to "Practical FPS" results that are half the "Hardware FPS" (what would be measured by FRAPS, Afterburner, etc.). You can see that AMD is working on a "Prototype" driver to address the issue. Nvidia SLI does not experience the problem in any way:
http://www.tomshardware.com/reviews/radeon-hd-7990-revi...


While this is informative, it's outdated as of the last 4 driver releases from AMD. I know this because I have tested my setup not only on a 60Hz monitor but also on 120Hz and 240Hz monitors as well. When the frames are very high, such as the 140s or so in Borderlands on the 120Hz monitor, there is an odd sensation of fluidity in movement that you just don't get from the 60Hz refresh rate. If the "effective" frame rate were only half of the reported frame rate, there would be some choppy scenery in the game on the higher refresh rate monitors. Also, one thing to keep in mind is that although Nvidia may not seem to have these issues (I am by no means a fanboy of either company), it is well known that Nvidia Surround has more issues than Crossfire outside of the high-end GPUs such as the 680 and 690. Crossfire does suffer from driver issues upon release of fairly new cards, but they have definitely pumped out much better drivers over the past few releases.

Also, one serious thing to keep in mind with this data is that it's based on BF3, which is not only geared towards Nvidia cards, it actually uses optimizations specific to them. Their marketing even shows the Nvidia logo during startup and all that jazz. Similarly, Far Cry 3, BioShock Infinite, etc. are partnered with AMD and would get similar results in favor of AMD. I'm not rejecting the information at hand, I'm simply bringing a wider perspective to light.
May 18, 2013 10:39:17 AM

Crash Course said:
Few Oranges said:

Excellent! I am very glad I could help. This is precisely why I started the thread to begin with! When I myself was in the market for a possible second card for Crossfire, I could not find any true reviews other than theoretical benchmarks and, in one case, a specialized set. To say the least, there was very little information, and the stuff I could find was almost always negative, or some sort of feedback to someone asking if Crossfire is worth it that the majority opinion shut down. This undervolting info is also a great find, thank you for sharing! My case is a Cooler Master HAF 912 (modded here and there with a paint job, etc.) and, as you may have read, heat has never been an issue for me. Oddly enough, that was something I was petrified of when considering Crossfire. Rest assured there are no real problems (for me at least) with heat.
Please let us know how it goes with your Crossfire setup, I am curious about the two double-fan coolers on the 7850s! As mentioned above, with Crossfire I have barely breached 60C even with a slight overclock of 925MHz on the core clock. This is with the reference coolers, which are just a single intake fan over the heatsink of the card.


Few Oranges, I was about to ask what main rig you use, but I see in your signature you're running an i7 2600. I myself have about 10 PCs! Four of them are quads and are still quite current (about 2-3 years old, apart from some recent upgrades like video cards, CPUs and mobos). Three of them are Phenoms: an X4 B99 ITX build in a Cooler Master ITX case (the 7850 was bought specifically for this as my portable gaming rig) and two 555 BEs (which both unlock to stable quads at 4.0GHz), and my beast is an i5 2500K on a Z68 Extreme4 Gen3 board. My i5 is also in a HAF 912 case, powered by an Antec 750 watt modular PSU, with an Evo 212 and 2 fans on the chip! Oh yeah, it overclocks to 4.4GHz without breaking a sweat!

Since the Z68 Extreme4 is the only board I have that supports both SLI and Xfire (and is PCI Express 3.0 compatible), I plan to bench both the 7850s and GTX 460s on it and get my own unbiased results. Unfortunately the i5 2500K (Sandy Bridge) does not support the newer PCI Express 3.0. Since the HD7850 is a PCI Express 3.0 card, there might be a slight penalty with that combination! I'll also bench the 460s on one of my unlocked 555 BEs; they're both installed on an Asus EM4N98TD EVO mobo which only supports SLI. This will give me a chance to see if there is a noticeable difference between the two CPU platforms. And while I'm at it, I might as well bench the pair of 9800GTs with Zalman coolers I picked up recently for dirt cheap!

One thing I forgot to mention was how poorly some older games (such as Far Cry and Crysis) play on an SLI/Xfire setup. Occasionally while playing one of these games, GPU usage drops to below 30%, and that's with V-Sync off. I'm not sure if it's a CPU limitation (older games apparently don't use all 4 cores, so a faster dual core is better), if the game is poorly coded, or if it's a driver issue again. I've noticed this only in SLI (I can't vouch for Xfire until I get the 2nd card). Currently my HD7850 is in one of my unlocked Phenom II 555 BE rigs with the Asus SLI board. Just to test how well Crysis 2 is coded, I ran the game on the same level at 2 different speeds, noting the CPU usage on a single graph. At 3.2GHz, the CPU was consistently between 50-60%. I then dropped the speed right down to 2.0GHz (and the core down to 1.0v, talk about cool running) and CPU usage hovered between 70-80%! The HD7850 was run on a 22 inch screen with a native res of 1680 x 1050 (which is 85% of the pixel count of 1920 x 1080). The second highest quality setting was used and V-Sync was on at 60 FPS. I did this multiple times and I could not notice any difference in gameplay at all! So, do we really need to OC the CPU to see any difference if the games are coded really well?

As for getting a 2nd 7850, that will definitely be in about 2-4 weeks now. I just bought myself an Align Trex 500E Helicopter (my other ones are getting lonely!) But that's another hobby for another forum!


As far as the difference between PCIe 2.0 x16 and 3.0 x16 goes, there is seriously like a 3% difference, if even that. Very few cards actually push the bandwidth limits of PCIe 2.0 x16, let alone 3.0 x16, so no worries on that part. Also, every single game I have listed above I have played with V-Sync on to be sure I'm hitting the proper FPS. When I made this thread I was specifically referring to V-Sync off, since it wouldn't be quite as helpful if I listed all my games and just said 60 FPS for every one, lol.
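
To put rough numbers behind that, here is a quick back-of-the-envelope calculation in Python. It only multiplies the per-lane transfer rate by the encoding efficiency and lane count; the real-world gap in games is smaller still because cards rarely saturate the link.

# Rough theoretical per-direction bandwidth of an x16 slot.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (80% efficient).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~98.5% efficient).

def x16_bandwidth_gb_per_s(transfer_rate_gt, encoding_efficiency, lanes=16):
    # One transfer carries one bit; divide by 8 to get bytes.
    return transfer_rate_gt * encoding_efficiency * lanes / 8.0

gen2 = x16_bandwidth_gb_per_s(5.0, 8 / 10)      # about 8 GB/s
gen3 = x16_bandwidth_gb_per_s(8.0, 128 / 130)   # about 15.8 GB/s
print(f"PCIe 2.0 x16: {gen2:.1f} GB/s, PCIe 3.0 x16: {gen3:.1f} GB/s")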
May 18, 2013 10:46:29 AM

UPDATE: Metro Last Light
Highest settings (except I have SSAO turned off)
I'm hitting an average of 77.6 FPS (sometimes in the high 80s, sometimes in the mid 50s)
Temps are GPU1: 55-59C, GPU2: 53-56C
Also, for fun I decided to overclock my cards; the temps above are with the master card overclocked to 910MHz and the slave card at 1000MHz. The temps are outstanding for that overclock range. I have both cards set at 75% fan speed.

One thing to keep in mind when crossfiring is to overclock the slave card slightly higher than the master card, because of the information relayed in a loop from master to slave and back to the master card. This improves the response time and really helps scaling when the workload is near the 80% range.
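
If you would rather ramp the fans with temperature instead of pinning them at 75%, the mapping is just a linear interpolation between two points. Here is a small illustration in Python; in practice you would set this up graphically as a custom fan curve in MSI Afterburner, and the 40C/30% and 80C/100% endpoints are example values, not a recommendation.

# Illustration only: a simple linear fan curve mapping GPU temperature (deg C)
# to fan duty (%), of the kind normally configured in Afterburner's GUI.

def fan_percent(temp_c, t_low=40, t_high=80, fan_low=30, fan_high=100):
    if temp_c <= t_low:
        return fan_low
    if temp_c >= t_high:
        return fan_high
    # Linear interpolation between the two endpoints.
    position = (temp_c - t_low) / (t_high - t_low)
    return round(fan_low + position * (fan_high - fan_low))

for temp in (45, 55, 61, 75):
    print(f"{temp} C -> {fan_percent(temp)} % fan")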
May 18, 2013 10:49:14 AM

17seconds said:
Crash Course said:
17seconds said:
One of several issues with Crossfire is "runt" frames. While an FPS monitor may show a high FPS number, what it actually counts as a whole frame is often only a tiny sliver of the screen, just a fraction of a frame in the frames-per-second measure. This inflates FPS readings and gives the illusion that performance is higher than it actually is. When a reviewer uses proper FCAT testing techniques, what they often find is that the real or "Observed" FPS is actually no higher than with a single card.

Filtering out the runt frames leads to "Practical FPS" results that are half the "Hardware FPS" (what would be measured by FRAPS, Afterburner, etc.). You can see that AMD is working on a "Prototype" driver to address the issue. Nvidia SLI does not experience the problem in any way:


You raise an interesting point here matto17secs! It seems this is an important issue for AMD to fix since the release of their dual-GPU 7990, as it would be similar in performance to Xfire 7970s! I wonder if the thread starter is aware of this. Or is this really an issue with the high-end cards only, even though it could be a problem for the entire 7000 series?

Are you sure Nvidia cards are not affected by this to some degree? I seem to recall that quite often in my SLI'd 460 setup (with V-Sync on), the screen would display a constant 60 FPS, yet panning around it would seem to jitter even though FRAPS showed no dip below 60 FPS! Maybe FRAPS is incorrect here? Once I get my 2nd 7850 I will certainly be testing for this! There is nothing more annoying than jittery game play when the hardware should be capable of high FPS.

PC Perspective has done an exhaustive series on these issues. They found that the problems increase as you go down the list into the mid-range Crossfire setups. They also did not find any hint of the problem on Nvidia SLI.
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...


It certainly isn't looking good for going the CrossFire way, is it! I had a good look at those links and I definitely agree there is a problem with CrossFire. But it really does look like it's a driver issue, and I'm sure AMD are doing their best to resolve it. Could you imagine how upset AMD video card owners (wanting to go Xfire) would be if it turns out to be a hardware flaw! Would it be possible to do a video card recall?

Here's a link to how the new Prototype driver is doing. As you can see, there is significant improvement, but still not as good as Nvidia SLi!
http://www.techngaming.com/home/news/controversial-odd/...
May 18, 2013 11:03:38 AM

Crash Course said:
17seconds said:
Crash Course said:
17seconds said:
One of several issues with Crossfire is "runt" frames. While an FPS monitor may show a high FPS number, what it actually counts as a whole frame is often only a tiny sliver of the screen, just a fraction of a frame in the frames-per-second measure. This inflates FPS readings and gives the illusion that performance is higher than it actually is. When a reviewer uses proper FCAT testing techniques, what they often find is that the real or "Observed" FPS is actually no higher than with a single card.

Filtering out the runt frames leads to "Practical FPS" results that are half the "Hardware FPS" (what would be measured by FRAPS, Afterburner, etc.). You can see that AMD is working on a "Prototype" driver to address the issue. Nvidia SLI does not experience the problem in any way:


You raise an interesting point here matto17secs! It seems this is an important issue for AMD to fix since the release of their dual-GPU 7990, as it would be similar in performance to Xfire 7970s! I wonder if the thread starter is aware of this. Or is this really an issue with the high-end cards only, even though it could be a problem for the entire 7000 series?

Are you sure Nvidia cards are not affected by this to some degree? I seem to recall that quite often in my SLI'd 460 setup (with V-Sync on), the screen would display a constant 60 FPS, yet panning around it would seem to jitter even though FRAPS showed no dip below 60 FPS! Maybe FRAPS is incorrect here? Once I get my 2nd 7850 I will certainly be testing for this! There is nothing more annoying than jittery game play when the hardware should be capable of high FPS.

PC Perspective has done an exhaustive series on these issues. They found that the problems increase as you go down the list into the mid-range Crossfire setups. They also did not find any hint of the problem on Nvidia SLI.
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...


It certainly isn't looking good for going the CrossFire way, is it! I had a good look at those links and I definitely agree there is a problem with CrossFire. But it really does look like it's a driver issue, and I'm sure AMD are doing their best to resolve it. Could you imagine how upset AMD video card owners (wanting to go Xfire) would be if it turns out to be a hardware flaw! Would it be possible to do a video card recall?

Here's a link to how the new Prototype driver is doing. As you can see, there is significant improvement, but still not as good as Nvidia SLi!
http://www.techngaming.com/home/news/controversial-odd/...


Lol, this is exactly what I was referring to in the above post. To put it simply, a lot of this is marketing by both companies trying to downplay the opposing company's sales. When you do real-life benchmarks of equivalent competing graphics cards, they almost ALWAYS go blow for blow in the ring. If you buy a GTX 690 and compare it to an HD 7990, you're going to get almost perfectly level performance. The same goes for most cards down the line (680 vs 7970), etc. Then most people hop on a bandwagon of sorts to claim their loyalty to a certain brand. I have both Nvidia and AMD cards. I love both brands; both have pros and cons that make up for where the other falls short. The reason I am going off on a rant about this is that there always seems to be very biased information out there, and the average consumer gets lost in things that really don't matter. My GTX 670 is a beast, I love it. But I also love my Crossfire HD 7850s. Both setups play the majority of the games I like on the highest settings, and when it comes down to that, does it really matter what other small snippets float around the web? :D
May 18, 2013 11:23:45 AM

Few Oranges said:
17seconds said:
One of several issues with Crossfire is "runt" frames. While an FPS monitor may show a high FPS number, what it actually counts as a whole frame is often only a tiny sliver of the screen, just a fraction of a frame in the frames-per-second measure. This inflates FPS readings and gives the illusion that performance is higher than it actually is. When a reviewer uses proper FCAT testing techniques, what they often find is that the real or "Observed" FPS is actually no higher than with a single card.

Filtering out the runt frames leads to "Practical FPS" results that are half the "Hardware FPS" (what would be measured by FRAPS, Afterburner, etc.). You can see that AMD is working on a "Prototype" driver to address the issue. Nvidia SLI does not experience the problem in any way:
http://www.tomshardware.com/reviews/radeon-hd-7990-revi...


While this is informative, it's outdated as of the last 4 driver releases from AMD. I know this because I have tested my setup not only on a 60Hz monitor but also on 120Hz and 240Hz monitors as well. When the frames are very high, such as the 140s or so in Borderlands on the 120Hz monitor, there is an odd sensation of fluidity in movement that you just don't get from the 60Hz refresh rate. If the "effective" frame rate were only half of the reported frame rate, there would be some choppy scenery in the game on the higher refresh rate monitors. Also, one thing to keep in mind is that although Nvidia may not seem to have these issues (I am by no means a fanboy of either company), it is well known that Nvidia Surround has more issues than Crossfire outside of the high-end GPUs such as the 680 and 690. Crossfire does suffer from driver issues upon release of fairly new cards, but they have definitely pumped out much better drivers over the past few releases.


Yes, I totally agree with you! I am not a fanboy of any company either; I mostly buy a product for Bang 4 Buck! I had my 460s for about a year on a board with a Phenom II 555 BE (unlocked as a quad) but wasn't happy with the performance. I thought an upgrade to the i5 2500K + new mobo would improve the situation, but it really didn't! That was $600 I could have spent on something else. As I said in an earlier post, dropping my Phenom CPU speed from 3.2GHz to 2.0GHz made no difference in gameplay (on Crysis 2, that's all I tested). The CPU usage did go up, but that's to be expected. So it appears the upgrade was unnecessary in this case!

I bought both the 8800GTS and the HD4870 when they were new on the market and paid top dollar, only to be disappointed with their performance within a short period of time. What we really need to see are benchmarks made on more mainstream setups, not high-end ones! That way the majority of the public (the Bang 4 Buck people like myself) will get a better indication of what performance they can afford. Also, as I said in a previous post to someone else, I'm sure the Xfire issue is driver related. Here's a link to a better driver:

http://www.techngaming.com/home/news/controversial-odd/...

I'm sure AMD spends a substantial amount of money on research and development but somehow let a few driver bugs out. They just need to catch them now! I will be getting the 2nd 7850 in about 10 days. I really can't wait!!!

Edit: Just purchased another 7850 today, should have it tomorrow the 25th! I didn't get the same card as the one I already have; I bought the HIS Radeon HD7850 IceQ X Turbo 2GB (it was a bit cheaper and has 2 very thick heat-pipes). I will be mating it with the Gigabyte 2GB OC (with 2 fans). So stay tuned for some additional benchmarks sometime next week!
May 29, 2013 10:12:18 PM

Okay, this will be just a short post (I hope) as I haven't had the time to do the extensive benching I'd hoped for.

Delivery was slow for some reason and I didn't receive the card until the 28th. I spent a whole night trying to adjust the voltage on the second card to 1.075v like the Gigabyte card (using AB, Trixx, etc.), but it appears to always boost to 1.210v as a minimum when in 3D. I have flashed several firmwares to no avail! Both cards are stable at 1150MHz with the cores at 1.225v. But I will mainly use the stock clocks of 1000MHz on the core and 4800MHz on the memory, because for an OC of about 10-15%, system power went from 350 watts to 450 watts and temps rose 10-12 degrees C while using FurMark! I have noticed that in Xfire mode, increasing memory speed to say 5200MHz made very little to no difference in frame rates while using FurMark. Scaling is very close to 100%, again while using FurMark. I prefer to use minimum FPS to gauge performance, and with the settings I used, minimum FPS went from 80 with one card to 158 in Xfire!

I only had a chance to test Xfire with Crysis 2, as the Crysis games are my favourite first person shooters! I had to install the 1.9 patch as flickering was an issue, and I only tested with DX9 (will install the DX11 patch soon). The res was only 1680 x 1050 (about 15% fewer pixels than Full HD). Everything was on Ultra settings; with V-Sync off, frames were around 90-150. With V-Sync on at 60 FPS, what can I say! As Few Oranges said in a previous post, there is a strange sense of smoothness, even during intense action! It's this smoothness which makes aiming so much easier (this will make my hit/miss ratio improve a little!). According to AB, GPU load was between 40-60% and my watt meter showed system power around 250 watts. Temps were also good at a little over 50C for both cards! The top card has 2 fans, the bottom card only 1. I set my fan speed percentage to match the card's temperature in Celsius, such as 50% at 50C.

Before I finish this post I must say one thing: during gameplay of Crysis 2 with V-Sync off, I had AfterBurner displaying frame times. While I was around 100-120 FPS, I could see the readout flickering between two values of around 10 and 20 msec, and 20 msec is only 50 FPS. I don't know how Xfire renders the frames; if it's alternate frame rendering then it's probably the 2nd card causing some sort of delay while rendering. More than likely a driver issue, and I'm sure it's getting attention! If there are runt frames, I certainly didn't notice them. But tearing at well over 100 FPS is annoying to my eyes, with V-Sync off of course. I'm pretty sure it's not caused by runt frames, as my SLI rig gives me the same sensation with V-Sync off!

I did play around a little with BF3 and CoD MW3, and the smoothness of gameplay is absolutely amazing and well worth going Xfire, IMHO. I plan on getting a single 7970 (when the prices drop) before the end of the year to go in my gaming cube and compare it to my Xfire 7850s.

But for now, that's all I can say until I do more benching!
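
That 10 ms / 20 ms flicker is easy to quantify if you dump the frame times out of Afterburner or FRAPS into a plain list. Here is a minimal sketch in Python; the example numbers simply reproduce the alternation described above, and this is a rough indicator rather than a proper frame-pacing analysis.

# Hedged sketch: measure how strongly consecutive frame times alternate,
# which is the usual AFR fast-frame/slow-frame signature.
# frame_times_ms is assumed to be a list of consecutive frame times in milliseconds.

def alternation_report(frame_times_ms):
    swings = [abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    avg_time = sum(frame_times_ms) / len(frame_times_ms)
    avg_swing = sum(swings) / len(swings)
    return {
        "average frame time (ms)": round(avg_time, 1),
        "average FPS": round(1000.0 / avg_time, 1),
        "frame-to-frame swing (ms)": round(avg_swing, 1),
        "swing as % of frame time": round(100.0 * avg_swing / avg_time, 1),
    }

if __name__ == "__main__":
    # Frames bouncing between roughly 10 ms and 20 ms, as described above.
    trace = [10.0, 20.0] * 200
    for name, value in alternation_report(trace).items():
        print(f"{name}: {value}")

A steady single card would show a swing of only a few percent; a trace like this averages about 67 FPS yet swings by roughly two thirds of a frame time on every frame, which is the sort of thing that can read as stutter even when the FPS counter looks fine.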
May 30, 2013 1:36:08 AM

Crash Course said:
Okay, this will be just a short post (I hope) as I haven't had the time to do the extensive benching I'd hoped for.

Delivery was slow for some reason and I didn't receive the card until the 28th. I spent a whole night trying to adjust the voltage on the second card to 1.075v like the Gigabyte card (using AB, Trixx, etc.), but it appears to always boost to 1.210v as a minimum when in 3D. I have flashed several firmwares to no avail! Both cards are stable at 1150MHz with the cores at 1.225v. But I will mainly use the stock clocks of 1000MHz on the core and 4800MHz on the memory, because for an OC of about 10-15%, system power went from 350 watts to 450 watts and temps rose 10-12 degrees C while using FurMark! I have noticed that in Xfire mode, increasing memory speed to say 5200MHz made very little to no difference in frame rates while using FurMark. Scaling is very close to 100%, again while using FurMark. I prefer to use minimum FPS to gauge performance, and with the settings I used, minimum FPS went from 80 with one card to 158 in Xfire!

I only had a chance to test Xfire with Crysis 2, as the Crysis games are my favourite first person shooters! I had to install the 1.9 patch as flickering was an issue, and I only tested with DX9 (will install the DX11 patch soon). The res was only 1680 x 1050 (about 15% fewer pixels than Full HD). Everything was on Ultra settings; with V-Sync off, frames were around 90-150. With V-Sync on at 60 FPS, what can I say! As Few Oranges said in a previous post, there is a strange sense of smoothness, even during intense action! It's this smoothness which makes aiming so much easier (this will make my hit/miss ratio improve a little!). According to AB, GPU load was between 40-60% and my watt meter showed system power around 250 watts. Temps were also good at a little over 50C for both cards! The top card has 2 fans, the bottom card only 1. I set my fan speed percentage to match the card's temperature in Celsius, such as 50% at 50C.

Before I finish this post I must say one thing: during gameplay of Crysis 2 with V-Sync off, I had AfterBurner displaying frame times. While I was around 100-120 FPS, I could see the readout flickering between two values of around 10 and 20 msec, and 20 msec is only 50 FPS. I don't know how Xfire renders the frames; if it's alternate frame rendering then it's probably the 2nd card causing some sort of delay while rendering. More than likely a driver issue, and I'm sure it's getting attention! If there are runt frames, I certainly didn't notice them. But tearing at well over 100 FPS is annoying to my eyes, with V-Sync off of course. I'm pretty sure it's not caused by runt frames, as my SLI rig gives me the same sensation with V-Sync off!

I did play around a little with BF3 and CoD MW3, and the smoothness of gameplay is absolutely amazing and well worth going Xfire, IMHO. I plan on getting a single 7970 (when the prices drop) before the end of the year to go in my gaming cube and compare it to my Xfire 7850s.

But for now, that's all I can say until I do more benching!


Yes, it's alternate frame rendering. Plenty of sites have tested this; the latency is improved with a prototype driver in some games. http://techreport.com/review/24703/amd-radeon-hd-7990-g...
Better, but not a fix for all games. I'm sure one day they will fix this Crossfire problem that has existed since the beginning but is only now being worked on thanks to sites like TechReport asking the hard questions.
May 30, 2013 1:49:51 AM

iam2thecrowe said:
Crash Course said:
Okay, this will be just a short post (I hope) as I haven't had the time to do the extensive benching I'd hoped for.

Delivery was slow for some reason and I didn't receive the card until the 28th. I spent a whole night trying to adjust the voltage on the second card to 1.075v like the Gigabyte card (using AB, Trixx, etc.), but it appears to always boost to 1.210v as a minimum when in 3D. I have flashed several firmwares to no avail! Both cards are stable at 1150MHz with the cores at 1.225v. But I will mainly use the stock clocks of 1000MHz on the core and 4800MHz on the memory, because for an OC of about 10-15%, system power went from 350 watts to 450 watts and temps rose 10-12 degrees C while using FurMark! I have noticed that in Xfire mode, increasing memory speed to say 5200MHz made very little to no difference in frame rates while using FurMark. Scaling is very close to 100%, again while using FurMark. I prefer to use minimum FPS to gauge performance, and with the settings I used, minimum FPS went from 80 with one card to 158 in Xfire!

I only had a chance to test Xfire with Crysis 2, as the Crysis games are my favourite first person shooters! I had to install the 1.9 patch as flickering was an issue, and I only tested with DX9 (will install the DX11 patch soon). The res was only 1680 x 1050 (about 15% fewer pixels than Full HD). Everything was on Ultra settings; with V-Sync off, frames were around 90-150. With V-Sync on at 60 FPS, what can I say! As Few Oranges said in a previous post, there is a strange sense of smoothness, even during intense action! It's this smoothness which makes aiming so much easier (this will make my hit/miss ratio improve a little!). According to AB, GPU load was between 40-60% and my watt meter showed system power around 250 watts. Temps were also good at a little over 50C for both cards! The top card has 2 fans, the bottom card only 1. I set my fan speed percentage to match the card's temperature in Celsius, such as 50% at 50C.

Before I finish this post I must say one thing: during gameplay of Crysis 2 with V-Sync off, I had AfterBurner displaying frame times. While I was around 100-120 FPS, I could see the readout flickering between two values of around 10 and 20 msec, and 20 msec is only 50 FPS. I don't know how Xfire renders the frames; if it's alternate frame rendering then it's probably the 2nd card causing some sort of delay while rendering. More than likely a driver issue, and I'm sure it's getting attention! If there are runt frames, I certainly didn't notice them. But tearing at well over 100 FPS is annoying to my eyes, with V-Sync off of course. I'm pretty sure it's not caused by runt frames, as my SLI rig gives me the same sensation with V-Sync off!

I did play around a little with BF3 and CoD MW3, and the smoothness of gameplay is absolutely amazing and well worth going Xfire, IMHO. I plan on getting a single 7970 (when the prices drop) before the end of the year to go in my gaming cube and compare it to my Xfire 7850s.

But for now, that's all I can say until I do more benching!


Yes, it's alternate frame rendering. Plenty of sites have tested this; the latency is improved with a prototype driver in some games. http://techreport.com/review/24703/amd-radeon-hd-7990-g...
Better, but not a fix for all games. I'm sure one day they will fix this Crossfire problem that has existed since the beginning but is only now being worked on thanks to sites like TechReport asking the hard questions.


Yes, I have seen that review before; that's how I know of the prototype driver. There are still some nasty spikes in some of the captured frames on some titles regardless of manufacturer (Nvidia/AMD). No doubt these spikes can be perceived as stutter! I suppose games like Crysis 3 can tax even the best system money can buy today. It's definitely the game engines of today that are pushing the video cards of tomorrow to the limit!
May 30, 2013 11:30:26 AM

The problem with developing the Prototype driver is how to decrease frame latency without also reducing performance. As we have seen in the charts, when you factor out the runt frames, Crossfire simply does not perform all that well. The Observed FPS is the real, actual performance of the Crossfire setup, completely stripped of the illusion provided by framerate-boosting runts.
June 5, 2013 8:09:05 PM

Crash Course said:
Okay, this will be just a short post (I hope) as I haven't had the time to do the extensive benching I'd hoped for.

Delivery was slow for some reason and I didn't receive the card until the 28th. I spent a whole night trying to adjust the voltage on the second card to 1.075v like the Gigabyte card (using AB, Trixx, etc.), but it appears to always boost to 1.210v as a minimum when in 3D. I have flashed several firmwares to no avail! Both cards are stable at 1150MHz with the cores at 1.225v. But I will mainly use the stock clocks of 1000MHz on the core and 4800MHz on the memory, because for an OC of about 10-15%, system power went from 350 watts to 450 watts and temps rose 10-12 degrees C while using FurMark! I have noticed that in Xfire mode, increasing memory speed to say 5200MHz made very little to no difference in frame rates while using FurMark. Scaling is very close to 100%, again while using FurMark. I prefer to use minimum FPS to gauge performance, and with the settings I used, minimum FPS went from 80 with one card to 158 in Xfire!

I only had a chance to test Xfire with Crysis 2, as the Crysis games are my favourite first person shooters! I had to install the 1.9 patch as flickering was an issue, and I only tested with DX9 (will install the DX11 patch soon). The res was only 1680 x 1050 (about 15% fewer pixels than Full HD). Everything was on Ultra settings; with V-Sync off, frames were around 90-150. With V-Sync on at 60 FPS, what can I say! As Few Oranges said in a previous post, there is a strange sense of smoothness, even during intense action! It's this smoothness which makes aiming so much easier (this will make my hit/miss ratio improve a little!). According to AB, GPU load was between 40-60% and my watt meter showed system power around 250 watts. Temps were also good at a little over 50C for both cards! The top card has 2 fans, the bottom card only 1. I set my fan speed percentage to match the card's temperature in Celsius, such as 50% at 50C.

Before I finish this post I must say one thing: during gameplay of Crysis 2 with V-Sync off, I had AfterBurner displaying frame times. While I was around 100-120 FPS, I could see the readout flickering between two values of around 10 and 20 msec, and 20 msec is only 50 FPS. I don't know how Xfire renders the frames; if it's alternate frame rendering then it's probably the 2nd card causing some sort of delay while rendering. More than likely a driver issue, and I'm sure it's getting attention! If there are runt frames, I certainly didn't notice them. But tearing at well over 100 FPS is annoying to my eyes, with V-Sync off of course. I'm pretty sure it's not caused by runt frames, as my SLI rig gives me the same sensation with V-Sync off!

I did play around a little with BF3 and CoD MW3, and the smoothness of gameplay is absolutely amazing and well worth going Xfire, IMHO. I plan on getting a single 7970 (when the prices drop) before the end of the year to go in my gaming cube and compare it to my Xfire 7850s.

But for now, that's all I can say until I do more benching!


Excellent to hear, Crash! I am glad you're enjoying your setup and I appreciate the feedback. To possibly decrease the latency (frame times) between the slave card and the master card for scaling purposes, always run your slave card slightly higher on the core clock and, if need be, the memory clock as well. I have noticed a major improvement in the smoothness of frame delivery, especially with GPU-intensive games like Crysis, Far Cry 3 and Metro LL. This works really well for me, so it may help decrease the frame latency for you!
June 6, 2013 8:09:39 AM

Few Oranges said:

Excellent to hear, Crash! I am glad you're enjoying your setup and I appreciate the feedback. To possibly decrease the latency (frame times) between the slave card and the master card for scaling purposes, always run your slave card slightly higher on the core clock and, if need be, the memory clock as well. I have noticed a major improvement in the smoothness of frame delivery, especially with GPU-intensive games like Crysis, Far Cry 3 and Metro LL. This works really well for me, so it may help decrease the frame latency for you!


Yes, everything is going well with the CrossFire setup. I've been really busy doing fresh installs (and backups) of Windows on the new setup and my older SLI setup (even though I've had the SLI setup for a couple of years, it really hasn't been used that much; both fans on the 460 cards don't even have any dust on them!). I plan on keeping both systems, as I will continue to do some more benching on the two different platforms. Besides, it's hard to sell technology that's a few years old, even if it is like new!

So far with the 7850s, every game I've tested I have been able to max all settings (with V-Sync on, I don't like tearing, even at high frame rates) except for Crysis 3! But I think most gamers will be in the same boat with that game! V-Sync keeps the wattage and temps down, so I use it all the time unless I'm benching, of course. It appears my Xfire 7850s are about 50-60% faster than my SLI 460s. I thought it would be more, as the 460s are only the 768MB versions. I'll do more stringent testing to verify this soon.

Few Oranges, thanks for your advice on the slave card speed. I will have a chance to do more benching after the weekend, so I'll give it a go then.
June 6, 2013 9:00:29 PM

Hey, thanks for the great and informative thread. I currently have a Gigabyte 7870 GHz Edition (it's a Pitcairn, not the chopped-down Tahiti), and I would love to match or exceed the performance of some of the higher-end cards with a Xfire setup. I do have some questions and concerns that perhaps you could help alleviate. First is the horrific performance issues Tom's has posted in the recent past. It appears as though the drivers don't even utilize the second card. This may be an architectural issue, as the only Xfire benchmarks I can find are on Tahiti GPUs. That being said, it seems none of you experience these problems at all and are posting some great FPS, so it has tempted me to go ahead with it. My second concern is that I am using an FX-8350 (I know, I know, but I've been an AMD diehard for a long time) and am worried I may hit a bottleneck. Thanks for any input.
June 6, 2013 10:04:06 PM

gity69 said:
Hey, thanks for the great and informative thread. I currently have a Gigabyte 7870 GHz Edition (it's a Pitcairn, not the chopped-down Tahiti), and I would love to match or exceed the performance of some of the higher-end cards with a Xfire setup. I do have some questions and concerns that perhaps you could help alleviate. First is the horrific performance issues Tom's has posted in the recent past. It appears as though the drivers don't even utilize the second card. This may be an architectural issue, as the only Xfire benchmarks I can find are on Tahiti GPUs. That being said, it seems none of you experience these problems at all and are posting some great FPS, so it has tempted me to go ahead with it. My second concern is that I am using an FX-8350 (I know, I know, but I've been an AMD diehard for a long time) and am worried I may hit a bottleneck. Thanks for any input.


Have a look at this as a guideline, it's still the same generation of GPU but on the high end:
http://www.anandtech.com/bench/Product/768?vs=769

The amount of FPS you will get depends on the title and the resolution you play at. As you can see, scaling will vary a bit. I've read quite a few articles claiming that 100% scaling is theoretical, but you can see that some benchmarks actually show 100% scaling is possible. So much for theoretical! I believe that as long as the hardware can push the data through quickly enough, 100% scaling should be possible on most titles.
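
For reference, the scaling figure in those charts is just the dual-card result divided by the single-card result, minus one. Here is a tiny Python example of the arithmetic; the first pair of numbers mirrors the FurMark minimum-FPS figures quoted earlier in the thread, while the second pair is hypothetical.

# CrossFire scaling expressed as the extra performance over a single card.
# The first FPS pair mirrors the FurMark minimums mentioned earlier in this
# thread (80 FPS on one card, 158 in Xfire); the second pair is hypothetical.

def scaling_percent(single_fps, dual_fps):
    return 100.0 * (dual_fps / single_fps - 1.0)

print(f"{scaling_percent(80, 158):.0f}% scaling")   # ~98%, close to the ideal 100%
print(f"{scaling_percent(60, 102):.0f}% scaling")   # 70%, for a hypothetical pair of results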

Anyway, what I suggest you do is play some of your games at the quality/resolution you like and keep a copy of Task Manager in the background while playing. Make sure you have the Performance tab active, showing one graph for all CPU cores. After playing your game for a minute or so, swap over to the graph. If your usage is around 60% or below, your CPU should be OK with the 2nd card. If you haven't OCed your CPU at all, then you still have plenty of headroom!

Oh, and one last thing: make sure your motherboard is CrossFire capable!

Edit: In response to your concerns about Xfire issues, I have not experienced any problems like stuttering/runt frames. They are not visible to my eyes, anyway. Driver updates do appear to be improving Xfire performance, at least in benchmarks. Heck, even my trusty old GTX 460 768MB cards are performing better than they were 2 years ago with newer drivers! But I have to admit, I don't run my gaming rigs on XP or Vista anymore. It's 64-bit Seven all the way!
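
If you want something more repeatable than eyeballing Task Manager, a tiny logger like the sketch below does the same job (Python with the third-party psutil package installed; start it, play for a few minutes, then look at the CSV afterwards). The 60% rule of thumb is the advice from the post above, not something the script enforces.

# Hedged sketch: log overall and per-core CPU usage once a second while a game runs.
# Requires the third-party psutil package (pip install psutil). Writes a small CSV
# you can inspect afterwards to see whether the CPU was ever the limiting factor.

import csv
import time

import psutil

def log_cpu_usage(path="cpu_log.csv", duration_seconds=300):
    with open(path, "w", newline="") as log_file:
        writer = csv.writer(log_file)
        writer.writerow(["elapsed_s", "total_percent", "per_core_percent"])
        start = time.time()
        while time.time() - start < duration_seconds:
            per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks for ~1 s
            total = sum(per_core) / len(per_core)
            writer.writerow([round(time.time() - start), round(total, 1), per_core])

if __name__ == "__main__":
    log_cpu_usage()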
June 11, 2013 8:02:59 PM

gity69 said:
Hey, thanks for the great and informative thread. I currently have a Gigabyte 7870 GHz Edition (it's a Pitcairn, not the chopped-down Tahiti), and I would love to match or exceed the performance of some of the higher-end cards with a Xfire setup. I do have some questions and concerns that perhaps you could help alleviate. First is the horrific performance issues Tom's has posted in the recent past. It appears as though the drivers don't even utilize the second card. This may be an architectural issue, as the only Xfire benchmarks I can find are on Tahiti GPUs. That being said, it seems none of you experience these problems at all and are posting some great FPS, so it has tempted me to go ahead with it. My second concern is that I am using an FX-8350 (I know, I know, but I've been an AMD diehard for a long time) and am worried I may hit a bottleneck. Thanks for any input.


No, your CPU is nowhere near a bottleneck for this Crossfire setup; bottlenecks are really more of a concern on low-clocked dual-core systems, and most games still lean heavily on a single core at a time. For fun, though, you can always bench with your CPU at stock and then juice it a bit to see if there is really any difference between the results.
As far as the results you're referring to, those are fairly old tests and do reflect the driver performance at the time; however, we've gone through many driver revisions since then. The 13.6 Betas are the most recent at the time of this post.
As an update, if you are interested, I will be dishing out some more FPS counts on games such as "Remember Me" and "Deadpool." If you have any more questions, please feel free to ask!
June 12, 2013 4:30:07 PM

Thanks for your information and reassurance. I'm definitely looking forward to getting some delicious Xfire going on my 7870s. I will also run some benchmarks as soon as I get everything up and running, and I am looking forward to some more benchmarks from you as well.