| Test System Configuration | |
|---|---|
| Intel CPU | Intel Core i7-3770K (Ivy Bridge): 3.5 GHz, 8 MB Shared L3 Cache, LGA 1155, Overclocked to 4.4 GHz at 1.25 V |
| Intel Motherboard | Asus Sabertooth Z77, BIOS 1504 (08/03/2012) |
| Intel CPU Cooler | Thermalright MUX-120 w/Zalman ZM-STG1 Paste |
| AMD CPU | AMD FX-8350 (Vishera): 4.0 GHz, 8 MB Shared L3 Cache, Socket AM3+, Overclocked to 4.4 GHz at 1.35 V |
| AMD Motherboard | Asus Sabertooth 990FX, BIOS 1604 (10/24/2012) |
| AMD CPU Cooler | Sunbeamtech Core-Contact Freezer w/Zalman ZM-STG1 Paste |
| RAM | G.Skill F3-17600CL9Q-16GBXLD (16 GB) DDR3-2200 CAS 9-11-9-36 1.65 V |
| AMD Graphics | 2 x MSI R7970-2PMD3GD5/OC: 1,010 MHz GPU, GDDR5-5500 |
| Nvidia Graphics | 2 x Gigabyte GV-N680OC-4GD: 1,137 MHz GPU, GDDR5-6008 |
| Hard Drive | Mushkin Chronos Deluxe DX 240 GB, SATA 6Gb/s SSD |
| Sound | Integrated HD Audio |
| Network | Integrated Gigabit Networking |
| Power | Seasonic X760 SS-760KM: ATX12V v2.3, EPS12V, 80 PLUS Gold |

| Software | |
|---|---|
| OS | Microsoft Windows 8 Professional RTM x64 |
| AMD Graphics | AMD Catalyst 12.10 |
| Nvidia Graphics | Nvidia GeForce 310.90 |
Great performance and quick installation keep Thermalright’s MUX-120 and Sunbeamtech’s Core Contact Freezer in my inventory of favorite testing components. The brackets that come with these older samples make them non-interchangeable, however.
G.Skill’s F3-17600CL9Q-16GBXLD carries a remarkable DDR3-2200 CAS 9 rating and uses Intel XMP technology for semi-automatic configuration. Because the Sabertooth 990FX is a non-Intel platform, it applies XMP values through Asus' DOCP setting.

Seasonic’s X760 provides the consistent efficiency required to assess platform power differences.
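To see why consistent supply efficiency matters when comparing platform power, note that wall readings include conversion losses. As a back-of-the-envelope sketch (the 300 W reading below is hypothetical, not a measurement from this test), 80 PLUS Gold guarantees roughly 87/90/87 percent efficiency at 20/50/100 percent load on 115 V input:

```python
# Estimate DC load delivered to components from a measured wall (AC) draw,
# assuming an efficiency figure from the 80 PLUS Gold floors
# (87% at 20% load, 90% at 50% load, 87% at 100% load, 115 V input).
def dc_load(ac_watts, efficiency):
    """DC power delivered to components for a given wall draw."""
    return ac_watts * efficiency

# e.g. a hypothetical 300 W wall reading near 50% load on a 760 W unit:
print(round(dc_load(300, 0.90)))  # 270
```

Because both test platforms draw through the same supply, the efficiency term largely cancels out of the comparison, which is the point of using one consistent unit.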

Keeping the benchmark set from our previous round cut back on testing time, though it also meant using older drivers. The thing to remember is that we aren't trying to compare the performance of AMD's and Nvidia's graphics cards; we break each GPU vendor out into separate charts to prevent that comparison. Rather, we're interested in how each configuration behaves when attached to AMD- and Intel-based platforms.
| 3D Game Benchmarks | |
|---|---|
| Aliens vs Predator | Using AvP Tool v 1.03, SSAO/Tessellation/Shadows On Test Set 1: High Textures, No AA, 4x AF Test Set 2: Very High Textures, 4x AA, 16x AF |
| Battlefield 3 | Campaign Mode, "Going Hunting" 90-Second Fraps Test Set 1: Medium Quality Defaults (No AA, 4x AF) Test Set 2: Ultra Quality Defaults (4x AA, 16x AF) |
| F1 2012 | Steam version, In-game benchmark Test Set 1: High Quality Preset, No AA Test Set 2: Ultra Quality Preset, 8x AA |
| Elder Scrolls V: Skyrim | Update 1.7, Celedon Aethirborn Level 6, 25-Second Fraps Test Set 1: DX11, High Details No AA, 8x AF, FXAA enabled Test Set 2: DX11, Ultra Details, 8x AA, 16x AF, FXAA enabled |
| Metro 2033 | Full Game, Built-In Benchmark, "Frontline" Scene Test Set 1: DX11, High, AAA, 4x AF, No PhysX, No DoF Test Set 2: DX11, Very High, 4x AA, 16x AF, No PhysX, DoF On |
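The Fraps runs above log a timestamp for every rendered frame, which is what the later frame-rate-over-time analysis is built from. As a minimal sketch of how such a log can be reduced to average FPS and a per-second frame-rate curve (the CSV layout and function names here are assumptions, not the article's actual tooling):

```python
# Sketch: derive average FPS and frame-rate-over-time from a
# Fraps-style frametimes CSV (cumulative milliseconds per frame).
import csv

def load_frametimes(path):
    """Return cumulative frame timestamps in seconds (layout assumed)."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        return [float(row[1]) / 1000.0 for row in reader]

def average_fps(stamps):
    """Frames rendered divided by elapsed time."""
    return (len(stamps) - 1) / (stamps[-1] - stamps[0])

def fps_over_time(stamps):
    """Frame count falling in each whole second of the run."""
    buckets = {}
    for t in stamps:
        buckets[int(t)] = buckets.get(int(t), 0) + 1
    return [buckets.get(s, 0) for s in range(int(stamps[-1]) + 1)]
```

A per-second curve like this is what exposes stutter that a single average hides, which is why the analysis page charts rates over time rather than one number per run.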
- Is AMD Self-Loathing?
- Test Settings And Benchmarks
- Results: Aliens Vs. Predator
- Results: Battlefield 3
- Results: F1 2012
- Results: The Elder Scrolls V: Skyrim
- Results: Metro 2033
- Frame Rate-Over-Time Analysis
- Results: 3DMark 11
- Power And Efficiency
- CPU-To-GPU Performance Scaling
- How Does FX Treat Your Graphics Card?


Nobody remembers that at the time AMD bought ATI, they already had a business partnership with Nvidia on the co-marketing of 650A chipsets (AMD Business Platform) Also at the time AMD bought ATI, ATI already had a business partnership with Intel to develop the RD600 as a replacement for the 975X. AMD's purchase left both Nvidia and Intel stranded, as it took Intel more than a year to develop a replacement for the abandoned RD600.
Depends on the cards you're using. 3-way at x8/x8/x4? Tom's Hardware did an article on how bad PCIe 2.0 x4 performed, so if you're carrying over a set of PCIe 2.0 cards from a previous system, well, I refer to the same comment that you referenced.
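The lane-count point above comes straight from the published per-lane signaling rates and encoding overheads. A quick sketch of the arithmetic (GB/s are decimal, per direction; the function name is illustrative):

```python
# Back-of-the-envelope PCIe link bandwidth per direction, from the
# spec'd per-lane rate and line-code overhead:
#   PCIe 2.0: 5 GT/s with 8b/10b encoding  -> 500 MB/s per lane
#   PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane
def pcie_bandwidth_gbps(gen, lanes):
    """Usable bandwidth in GB/s (decimal) per direction."""
    rate_gt, enc = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}[gen]
    return rate_gt * enc / 8 * lanes   # GT/s -> GB/s after encoding

print(pcie_bandwidth_gbps(2, 4))             # PCIe 2.0 x4: 2.0 GB/s
print(round(pcie_bandwidth_gbps(3, 16), 2))  # PCIe 3.0 x16: ~15.75 GB/s
```

At 2 GB/s, a PCIe 2.0 x4 slot offers a quarter of what the same card would see in a 2.0 x16 slot, which is the constraint the comment is describing.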
AMD is actually doing fine with their products, especially their GPUs.
Why there's so much hate for their CPUs I will never understand. They're cheaper, aren't they?
-The i7 is stronger, so of course it scaled better.
-The 7970 is on average a stronger card than the 680, so of course it needs a little extra CPU power.
-The differences overall were very small anyway, aside from the obvious things like Skyrim preferring Intel.
You hit the nail right on the head.
Truth. Didn't really see anything unexpected: the same games that show the FX falling behind elsewhere did the same thing in this test, Skyrim and so forth.
I know that when AMD came out with the new FX, its single-threaded performance was still not up to Intel's standards, but in many of the tests that used more cores the AMD could actually keep up with Intel's CPUs. Not as well all the time, but it's very, very easy to make these AMD CPUs look bad: just run a single-threaded and/or older game on them and presto.
I would have liked to see how the video card plays into this: if AMD were running more optimized software, would its CrossFire come out better, along with its overall effect in games that make use of it?
I mean, how long have we had more than one CPU core running now?
I know it's taken the software people a while to come up to speed, but the game board is changing and they are using more than one core more and more, so don't you think this would be important to check out as well? And maybe we'd see some new data on how AMD can use video cards when it's running software it was really designed for.
We can play this Intel single-thread line till hell freezes over, and we all know there will be no surprises as long as we do.
And haven't we seen a shift, with low-cost game setups starting to favor AMD's older CPUs because more software can run on more cores? So let's even out the playing field a bit here, OK?
It sounds like Intel has long since reached the point of diminishing returns. AMD, on the other hand, have realised that they needed a slight departure from the road they were travelling on with Bulldozer and that, at least for the next two iterations of the architecture, they might make some decent gains - multithreading stands to be boosted decently with Steamroller, whilst Excavator will add further IPC gains as well as a big power drop. At least, that's the idea.
Same price, huh?
There was no value score for these platforms, it was Intel's top mainstream CPU and AMD's top mainstream CPU. Nobody involved in the article cared about the AMD vs Intel argument, because the article was all about AMD's ability to support its own graphics cards from its own CPUs.
1) The differential is so small.
2) AMD will never sacrifice sales to special optimizations for its own parts, since that would have the effect of driving off buyers.
Either you use AMD CPUs or you don't; either you use AMD GPUs or you don't. As for value, AMD systems do well with mid-level SLI and CrossFire setups, but not so well with higher-end parts. And when you factor in the cost of building these systems, nobody really pairs top-end AMD GPUs with AMD CPUs anyway.
I have three AMD systems: the FX-8350 is running 6970s, the A10 is running 6850s, and the 1100T is running 465s. My old Athlon II X4 is running 8800s, and my i7-980X is running 580s. I have found that all those systems are balanced, so I will not be tweaking around much. You'll notice I don't use current-generation GPUs, as I find the quality of CrossFire and SLI there is a bit down.