
Is the AMD FX 8350 good for gaming - Page 4

April 2, 2013 6:25:31 PM

Yeah, I am starting to think anyone who asks "Intel vs AMD, who wins?" or something along those lines should just be referred to pages 2 and 3 of this thread... lol
April 3, 2013 4:43:14 AM

ericjohn004 said:
Yeah, I agree with you there. Some people are totally not fair to the FX. I read this review yesterday, and for everything the FX did great, they were like "well, it seems to be OK at this, but look at this single-threaded benchmark!" So I see why, if you own an FX, you'd be completely pissed at some of the reviews out there.

When I actually read Tom's review of the 8350, it surprised the hell out of me. Not only was it literally nipping at the heels of the 3770K, like you say, in some of the benchmarks, but it wasn't as bad at single-threaded tasks as I had once thought. I reread the review yesterday since we were chatting about this, and I was surprised, to say the least. And Tom's does do a very fair review. They don't even use Super Pi and some of the other stuff that's not optimized for AMD.

And like I said in my last post, I'm sure 1866MHz memory does give the 8350 an improvement. I just don't think it's quite as much as some people would like to believe. Maybe it does, IDK; I haven't read as much into it as you have.

But yeah, the 8350 is really super impressive as far as multithreading goes. And in theoretical benchmarks like SiSoft Sandra (which I hate; it gives my 3570K low scores), the 8350 does really well. And for only $179.99, I can definitely see why people get this processor. I'm going to be doing a build for my little nephew. He wants a cheaper gaming computer, and I think I'm going to stick an FX-6300 in there. You absolutely can't beat the FX-6300 for only $129.99. Can't beat it.


1) I believe that owners of a Piledriver FX chip don't care about biased reviews, because they are very happy with their builds. The point here is that future buyers can be influenced toward certain brands by biased reviews. The other day I read a typical AMD vs. Intel power consumption comparison in which the AMD looked power hungry in the graphs... except that the AMD was tested on a micro-ATX board and the Intel on a mini-ITX board. The same site has another review comparing both form factors and found about a 20W difference with the same chip, same RAM, same everything! Do I need to say more?

2) To be fair, Super Pi is optimized neither for AMD nor for Intel. From Wikipedia:

Quote:
Super PI is single threaded, so its relevance as a measure of performance in the current era of multi-core processors is diminishing quickly.


Being single-threaded, Super Pi uses roughly 25% of your i5-3570K and 12.5% of the FX-8350.
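As a back-of-the-envelope check of those utilization figures, here is a minimal Python sketch. Super Pi itself is a Windows binary; this just models the ceiling for one busy hardware thread out of however many the chip exposes (an illustration, not a measurement):

```python
def single_thread_utilization(hardware_threads: int) -> float:
    """Upper bound on total CPU utilization (in percent) for a purely
    single-threaded workload such as Super Pi: one busy hardware
    thread out of `hardware_threads`."""
    return 100.0 / hardware_threads

# Four threads on the i5-3570K (4 cores, no Hyper-Threading)
# versus eight on the FX-8350 (4 modules, 8 integer cores).
print(single_thread_utilization(4))  # 25.0
print(single_thread_utilization(8))  # 12.5
```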
April 3, 2013 8:36:09 AM

juanrga said:
ericjohn004 said:
Yeah, I agree with you there. Some people are totally not fair to the FX. I read this review yesterday, and for everything the FX did great, they were like "well, it seems to be OK at this, but look at this single-threaded benchmark!" So I see why, if you own an FX, you'd be completely pissed at some of the reviews out there.

When I actually read Tom's review of the 8350, it surprised the hell out of me. Not only was it literally nipping at the heels of the 3770K, like you say, in some of the benchmarks, but it wasn't as bad at single-threaded tasks as I had once thought. I reread the review yesterday since we were chatting about this, and I was surprised, to say the least. And Tom's does do a very fair review. They don't even use Super Pi and some of the other stuff that's not optimized for AMD.

And like I said in my last post, I'm sure 1866MHz memory does give the 8350 an improvement. I just don't think it's quite as much as some people would like to believe. Maybe it does, IDK; I haven't read as much into it as you have.

But yeah, the 8350 is really super impressive as far as multithreading goes. And in theoretical benchmarks like SiSoft Sandra (which I hate; it gives my 3570K low scores), the 8350 does really well. And for only $179.99, I can definitely see why people get this processor. I'm going to be doing a build for my little nephew. He wants a cheaper gaming computer, and I think I'm going to stick an FX-6300 in there. You absolutely can't beat the FX-6300 for only $129.99. Can't beat it.


1) I believe that owners of a Piledriver FX chip don't care about biased reviews, because they are very happy with their builds. The point here is that future buyers can be influenced toward certain brands by biased reviews. The other day I read a typical AMD vs. Intel power consumption comparison in which the AMD looked power hungry in the graphs... except that the AMD was tested on a micro-ATX board and the Intel on a mini-ITX board. The same site has another review comparing both form factors and found about a 20W difference with the same chip, same RAM, same everything! Do I need to say more?

2) To be fair, Super Pi is optimized neither for AMD nor for Intel. From Wikipedia:

Quote:
Super PI is single threaded, so its relevance as a measure of performance in the current era of multi-core processors is diminishing quickly.


Being single-threaded, Super Pi uses roughly 25% of your i5-3570K and 12.5% of the FX-8350.


Yeah, that is very true. But personally I like it because it goes to show you how fast each core can be. And with a large number of single-threaded programs still out there, the performance of a single core is still very relevant. But I believe that in 3-4 years single-threaded performance will hardly matter at all. Everything will be about multithreading, which is why in 3-4 years, the more cores you have, the better off you'll be.

How long that takes depends on how fast Intel starts making all Hyper-Threaded parts. As long as they keep making a good-value 4-core processor, programs and some games will stay the way they are. Once they start making all their chips with Hyper-Threading, programs will start to use everything that's available.

Plus, as I know everyone with an AMD likes to point out, the new PS4 IS coming with 8 cores. So now games will start to be designed for 8 cores. But that's just starting right now, and even the games optimized for 8 cores aren't really all that optimized and still perform at almost 100% on 4 cores. So I don't see this changing right away. It'll take 3-5 years.

So this will be the case in 3-5 years. I'm just glad that now, while I only have a 4-core processor, this isn't the case at the moment. Literally everything I do uses 4 cores or fewer, because I only game and surf the internet. Of course I occasionally run multithreaded benchmarks, but like I've said, my overclocked i5 beats an i7 in most multithreading tests anyway, so it doesn't matter. This is why it's important to have strong cores. I like knowing that the programs I use can't run any faster than they do on what I have.

April 3, 2013 9:39:25 AM

Well... probably, in less than 2 years, 4 cores will become the accepted standard...

But for now you are in great shape, and will be for a while yet.
April 3, 2013 5:14:30 PM

drinvis said:

Check Borderlands 2 here: http://www.xbitlabs.com/images/cpu/fx-8350-8320-6300-43...
The GPU used was a GTX 680, so it's quite clear these are not bogus links.

That's a pretty lame comment you made by saying Hitman: Absolution is not using the FX-8350's full potential. You expect every game out there to scale to eight cores at this date, really? Not every game engine is the same, and not every game is developed the same way. How many games from 2008-2012 are threaded enough to scale well across 8 cores?

And by quoting those 3DS Max benchmarks you tried to twist things in your favor again. Let me correct you: Idontcare tested with 3DS Max 2013, and there the i7-3770K was the best of the three, while the overclocked i5-3570K was as good as or better than the overclocked FX-8350. For your information, 3D rendering software can easily task all the cores to the full; just check out what Idontcare had to say about rendering in 3DS Max 2013. There is no exaggeration here. The two pics you showed from that forum are of 3DS Max 2011. Previewing things in the viewport and many other things in 3DS Max are lightly threaded, which benchmarks never show, so again the i5-3570K, with its all-round performance, is a pretty good choice. It would have been better if you had read the whole thread rather than picking out one thing, which was a benchmark on an older version of the software. And that was only quoted because you implied that an overclocked FX-8350 would be a lot better than an overclocked i5-3570K.

How many people care about which achieves the highest clock? The i5-3570K can run at the level of 4.7-4.8GHz and is more responsive to frequency by design. Most people don't buy an unlocked processor to run it at stock. In the software used in the 2013 System Builder Marathon, overclocked performance of the i5-3570K was a good 20-25% better than the FX-8350. The gap would shrink if more multi-threaded software were used, even though Tom's did a pretty good job of running many kinds of apps that many of us use daily.

Nobody ever said the FX-8350 was a bad chip for gaming. It is just that in some cases/games it cannot keep up with the i5-3570K (already-quoted cases like Civilization V, Skyrim, etc.). In my country an FX-8350 setup costs much less than an i5-3570K setup, so if I were money-limited and my sole purpose was gaming, I would take the FX-8350 and invest the rest in a better GPU. But that is not the context here. If I were not money-bound and gaming were not important, I would choose whichever of the FX-8350 and i5-3570K best suits the tasks I do.
The FX-8350 is a pretty solid value for the price, especially for video encoding and some other heavily threaded tasks. It has full instruction set support, and hasn't cut corners the way Intel did with their K-series processors by dropping VT-d.
Per-core performance is not on par with the i5-3570K. What the i5-3570K has is good all-round performance, i.e. good single-threaded as well as multi-threaded performance. I don't think either of these chips can be termed overall bad.
ericjohn004's point about gaming and how it is tested on some sites is right. Many sites test at 1280x800 to show the CPU side of things, but that is not relevant to most people.


1) I already gave you a Borderlands 2 benchmark in the same message you are replying to now. It shows a difference of about 3 FPS between the FX-8350 and the expensive i7-3770K at 1080p. Taking the standard error into account, this means a tie between the two processors.

You ask me to check Borderlands 2 at the xbitlabs link. Ok. Here is a first analysis. Their 'review' was run with the AMD processors at stock speeds, but with the Intel chips' RAM overclocked by about 17% (in MHz). Moreover, they used a memory kit with an Intel XMP profile, but did not run any AMD memory profile (AMP).

They used a top-of-the-line ASUS P8Z77-V Deluxe for the Intel chips but an ASUS Crosshair V Formula for the AMD chips, when it is precisely the V Formula-Z that was released with improvements for the Piledriver architecture. Moreover, they ran Windows 8, and whereas the Deluxe is Windows 8 ready, the Formula is not (only the Formula-Z is). After all this, they find that the i7-3770K gives about 9 FPS more than the FX-8350 at 1080p. Subtract the effect of the overclocked RAM and you get about 7-8 FPS. Taking the standard error into account, this again means a tie between the two processors.
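The "tie within the standard error" reasoning can be sketched as a tiny check: a benchmark gap only counts if it exceeds the combined run-to-run error of both results. The FPS numbers and the ±2 FPS error below are hypothetical, purely for illustration:

```python
def is_meaningful_gap(fps_a: float, fps_b: float, run_error: float) -> bool:
    """Return True only when the FPS difference exceeds the combined
    run-to-run error of both measurements (a rough 2x margin);
    otherwise treat the result as a tie."""
    return abs(fps_a - fps_b) > 2 * run_error

# Hypothetical numbers: a ~3 FPS gap with a +/-2 FPS run-to-run error.
print(is_meaningful_gap(62.0, 65.0, 2.0))  # False -> effectively a tie
```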

2) I know I said this to you before, but I will say it again one last time:

The FX-8350 is a good gaming platform, albeit current games use roughly 25-50% of the eight-core chip. An entire new generation of eight-core games is being developed thanks to upcoming consoles such as the PS4, which is based on an eight-core chip from AMD. Sony is already selling PS4 development kits based on an eight-core FX chip to game developers. Therefore, the FX-8350 will be a better gaming platform in the future.

3) I already explained to you why the Idontcare test was biased. I asked you for confirmation that 3DS Max pushes all FX cores to the maximum, but you have not given any (as far as I know, it is optimized for Intel chips but not for AMD chips).

4) I already answered the OC stuff.

5) You continue to miss an important point, which I will remark on again one last time:

Although many current applications on Windows are not heavily multi-threaded and do not use eight-core chips, many FX-8350 owners like to run two or three applications at once. It is not unusual for an FX-8350 owner to be gaming while doing some work in the background.

Do you need some owner comments to be posted here?
April 3, 2013 8:17:17 PM

One thing to remember is that while the FX has 8 threads and the i5 only 4, the i5 has faster threads, which makes comparisons like "25% of an FX chip = 50% of an i5" problematic ("Being single-threaded, Super Pi uses roughly 25% of your i5-3570K and 12.5% of the FX-8350"), simply because 20% of an i5 chip != 20% of an FX.

The difference (assuming optimized software) will only be as large as the theoretical performance difference. I know Sandra is only a theoretical test, but generally the results seem to scale to heavily threaded workloads fairly well (it's not a one-to-one ratio but rather something like 0.75). The FX will probably have an easier time hitting that theoretical maximum because modules > Hyper-Threading. But as the review shows, there are a few tests where the 8350 beats the i7; in many it still loses. http://www.anandtech.com/bench/Product/551?vs=697

http://www.tomshardware.com/reviews/fx-8350-vishera-rev...

The results generally support this. On average the 8350 is pretty much right between the i5 and the i7 in multithreaded tasks (but generally closer to the i7).
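The "faster threads vs. more threads" trade-off can be put into numbers with a rough model: multithreaded throughput is roughly per-thread score times thread count, scaled by an efficiency factor (the ~0.75 figure suggested above). The per-thread scores and the factor below are hypothetical, for illustration only:

```python
def estimated_mt_score(single_thread_score: float, threads: int,
                       efficiency: float = 0.75) -> float:
    """Rough multithreaded score: theoretical peak (per-thread score
    x thread count) scaled by an assumed real-world efficiency."""
    return single_thread_score * threads * efficiency

# Hypothetical per-thread scores: fewer, faster i5-style threads
# versus more, slower FX-style threads.
print(estimated_mt_score(100.0, 4))  # 300.0
print(estimated_mt_score(70.0, 8))   # 420.0
```

Under these made-up numbers the chip with more, slower threads comes out ahead in fully threaded work, which is exactly why a "percent of the chip" comparison between the two is misleading.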
April 4, 2013 9:55:22 AM

juanrga said:
drinvis said:

Check Borderlands 2 here: http://www.xbitlabs.com/images/cpu/fx-8350-8320-6300-43...
The GPU used was a GTX 680, so it's quite clear these are not bogus links.

That's a pretty lame comment you made by saying Hitman: Absolution is not using the FX-8350's full potential. You expect every game out there to scale to eight cores at this date, really? Not every game engine is the same, and not every game is developed the same way. How many games from 2008-2012 are threaded enough to scale well across 8 cores?

And by quoting those 3DS Max benchmarks you tried to twist things in your favor again. Let me correct you: Idontcare tested with 3DS Max 2013, and there the i7-3770K was the best of the three, while the overclocked i5-3570K was as good as or better than the overclocked FX-8350. For your information, 3D rendering software can easily task all the cores to the full; just check out what Idontcare had to say about rendering in 3DS Max 2013. There is no exaggeration here. The two pics you showed from that forum are of 3DS Max 2011. Previewing things in the viewport and many other things in 3DS Max are lightly threaded, which benchmarks never show, so again the i5-3570K, with its all-round performance, is a pretty good choice. It would have been better if you had read the whole thread rather than picking out one thing, which was a benchmark on an older version of the software. And that was only quoted because you implied that an overclocked FX-8350 would be a lot better than an overclocked i5-3570K.

How many people care about which achieves the highest clock? The i5-3570K can run at the level of 4.7-4.8GHz and is more responsive to frequency by design. Most people don't buy an unlocked processor to run it at stock. In the software used in the 2013 System Builder Marathon, overclocked performance of the i5-3570K was a good 20-25% better than the FX-8350. The gap would shrink if more multi-threaded software were used, even though Tom's did a pretty good job of running many kinds of apps that many of us use daily.

Nobody ever said the FX-8350 was a bad chip for gaming. It is just that in some cases/games it cannot keep up with the i5-3570K (already-quoted cases like Civilization V, Skyrim, etc.). In my country an FX-8350 setup costs much less than an i5-3570K setup, so if I were money-limited and my sole purpose was gaming, I would take the FX-8350 and invest the rest in a better GPU. But that is not the context here. If I were not money-bound and gaming were not important, I would choose whichever of the FX-8350 and i5-3570K best suits the tasks I do.
The FX-8350 is a pretty solid value for the price, especially for video encoding and some other heavily threaded tasks. It has full instruction set support, and hasn't cut corners the way Intel did with their K-series processors by dropping VT-d.
Per-core performance is not on par with the i5-3570K. What the i5-3570K has is good all-round performance, i.e. good single-threaded as well as multi-threaded performance. I don't think either of these chips can be termed overall bad.
ericjohn004's point about gaming and how it is tested on some sites is right. Many sites test at 1280x800 to show the CPU side of things, but that is not relevant to most people.


1) I already gave you a Borderlands 2 benchmark in the same message you are replying to now. It shows a difference of about 3 FPS between the FX-8350 and the expensive i7-3770K at 1080p. Taking the standard error into account, this means a tie between the two processors.

You ask me to check Borderlands 2 at the xbitlabs link. Ok. Here is a first analysis. Their 'review' was run with the AMD processors at stock speeds, but with the Intel chips' RAM overclocked by about 17% (in MHz). Moreover, they used a memory kit with an Intel XMP profile, but did not run any AMD memory profile (AMP).

They used a top-of-the-line ASUS P8Z77-V Deluxe for the Intel chips but an ASUS Crosshair V Formula for the AMD chips, when it is precisely the V Formula-Z that was released with improvements for the Piledriver architecture. Moreover, they ran Windows 8, and whereas the Deluxe is Windows 8 ready, the Formula is not (only the Formula-Z is). After all this, they find that the i7-3770K gives about 9 FPS more than the FX-8350 at 1080p. Subtract the effect of the overclocked RAM and you get about 7-8 FPS. Taking the standard error into account, this again means a tie between the two processors.

2) I know I said this to you before, but I will say it again one last time:

The FX-8350 is a good gaming platform, albeit current games use roughly 25-50% of the eight-core chip. An entire new generation of eight-core games is being developed thanks to upcoming consoles such as the PS4, which is based on an eight-core chip from AMD. Sony is already selling PS4 development kits based on an eight-core FX chip to game developers. Therefore, the FX-8350 will be a better gaming platform in the future.

3) I already explained to you why the Idontcare test was biased. I asked you for confirmation that 3DS Max pushes all FX cores to the maximum, but you have not given any (as far as I know, it is optimized for Intel chips but not for AMD chips).

4) I already answered the OC stuff.

5) You continue to miss an important point, which I will remark on again one last time:

Although many current applications on Windows are not heavily multi-threaded and do not use eight-core chips, many FX-8350 owners like to run two or three applications at once. It is not unusual for an FX-8350 owner to be gaming while doing some work in the background.

Do you need some owner comments to be posted here?


For the game, the AMD setup is already using 1866MHz RAM. These are some really nonsense points you are putting forward, mate. The whole Intel XMP thing you are pointing to, saying that the Intel setup was running at better clocks and gained its difference (8 FPS) because of that, is a pretty useless point. Nobody said the FX-8350 is a bad chip for gaming; it is only that in some games like Skyrim, StarCraft II, and Hitman: Absolution it falls behind the likes of i5/i7 IB processors by some amount.
Future games might be much more multithreaded in nature, especially if they do physics on the CPU, but that doesn't mean we should ignore what we have now. Xenon in the Xbox 360 was a 3-core, 6-thread processor, but games at that time, and for a long time after, didn't use that many threads, simply because the OS and software layers on a console are much different from a PC OS. Does a PC have unified memory for the CPU and GPU? Consoles like the Xbox 360 could play pretty highly graphics-demanding games with such low hardware by today's standards. Also, 4 cores are not too few for the games we have now, and don't forget the chip can be overclocked as well.
I agree that users can run several programs at once, but that may not always be the case, and we also have to look at performance in individual apps; it is not as if 4 cores can't multitask at all. Many things are still single-threaded, so single-threaded performance can't be neglected for most users. It is better to see both sides of the coin and acknowledge what is better for what. 3D rendering in professional software scales well with cores and can easily tax all cores to the maximum, which is quite obvious; Idontcare said so in his test too, as you would have noticed had you read it well. And saying that every X or Y is not optimized for this or that doesn't change the situation. 3DS Max is a very popular 3D tool among professionals, and irrespective of what you think, it will be used by them. Anyway, his tests are pretty clear and informative, and in no way biased. I have read some of the tests done by him and I believe his results; you are free to differ.
I have put my points well enough in my earlier posts. I am not in support of any product and will always choose the one that works better for my type of tasks, irrespective of brand. I made my points in the context of gaming and also other things, and I feel I was clear enough; I did give some benches, and there are quite a few that prove the point.
Enough deviation from the context.

April 4, 2013 4:10:00 PM

whyso said:
The difference (assuming optimized software) will only be as large as the theoretical performance difference. I know Sandra is only a theoretical test, but generally the results seem to scale to heavily threaded workloads fairly well (it's not a one-to-one ratio but rather something like 0.75). The FX will probably have an easier time hitting that theoretical maximum because modules > Hyper-Threading. But as the review shows, there are a few tests where the 8350 beats the i7; in many it still loses. http://www.anandtech.com/bench/Product/551?vs=697

http://www.tomshardware.com/reviews/fx-8350-vishera-rev...

The results generally support this. On average the 8350 is pretty much right between the i5 and the i7 in multithreaded tasks (but generally closer to the i7).


I already provided a set of modern OpenBenchmarking results showing the FX-8350 to be faster than the i7-3770K.
April 4, 2013 4:21:24 PM

drinvis said:

For the game, the AMD setup is already using 1866MHz RAM. These are some really nonsense points you are putting forward, mate. The whole Intel XMP thing you are pointing to, saying that the Intel setup was running at better clocks and gained its difference (8 FPS) because of that, is a pretty useless point. Nobody said the FX-8350 is a bad chip for gaming; it is only that in some games like Skyrim, StarCraft II, and Hitman: Absolution it falls behind the likes of i5/i7 IB processors by some amount.
Future games might be much more multithreaded in nature, especially if they do physics on the CPU, but that doesn't mean we should ignore what we have now. Xenon in the Xbox 360 was a 3-core, 6-thread processor, but games at that time, and for a long time after, didn't use that many threads, simply because the OS and software layers on a console are much different from a PC OS. Does a PC have unified memory for the CPU and GPU? Consoles like the Xbox 360 could play pretty highly graphics-demanding games with such low hardware by today's standards. Also, 4 cores are not too few for the games we have now, and don't forget the chip can be overclocked as well.
I agree that users can run several programs at once, but that may not always be the case, and we also have to look at performance in individual apps; it is not as if 4 cores can't multitask at all. Many things are still single-threaded, so single-threaded performance can't be neglected for most users. It is better to see both sides of the coin and acknowledge what is better for what. 3D rendering in professional software scales well with cores and can easily tax all cores to the maximum, which is quite obvious; Idontcare said so in his test too, as you would have noticed had you read it well. And saying that every X or Y is not optimized for this or that doesn't change the situation. 3DS Max is a very popular 3D tool among professionals, and irrespective of what you think, it will be used by them. Anyway, his tests are pretty clear and informative, and in no way biased. I have read some of the tests done by him and I believe his results; you are free to differ.
I have put my points well enough in my earlier posts. I am not in support of any product and will always choose the one that works better for my type of tasks, irrespective of brand. I made my points in the context of gaming and also other things, and I feel I was clear enough; I did give some benches, and there are quite a few that prove the point.
Enough deviation from the context.


1) Either you didn't read the post you are answering or you grossly misinterpreted it.

2) I notice there is again no answer to my questions.

3) I already corrected several of your points before. Not worth repeating.
April 4, 2013 6:39:26 PM

juanrga said:
whyso said:
The difference (assuming optimized software) will only be as large as the theoretical performance difference. I know Sandra is only a theoretical test, but generally the results seem to scale to heavily threaded workloads fairly well (it's not a one-to-one ratio but rather something like 0.75). The FX will probably have an easier time hitting that theoretical maximum because modules > Hyper-Threading. But as the review shows, there are a few tests where the 8350 beats the i7; in many it still loses. http://www.anandtech.com/bench/Product/551?vs=697

http://www.tomshardware.com/reviews/fx-8350-vishera-rev...

The results generally support this. On average the 8350 is pretty much right between the i5 and the i7 in multithreaded tasks (but generally closer to the i7).


I already provided a set of modern OpenBenchmarking results showing the FX-8350 to be faster than the i7-3770K.


I don't know about Intel crippling AMD, but look at it this way: it's virtually the same thing as PhysX and Nvidia (which is disgusting, but nothing new). The fact that Intel performs so well is because it spends so much money developing compilers. Intel has said they hire more software engineers than hardware engineers, and that is also a reason for their success. These compilers are like drivers: if you develop the best, cheapest, easiest-to-use ones and support them, then your CPU is going to do the best. AMD is not spending this money, so they show less gain. This does make Intel's iGPU drivers look horrendously bad in comparison.

Maybe in those OpenBenchmarking tests the 8350 does well (I don't really understand them or their accuracy, since EVERY other test I've seen out there shows AMD losing to the i7 in the x264 encode first pass and winning slightly in the second pass), but OpenBenchmarking is a fairly vague site. The reviews are done on completely different systems. Generally, when you do a comparison, you throw out the extreme outliers on either side. If you are looking for accuracy, AnandTech, The Tech Report, Tom's Hardware, etc. would be better.
April 5, 2013 8:59:22 AM

whyso said:

I don't know about Intel crippling AMD, but look at it this way: it's virtually the same thing as PhysX and Nvidia (which is disgusting, but nothing new). The fact that Intel performs so well is because it spends so much money developing compilers. Intel has said they hire more software engineers than hardware engineers, and that is also a reason for their success. These compilers are like drivers: if you develop the best, cheapest, easiest-to-use ones and support them, then your CPU is going to do the best. AMD is not spending this money, so they show less gain. This does make Intel's iGPU drivers look horrendously bad in comparison.

Maybe in those OpenBenchmarking tests the 8350 does well (I don't really understand them or their accuracy, since EVERY other test I've seen out there shows AMD losing to the i7 in the x264 encode first pass and winning slightly in the second pass), but OpenBenchmarking is a fairly vague site. The reviews are done on completely different systems. Generally, when you do a comparison, you throw out the extreme outliers on either side. If you are looking for accuracy, AnandTech, The Tech Report, Tom's Hardware, etc. would be better.


1) Software compiled with the Intel compiler or the Intel function libraries has inferior performance on AMD and VIA processors. The reason is not that the processors are inferior. The reason is that the compiler or library can make multiple versions of a piece of code, each optimized for a certain processor and instruction set, for example SSE2, SSE3, etc. The system includes a function that detects which type of CPU it is running on and chooses the optimal code path for that CPU. This is called a CPU dispatcher. However, the Intel CPU dispatcher does not only check which instruction set is supported by the CPU, it also checks the vendor ID string. If the vendor string says "GenuineIntel" then it uses the optimal code path. If the CPU is not from Intel then, in most cases, it will run the slowest possible version of the code, even if the CPU is fully compatible with a better version.

Several people have complained about this behavior for years, but Intel has refused to change its CPU dispatcher. If Intel had advertised their compiler as compatible with Intel processors only, there would probably be no complaints. The problem is that they are trying to hide what they are doing. Many software developers think the compiler is compatible with AMD processors, and in fact it is, but unbeknownst to the programmer it puts in a biased CPU dispatcher that chooses an inferior code path whenever it runs on a non-Intel processor. If programmers knew this fact they would probably use another compiler. Who wants to sell a piece of software that doesn't work well on AMD processors?
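The dispatcher behavior described above can be modeled as a toy function. The vendor strings are the real CPUID identification strings, but the path names and the single SSE3 feature check are simplified placeholders, not the actual dispatcher logic:

```python
def pick_code_path(vendor_id: str, supports_sse3: bool) -> str:
    """Toy model of a vendor-gated CPU dispatcher: the optimized
    path is selected only when the vendor string is 'GenuineIntel',
    regardless of which instruction sets the CPU actually supports."""
    if vendor_id == "GenuineIntel" and supports_sse3:
        return "sse3_optimized"
    return "baseline"  # slowest path, even on a fully capable CPU

# An AMD chip that supports SSE3 still gets the baseline path.
print(pick_code_path("AuthenticAMD", True))   # baseline
print(pick_code_path("GenuineIntel", True))   # sse3_optimized
```

A feature-based dispatcher would test only `supports_sse3`; gating on the vendor string is exactly what makes the benchmark results above misleading.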

Here you can see a demonstration using a VIA processor and PCMark. With the biased suite the VIA processor seems inferior to the Intel chip, until you change the vendor ID string to "GenuineIntel"; then the VIA processor beats the Intel processor:

http://arstechnica.com/gadgets/2008/07/atom-nano-review...

AMD won a court case against Intel over the compiler cheating:

http://www.osnews.com/story/22683/Intel_Forced_to_Remov...

There are more biased benchmark suites, such as SYSmark:

http://semiaccurate.com/2011/06/20/nvidia-amd-and-via-q...

Look at what Intel has been doing with 3DMark Vantage as well:

http://techreport.com/review/17732/intel-graphics-drive...

2) I already showed how the AnandTech reviews are particularly bad... but google "Anandtech bias" and the query suggestions will offer you four or five typical search strings :-) I don't need to say more.
April 5, 2013 9:37:59 AM

Flomps said:
I have more or less decided I'm getting it, but is it good for gaming? Can it handle games like Battlefield 3 etc. on medium-high settings? I'm going to do a lot of multitasking in the background, but this is the main thing.

Is it any better than the i5-3470 or i5-3570K? I don't care about performance per watt or overall price, just how fast it performs.

EDIT: this time accounting for performance per watt and overall price, is it any better than the i7-3770K CPU?


Save money, get the FX-6300.
April 5, 2013 11:32:34 AM

juanrga said:
whyso said:

I don't know about Intel crippling AMD, but look at it this way: it's virtually the same thing as PhysX and Nvidia (which is disgusting, but nothing new). The fact that Intel performs so well is because it spends so much money developing compilers. Intel has said they hire more software engineers than hardware engineers, and that is also a reason for their success. These compilers are like drivers: if you develop the best, cheapest, easiest-to-use ones and support them, then your CPU is going to do the best. AMD is not spending this money, so they show less gain. It does make Intel's iGPU drivers look horrendously bad in comparison, though.

Maybe. In those OpenBenchmarking tests the 8350 does well (I don't really understand them or their accuracy, since every other test I've seen out there shows AMD losing to the i7 in the x264 encode first pass and winning slightly in the second pass), but OpenBenchmarking is a fairly vague site. The reviews are done on completely different systems. Generally when you do a comparison you throw out the extreme outliers on either side. Generally, if you are looking for accuracy, AnandTech, The Tech Report, Tom's Hardware, etc. would be better.


1) Software compiled with the Intel compiler or the Intel function libraries has inferior performance on AMD and VIA processors. The reason is not that the processors are inferior. The reason is that the compiler or library can make multiple versions of a piece of code, each optimized for a certain processor and instruction set, for example SSE2, SSE3, etc. The system includes a function that detects which type of CPU it is running on and chooses the optimal code path for that CPU. This is called a CPU dispatcher. However, the Intel CPU dispatcher does not only check which instruction set is supported by the CPU, it also checks the vendor ID string. If the vendor string says "GenuineIntel" then it uses the optimal code path. If the CPU is not from Intel then, in most cases, it will run the slowest possible version of the code, even if the CPU is fully compatible with a better version.

Several people have complained about this behavior for years, but Intel have refused to change their CPU dispatcher. If Intel had advertised their compiler as compatible with Intel processors only, then there would probably be no complaints. The problem is that they are trying to hide what they are doing. Many software developers think that the compiler is compatible with AMD processors, and in fact it is, but unbeknownst to the programmer it puts in a biased CPU dispatcher that chooses an inferior code path whenever it is running on a non-Intel processor. If programmers knew this fact they would probably use another compiler. Who wants to sell a piece of software that doesn't work well on AMD processors?

Here you can see a demonstration using a VIA processor and PCMark. With the biased suite the VIA processor seems inferior to the Intel chip, until you change the vendor ID string to "GenuineIntel"; then the VIA processor beats the Intel processor:

http://arstechnica.com/gadgets/2008/07/atom-nano-review...

AMD won a court case against Intel over the compiler cheating:

http://www.osnews.com/story/22683/Intel_Forced_to_Remov...

There are more biased benchmark suites such as SYSMARK

http://semiaccurate.com/2011/06/20/nvidia-amd-and-via-q...

Look what Intel has been doing with 3DMark Vantage as well

http://techreport.com/review/17732/intel-graphics-drive...

2) I already showed how the AnandTech reviews are particularly bad... but google "Anandtech bias" and Google's query predictions will suggest four or five typical search strings :-) I don't need to say more.


? That was several years ago (more than three) and there is no evidence that it is still occurring. We can't look at an incident in the past and assume that it continues into the present.

I can type _______ bias (anything in the blank) into Google and get something. If anything I would say The Tech Report is fairly biased. Their AMD stuttering article used a stock 7950 vs a highly overclocked 660 Ti (and bashed AMD), and they often make comparisons of non-stock models.
April 5, 2013 1:32:47 PM

Anyone that thinks an 8350 is future proof needs to wake up. Yes the 8350 is a great processor, but do you really think you can have ANY cpu for 3-4 years and it still be competing? The answer is a big NO! I hear all this talk about the future. But in the future people will have way better processors than an 8350 and a 3570k. I know I'll have something better.

So the argument that AMD is more "future proof" really gets thrown out the window because nothing is future proof. I know the PS4 is COMING out, meaning it's not out yet so the games for it aren't even out. And even games that are optimized for 8 cores, aren't optimized enough because you still can't see an advantage on an AMD chip yet. Maybe that'll change, but it won't be within 2013. And by 2014 we won't even be talking about an 8350 anymore. It'll be old news. The 3570k is almost already old news with Haswell coming out.

With the VERY minor performance improvement from Haswell, I hope AMD gets a nice leg up; this way Intel will have to match them. We get better products with competition. Anyone who hates AMD hates great deals on hardware and high-performance CPUs. I love me some AMD and I hope they drastically improve IPC so they'll compete even closer with Intel in the future.
April 5, 2013 5:31:31 PM

whyso said:

? That was several years ago (more than three) and there is no evidence that it is still occurring. We can't look at an incident in the past and assume that it continues into the present.


Your answer is vague and I do not know what you are referring to, but if you mean the biased CPU dispatcher in the Intel compiler, I can assure you that it continues cheating today

http://koukishinneko.com/tag/intel-compiler-patcher/

The only difference now is that the US Federal Trade Commission obligated Intel to introduce a footnote disclaimer in its description

http://en.wikipedia.org/wiki/Intel_C%2B%2B_Compiler#Cri...
April 5, 2013 5:32:12 PM

ericjohn004 said:
Anyone that thinks an 8350 is future proof needs to wake up. Yes the 8350 is a great processor, but do you really think you can have ANY cpu for 3-4 years and it still be competing? The answer is a big NO! I hear all this talk about the future. But in the future people will have way better processors than an 8350 and a 3570k. I know I'll have something better.

So the argument that AMD is more "future proof" really gets thrown out the window because nothing is future proof. I know the PS4 is COMING out, meaning it's not out yet so the games for it aren't even out. And even games that are optimized for 8 cores, aren't optimized enough because you still can't see an advantage on an AMD chip yet. Maybe that'll change, but it won't be within 2013. And by 2014 we won't even be talking about an 8350 anymore. It'll be old news. The 3570k is almost already old news with Haswell coming out.

With the VERY minor performance improvement from Haswell, I hope AMD gets a nice leg up; this way Intel will have to match them. We get better products with competition. Anyone who hates AMD hates great deals on hardware and high-performance CPUs. I love me some AMD and I hope they drastically improve IPC so they'll compete even closer with Intel in the future.


The future-proof sentiment comes from the direct upgrade path to Steamroller without a socket change... not from the FX-8350 being so superior it will be around for 3-4 years and be dominant then. That would be a fool's errand to assume. But Steamroller will use the AM3+ socket, so you can upgrade directly.
April 5, 2013 5:33:29 PM

juanrga said:
whyso said:

? That was several years ago (more than three) and there is no evidence that it is still occurring. We can't look at an incident in the past and assume that it continues into the present.


Your answer is vague and I do not know what you are referring to, but if you mean the biased CPU dispatcher in the Intel compiler, I can assure you that it continues cheating today

http://koukishinneko.com/tag/intel-compiler-patcher/

The only difference now is that the US Federal Trade Commission obligated Intel to introduce a footnote disclaimer in its description

http://en.wikipedia.org/wiki/Intel_C%2B%2B_Compiler#Cri...


Cinebench still runs on ICC...you left that one out.
April 5, 2013 5:48:30 PM

Sony is selling PS4 development kits based on an eight-core FX chip.

I wonder if one of those kits was used by Epic Games for the Unreal Engine 4 demo that they presented some days ago.
April 5, 2013 11:07:33 PM

juanrga said:
bigj1985 said:
All this arguing about which processor is superior or whether AMD can keep up with Intel is a moot point. Current Sandy/Ivy Bridge chips destroy all FX chips when both processors are at the same clocks. Why the same clocks? Because both have identical overclocking capabilities and therefore they should be compared as such. Most people don't buy an unlocked CPU to leave it at stock.

An i5 at the SAME CLOCK as an FX 8350 will absolutely smoke it in anything using 4 cores or less and most likely come slightly behind or even mildly ahead in some multithreaded apps.

An i7 at the SAME CLOCK as an FX 8350 will DEMOLISH the FX in virtually everything outside of a very few select programs. And it will do all of this while using HALF the power and a POS $25 heatsink from Newegg. Meanwhile smoke billows from the H100 attempting to quench the power-hungry beast of an FX chip.

Lets quit acting like these chips are even in the same league here.



1) Comparing a stock FX to an overclocked Intel chip is biased.

2) The FX has superior overclocking capabilities. It is easier to overclock, more stable, and holds the world record for overclocking.

3) Rest of your points were addressed before as well.



1) Whoever said anything about comparing an overclocked Intel with a stock FX? I made it blatantly clear that the chips should be measured at the same clocks when comparing performance. Not sure where you got this from.

2) Superior overclocking abilities using what? Liquid nitrogen? Does anyone give a hoot about liquid nitrogen? No. They don't. No one uses extreme cooling methods like that for an everyday 24/7 gaming/productivity PC. Both chips have pretty much the same OC abilities when it comes to modern conventional cooling systems. Most people can maintain a 4.8-5.0 GHz OC on an FX. No one is getting a 5.6 GHz overclock on a 24/7 FX. It's just not happening, aside from a very few extremely lucky pieces of silicon. Hell, I bet most people can't even maintain a 5 GHz OC on an FX under air. And no, an FX is not "more stable" when it's overclocked; that's just BS talk.

3) No they weren't

April 5, 2013 11:14:40 PM

is the FX-8350 good for gaming? YES
is it better than an i5 for gaming? NO
WHY? because it gives lower performance than an i5 while consuming more power.
so should you buy an FX-8350 for gaming?
YES if you already own an older AM3 setup; it will be a good upgrade
NO if you own an Intel rig or are building a new PC specifically for gaming.
April 5, 2013 11:15:21 PM

8350rocks said:
bigj1985 said:
All this arguing about which processor is superior or whether AMD can keep up with Intel is a moot point. Current Sandy/Ivy Bridge chips destroy all FX chips when both processors are at the same clocks. Why the same clocks? Because both have identical overclocking capabilities and therefore they should be compared as such. Most people don't buy an unlocked CPU to leave it at stock.

An i5 at the SAME CLOCK as an FX 8350 will absolutely smoke it in anything using 4 cores or less and most likely come slightly behind or even mildly ahead in some multithreaded apps.

An i7 at the SAME CLOCK as an FX 8350 will DEMOLISH the FX in virtually everything outside of a very few select programs. And it will do all of this while using HALF the power and a POS $25 heatsink from Newegg. Meanwhile smoke billows from the H100 attempting to quench the power-hungry beast of an FX chip.



Ok, OC either one of those to 5.2-5.6 GHz....then talk about OC'ing potential being equal.





They are equal under conventional cooling methods. 99% of people are not able to maintain any type of stability outside of a quick benchmark run with an FX running at 5.2-5.6 GHz. It's NOT happening. I know for a fact stability and heat issues make it extremely difficult to even get an 8350 to run at 5 GHz without extreme cooling methods.

I know most 2600k owners can easily reach 5 GHz with 24/7 stability. So yes, all in all, under practical cooling the chips are pretty much on equal ground.
April 5, 2013 11:34:10 PM

mohit9206 said:
is the FX-8350 good for gaming? YES
is it better than an i5 for gaming? NO
WHY? because it gives lower performance than an i5 while consuming more power.
so should you buy an FX-8350 for gaming?
YES if you already own an older AM3 setup; it will be a good upgrade
NO if you own an Intel rig or are building a new PC specifically for gaming.

+1
April 6, 2013 7:13:01 AM

bigj1985 said:

1) Whoever said anything about comparing an overclocked Intel with a stock FX? I made it blatantly clear that the chips should be measured at the same clocks when comparing performance. Not sure where you got this from.


Your insistence on comparing Intel and AMD chips only at the same clocks includes the case (already made in several 'reviews' presented here) where the AMD chips run at stock clocks but the Intel ones are overclocked.

Let me be clear on this: comparing two overclocked chips where the Intel chip is more overclocked than the AMD chip is biased as well.

bigj1985 said:

2) Superior overclocking abilities using what? Liquid nitrogen? Does anyone give a hoot about liquid nitrogen? No. They don't. No one uses extreme cooling methods like that for an everyday 24/7 gaming/productivity PC. Both chips have pretty much the same OC abilities when it comes to modern conventional cooling systems. Most people can maintain a 4.8-5.0 GHz OC on an FX. No one is getting a 5.6 GHz overclock on a 24/7 FX. It's just not happening, aside from a very few extremely lucky pieces of silicon. Hell, I bet most people can't even maintain a 5 GHz OC on an FX under air. And no, an FX is not "more stable" when it's overclocked; that's just BS talk.


Using everything from plain air up to LN2. "Overclocking" includes everything from basic overclocking to extreme overclocking.

Like it or not, the AMD FX holds the world record for overclocking, with its eight cores running above 8 GHz. A world record does not mean that you will achieve that at home, but it gives an indication of superior overclocking capabilities. I also explained this before.
April 6, 2013 9:37:32 AM

juanrga said:
bigj1985 said:

1) Whoever said anything about comparing an overclocked Intel with a stock FX? I made it blatantly clear that the chips should be measured at the same clocks when comparing performance. Not sure where you got this from.


Your insistence on comparing Intel and AMD chips only at the same clocks includes the case (already made in several 'reviews' presented here) where the AMD chips run at stock clocks but the Intel ones are overclocked.

Let me be clear on this: comparing two overclocked chips where the Intel chip is more overclocked than the AMD chip is biased as well.

bigj1985 said:

2) Superior overclocking abilities using what? Liquid nitrogen? Does anyone give a hoot about liquid nitrogen? No. They don't. No one uses extreme cooling methods like that for an everyday 24/7 gaming/productivity PC. Both chips have pretty much the same OC abilities when it comes to modern conventional cooling systems. Most people can maintain a 4.8-5.0 GHz OC on an FX. No one is getting a 5.6 GHz overclock on a 24/7 FX. It's just not happening, aside from a very few extremely lucky pieces of silicon. Hell, I bet most people can't even maintain a 5 GHz OC on an FX under air. And no, an FX is not "more stable" when it's overclocked; that's just BS talk.


Using everything from plain air up to LN2. "Overclocking" includes everything from basic overclocking to extreme overclocking.

Like it or not, the AMD FX holds the world record for overclocking, with its eight cores running above 8 GHz. A world record does not mean that you will achieve that at home, but it gives an indication of superior overclocking capabilities. I also explained this before.


If we are comparing overclocking, we are comparing max overclocking (or max safe overclocking). If one chip is a better overclocker, then that should be taken into account. This is why AMD GPUs are generally superior to Nvidia GPUs: they have more overclocking headroom and scale better with overclocking. Generally we can boost a GCN GPU by a greater % than a Kepler GPU at max overclocks. There is nothing biased about this at all.

The FX was only running one or two cores at that speed. It was running at over 2 volts. It was using liquid helium. That's hardly representative of consumers.
April 6, 2013 11:52:47 AM


http://www.overclock.net/t/1318995/official-fx-8320-fx-...

There is a detailed information section in the first post showing stable overclocks and benchmarks run at that frequency, as well as system settings, OS, mobo, BIOS, chipset, etc.

The top everyday 24/7 OC is 5.6 GHz on an FX-8350. He uses an H110i cooler in a push/pull setup to keep it cool. Temps are documented.
April 6, 2013 12:07:21 PM

whyso said:
If we are comparing overclocking, we are comparing max overclocking (or max safe overclocking). If one chip is a better overclocker, then that should be taken into account. This is why AMD GPUs are generally superior to Nvidia GPUs: they have more overclocking headroom and scale better with overclocking. Generally we can boost a GCN GPU by a greater % than a Kepler GPU at max overclocks. There is nothing biased about this at all.


You did not get the point. It was not about comparing the overclocking capabilities of both chips but about him pretending that you can only compare Intel to AMD at the same clocks: e.g. an AMD at stock speed vs an overclocked Intel, which is not only ridiculous but also biased.

whyso said:

The FX was only running one or two cores at that speed. It was running at over 2 volts. It was using liquid helium. That's hardly representative of consumers.


No, I was referring to the eight-core record.

http://news.softpedia.com/news/All-8-Cores-of-the-AMD-F...

And did you read what I said about the relation between a world-record and what one can obtain at home?
April 6, 2013 1:34:53 PM

juanrga said:
whyso said:
If we are comparing overclocking, we are comparing max overclocking (or max safe overclocking). If one chip is a better overclocker, then that should be taken into account. This is why AMD GPUs are generally superior to Nvidia GPUs: they have more overclocking headroom and scale better with overclocking. Generally we can boost a GCN GPU by a greater % than a Kepler GPU at max overclocks. There is nothing biased about this at all.


You did not get the point. It was not about comparing the overclocking capabilities of both chips but about him pretending that you can only compare Intel to AMD at the same clocks: e.g. an AMD at stock speed vs an overclocked Intel, which is not only ridiculous but also biased.

whyso said:

The FX was only running one or two cores at that speed. It was running at over 2 volts. It was using liquid helium. That's hardly representative of consumers.


No. I was referring to the eight-core record

http://news.softpedia.com/news/All-8-Cores-of-the-AMD-F...

And did you read what I said about the relation between a world-record and what one can obtain at home?


Sorry, didn't see that. Yep: either compare at stock, or compare both chips at their max safe overclock.

Sorry, didn't see the eight-core record. The articles I looked up had only two cores at 8+ GHz.

It doesn't matter how superior the overclocking abilities are if they are not attainable. The Cell CPU in the PS3 was supposed to be a beast with much greater power than the competition, but it ultimately fell flat because it was hard to code for. Any enthusiast is going to want an overclock they can safely achieve at home (the on-air overclock ultimately matters most, because that is what the great majority of overclockers will be using) over something that requires liquid nitrogen.
April 6, 2013 6:20:13 PM

ericjohn004 said:
Anyone that thinks an 8350 is future proof needs to wake up. Yes the 8350 is a great processor, but do you really think you can have ANY cpu for 3-4 years and it still be competing? The answer is a big NO! I hear all this talk about the future. But in the future people will have way better processors than an 8350 and a 3570k. I know I'll have something better.

So the argument that AMD is more "future proof" really gets thrown out the window because nothing is future proof. I know the PS4 is COMING out, meaning it's not out yet so the games for it aren't even out. And even games that are optimized for 8 cores, aren't optimized enough because you still can't see an advantage on an AMD chip yet. Maybe that'll change, but it won't be within 2013. And by 2014 we won't even be talking about an 8350 anymore. It'll be old news. The 3570k is almost already old news with Haswell coming out.

With the VERY minor performance improvement from Haswell, I hope AMD gets a nice leg up; this way Intel will have to match them. We get better products with competition. Anyone who hates AMD hates great deals on hardware and high-performance CPUs. I love me some AMD and I hope they drastically improve IPC so they'll compete even closer with Intel in the future.


8350rocks said:

http://www.overclock.net/t/1318995/official-fx-8320-fx-...

There is a detailed information section in the first post showing stable overclocks and benchmarks run at that frequency as well as system settings, OS, mobo, BIOS, chipset, etc.

The top every day 24/7 OC is 5.6 GHz on FX8350. He uses H110i cooling with push/pull setup to keep it cool. Temps are documented.


He hit the silicon lottery. I can show you videos of those lucky enough to get a 5.4 GHz+ Sandy stable for 24/7 use (and it's not very rare, BTW). But that hardly means anything. My point is that most people cannot even hit 5.2 GHz stable on an FX for 24/7 use, much less 5.6. Therefore it's not representative of what a consumer can reasonably expect.

I've followed the FX threads for a while and looked into it extensively. 99% of people are not hitting 5.6 GHz, and even IF they were, the power consumption would be absolutely horrendous. To the point where you might have to upgrade your PSU, lol.
April 6, 2013 7:51:38 PM

Lots of people have stability issues with the i7-2600k @ 4.5 GHz, and many more @ 4.8 GHz, and Ivy Bridge's overclocking possibilities are poor.

However, FXs running stable 24/7 @ 5.2 GHz are relatively common. No surprise that your i7 was not selected as the best overclocking CPU:

http://www.bestcovery.com/amd-amd-fx-8120-8-core-proces...
April 6, 2013 8:03:28 PM

You know, his voltage is only 1.48 V on the 5.6 GHz OC, and several others have OCs of 5.2 GHz. I can find other CPU-Z-verified OCs that high pretty easily, but that site is by far the best documented and organized.
April 6, 2013 11:12:46 PM


Well, if the i5-3570K is so much faster at everything, then please explain why :

http://openbenchmarking.org/result/1210227-RA-AMDFX8350...

SMP NAS server: 100% faster in SMP NAS testing vs the i5-2400. 60% faster than the i5-2500k. 35% faster than the i5-3470 (noted: 15% slower than the 3770k).

John the Ripper (password cracking): it beats the i7-3770k by almost 20%, is more than twice as fast as the i5-2500k, and nearly twice the speed of the i5-3470.

Look at the Linux compile time: 8350 = 82 s, i5-2500k = 116.14 s, i5-3470 = 114.25 s. The 8350 is almost 40% faster than the i5-3470 here. It is also faster than the i7-3770K (by just a few seconds).
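A quick sanity check of the compile-time arithmetic above (illustrative only; the helper name is mine):

```python
# Verify the "almost 40% faster" claim from the compile times quoted
# above (lower compile time = faster, so compare time ratios).

def percent_faster(slow_time: float, fast_time: float) -> float:
    """How much faster (in %) the shorter run time is than the longer one."""
    return (slow_time / fast_time - 1.0) * 100.0

fx8350, i5_3470, i5_2500k = 82.0, 114.25, 116.14  # seconds, from the post
print(round(percent_faster(i5_3470, fx8350), 1))   # 39.3 -> "almost 40%"
print(round(percent_faster(i5_2500k, fx8350), 1))  # 41.6
```

So the 8350 is about 39% faster than the i5-3470 and about 42% faster than the i5-2500k in this test, matching the claim.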

These are threaded and well balanced usage cases.

Bottom line is, if you have either a multi-threaded application that balances the threads well, OR you have multiple things going on at once which will load more than 4 cores - the AMD tends to win.

On single-threaded tasks, yes Intel wins most of the time.

But single threaded is not the future, nor is it even the present.

We have been at the point where well-coded, highly threaded applications are necessary to process increasingly complex problems and data more quickly for almost a decade. IOW, the future is now.

This is a fairly famous article from ~8 years ago (2005); what the author talks about is in fact happening. Witness the very incremental single-thread performance increases over the last 3 processor generations. The curve is flattening out. The only way to get more performance is to utilize more cores.

http://www.gotw.ca/publications/concurrency-ddj.htm

"...applications will increasingly need to be concurrent if they want to fully exploit CPU throughput gains that have now started becoming available and will continue to materialize over the next several years. For example, Intel is talking about someday producing 100-core chips; a single-threaded application can exploit at most 1/100 of such a chip’s potential throughput. “Oh, performance doesn’t matter so much, computers just keep getting faster” has always been a naïve statement to be viewed with suspicion, and for the near future it will almost always be simply wrong."
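The quoted point can be made concrete with Amdahl's law, which caps the speedup a partially serial program can get from more cores (a minimal sketch; the function name is mine):

```python
# Amdahl's-law sketch of the quoted claim: the serial fraction of a
# program caps the speedup extra cores can deliver.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup of a program whose given fraction parallelizes
    perfectly across the given number of cores."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A fully serial program gains nothing from 100 cores...
print(amdahl_speedup(0.0, 100))             # 1.0
# ...and even a 95%-parallel one tops out well short of 100x.
print(round(amdahl_speedup(0.95, 100), 1))  # 16.8
```

This is exactly the article's "1/100 of the chip's potential throughput" scenario: a single-threaded workload sees a speedup of 1.0 no matter how many cores are added.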
April 7, 2013 8:18:45 AM

If you want to overclock go for the i5-3570K; otherwise go with the i5-3470. Intel CPUs are reliable.
April 7, 2013 8:43:45 AM

Shrish said:
If you want to overclock go for the i5-3570K; otherwise go with the i5-3470. Intel CPUs are reliable.


Are you really insinuating that AMD processors are not "reliable"?
April 7, 2013 4:14:04 PM

Shrish said:
If you want to overclock go for the i5-3570K; otherwise go with the i5-3470. Intel CPUs are reliable.


Yes, Intel CPUs are reliable, but Ivy Bridge chips have thermal issues due to a poor-quality IHS, and this makes overclocking difficult. Some people are able to overclock satisfactorily after manually removing the IHS from the chip, but others continue to have thermal issues even after the removal.

AMD chips are so reliable that they are used in servers and in top supercomputers such as Titan and Jaguar. AMD chips have better overclocking capabilities. The world record for overclocking was obtained by an AMD FX-8350: eight cores above 8 GHz. No Intel chip can achieve that level of overclocking due to a low-quality design.

April 7, 2013 6:12:29 PM

juanrga said:
Shrish said:
If you want to overclock go for the i5-3570K; otherwise go with the i5-3470. Intel CPUs are reliable.


Yes, Intel CPUs are reliable, but Ivy Bridge chips have thermal issues due to a poor-quality IHS, and this makes overclocking difficult. Some people are able to overclock satisfactorily after manually removing the IHS from the chip, but others continue to have thermal issues even after the removal.

AMD chips are so reliable that they are used in servers and in top supercomputers such as Titan and Jaguar. AMD chips have better overclocking capabilities. The world record for overclocking was obtained by an AMD FX-8350: eight cores above 8 GHz. No Intel chip can achieve that level of overclocking due to a low-quality design.



The second paragraph is complete bull. Supercomputers use all types of chips. Overclocking to 8 GHz is like the MHz myth: it doesn't matter how high you can go in theory; what matters is how high you can go in practice for the average overclocker.
Low-quality design? Someone is forgetting exactly how much power an overclocked 8350 uses vs an overclocked i7-3770k.

Ivy Bridge does have heat problems because of Intel's stupid decision to try to save a few dollars, but to call their design low-quality is simply wrong. If you take price out of the equation, the FX-8350 simply cannot compete with the i7-3770k.
April 7, 2013 10:01:50 PM

whyso said:
juanrga said:
Shrish said:
If you want to overclock go for the i5-3570K; otherwise go with the i5-3470. Intel CPUs are reliable.


Yes, Intel CPUs are reliable, but Ivy Bridge chips have thermal issues due to a poor-quality IHS, and this makes overclocking difficult. Some people are able to overclock satisfactorily after manually removing the IHS from the chip, but others continue to have thermal issues even after the removal.

AMD chips are so reliable that they are used in servers and in top supercomputers such as Titan and Jaguar. AMD chips have better overclocking capabilities. The world record for overclocking was obtained by an AMD FX-8350: eight cores above 8 GHz. No Intel chip can achieve that level of overclocking due to a low-quality design.


The second paragraph is complete bull. Supercomputers use all types of chips.


Really? Even non-reliable ones?

whyso said:

Overclocking to 8 GHz is like the MHz myth. It doesn't matter how high you can go in theory; what matters is how high you can go in practice for the average overclocker.


How many times do I need to explain this to you?

whyso said:

Low-quality design? Someone is forgetting exactly how much power an overclocked 8350 uses vs an overclocked i7-3770k.

Ivy Bridge does have heat problems because of Intel's stupid decision to try to save a few dollars, but to call their design low-quality is simply wrong.


So you think that "Intel's stupid decision" is not part of the design of the chip? Does it belong to the marketing dept.?

whyso said:

If you take price out of the equation, the FX-8350 simply cannot compete with the i7-3770k.


Do you mean like when the FX-8350 beat the i7-3770k in the performance tests provided to you before?
April 8, 2013 5:46:47 AM

juanrga said:
whyso said:
juanrga said:
Shrish said:
If you want to overclock go for the i5-3570K; otherwise go with the i5-3470. Intel CPUs are reliable.


Yes, Intel CPUs are reliable, but Ivy Bridge chips have thermal issues due to a poor-quality IHS, and this makes overclocking difficult. Some people are able to overclock satisfactorily after manually removing the IHS from the chip, but others continue to have thermal issues even after the removal.

AMD chips are so reliable that they are used in servers and in top supercomputers such as Titan and Jaguar. AMD chips have better overclocking capabilities. The world record for overclocking was obtained by an AMD FX-8350: eight cores above 8 GHz. No Intel chip can achieve that level of overclocking due to a low-quality design.


The second paragraph is complete bull. Supercomputers use all types of chips.


Really? Also non-reliable ones?

whyso said:

Overclocking to 8 GHz is like the MHz myth. It doesn't matter how high you can go in theory; what matters is how high you can go in practice for the average overclocker.


How many times do I need to explain this to you?

whyso said:

Low-quality design? Someone is forgetting exactly how much power an overclocked 8350 uses vs an overclocked i7-3770k.

Ivy Bridge does have heat problems because of Intel's stupid decision to try to save a few dollars, but to call their design low-quality is simply wrong.


Therefore you think that "intel's stupid decision" is not part of the design of the chip? Does it belong to the marketing dept.?

whyso said:

If you take price out of the equation, the FX-8350 simply cannot compete with the i7-3770k.


Do you mean like when the FX-8350 beat the i7-3770k in the performance tests provided to you before?


Yes, Intel actually had a slide a few years ago where they boasted about how they could cut down the quality of the stock heatsinks and save a few dollars.

Please don't cherrypick a few tests. The ones you gave me were for the most part, very,very close.

And yes, remove price from the equation and most people are going to go with the i7: much lower power consumption and fairly equivalent multithreaded performance, with much greater single-threaded performance. It also has integrated graphics and Quick Sync (so you can save a few dollars if you are not doing anything that requires a GPU).

Supercomputers use all sorts of chips. Jaguar, using Opterons, was succeeded by a computer using Xeons. Many supercomputers use chips from other companies such as IBM. Furthermore, the CPU is pretty much the most reliable part in a computer; the chance of a CPU failing in a consumer machine (assuming you are not overclocking) is pretty much nil. There is a much greater chance that the motherboard will go.
Supercomputers do not overclock their CPUs, so the argument that one is more reliable than the other because it can reach extreme frequencies is moot. What matters for a supercomputer is reliability at the conditions it operates in (stock).

And yes, 8 GHz under extreme conditions and extremely high voltages is worthless to the average person. Given two CPUs with equal IPC, I'd rather have one that could hit a max of 6 GHz under liquid helium but consistently hit 4.5 GHz under air than one that could hit 8 GHz under liquid helium but only 4 GHz under air (these are imaginary CPUs). Some might say the first is better than the second because, even though its max performance is lower, the extractable performance is higher.

This is kinda like the idea (the idea here not the specifics) of a souped up corvette vs an average car. The corvette can travel much faster (higher max overclocks) but both are limited by the speed limit (easy way to cool the chip, long term stability--2 volts is not going to last long, cost-liquid helium/nitrogen is expensive). In the end what is going to matter is performance at the speed limit (at the range where virtually everybody is going to be using the cpu).

FX is a good product at a great price, but in the end people are going to care about what THEY can get out of the chip, not what the chip is supposedly capable of under extreme conditions.
April 8, 2013 8:36:23 AM

Well, under air cooling you can get the FX-8350 to 4.8 GHz; under water cooling it would be feasible to break 5 GHz, and I have seen Prime95, Cinebench, and 3DMark benchmarks run on a 5.6 GHz 24/7 OC on an FX-8350. The voltage is only 1.48 V and the OC is very stable.
April 8, 2013 11:28:55 AM

8350rocks said:
Well, under air cooling you can get the FX-8350 to 4.8 GHz; under water cooling it would be feasible to break 5 GHz, and I have seen Prime95, Cinebench, and 3DMark benchmarks run on a 5.6 GHz 24/7 OC on an FX-8350. The voltage is only 1.48 V and the OC is very stable.


Generally speaking, you do not want to go over 1.4 V on your CPU for longevity purposes. FX can generally reach higher clocks than Ivy Bridge; however, it uses much more power when overclocked. Most Ivy Bridge CPUs can reach 4.4-4.6 GHz under air (vs. 4.6-4.8 for the 8350), and it is worth mentioning that their stock clock is lower. FX overclocks great, but it uses a ton of power.
April 8, 2013 11:34:28 AM

whyso said:
8350rocks said:
Well, under air cooling you can get the FX-8350 to 4.8 GHz; under water cooling it would be feasible to break 5 GHz, and I have seen Prime95, Cinebench, and 3DMark benchmarks run on a 5.6 GHz 24/7 OC on an FX-8350. The voltage is only 1.48 V and the OC is very stable.


Generally speaking, you do not want to go over 1.4 V on your CPU for longevity purposes. FX can generally reach higher clocks than Ivy Bridge; however, it uses much more power when overclocked. Most Ivy Bridge CPUs can reach 4.4-4.6 GHz under air (vs. 4.6-4.8 for the 8350), and it is worth mentioning that their stock clock is lower. FX overclocks great, but it uses a ton of power.


Yes, all true, but AMD came out and said that for an "everyday" overclock you should not exceed 1.55 V, as the architecture is not designed to run above that... so the voltage is within their spec for the 8350 OC mentioned above. I know Intel is less friendly with high-voltage OCs.
April 8, 2013 11:35:16 AM

whyso said:

Yes, intel actually had a slide a few years ago where they were boasting how they could cut down the quality of the stock heatsinks and save a few dollars.


"Yes" what? I asked two questions.

whyso said:

Please don't cherry-pick a few tests. The ones you gave me were, for the most part, very, very close.


You cannot select a few biased tests {*} showing a 15% difference and say that the "FX-8350 lagged behind," while saying the i7 was "very, very close" on tests where the FX was 30-70% faster.

{*} As shown before, the real difference is unnoticeable.

whyso said:

Supercomputers use all sorts of chips. Jaguar, using Opterons, was succeeded by a computer using Xeons. Many supercomputers use chips from other companies such as IBM.


Jaguar (AMD based) was upgraded to Titan (also AMD based).

#1 is Titan with an Rmax score of 17.590 petaflops. You have to go to #5 in the ranking to find an Intel-based supercomputer (Xeon), and it scores only 2.897. Titan is about 6x faster.

http://en.wikipedia.org/wiki/TOP500#Top_10_ranking
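For what it's worth, the 6x figure checks out from the Rmax scores cited above (a quick sketch; the numbers are the ones quoted in this post, in petaflops):

```python
# Sanity check on the TOP500 comparison quoted above.
# Rmax scores are in petaflops, as cited in the post.
titan_rmax = 17.590   # #1: Titan (AMD Opteron based)
xeon_rmax = 2.897     # #5: first Intel Xeon based system in the top 10

ratio = titan_rmax / xeon_rmax
print(f"Titan is about {ratio:.1f}x faster")
```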

whyso said:

Supercomputers do not overclock the cpu's so the argument that one is more reliable than the other because it can go to extreme frequencies is moot.


Who gave you that argument?

whyso said:

And yes, 8 GHz under extreme conditions and extremely high voltages is worthless to the average person. Given two CPUs with equal IPC, I'd rather have one that could hit a max of 6 GHz under liquid helium but consistently hit 4.5 GHz under air than one that could hit 8 GHz under liquid helium but only 4 GHz under air (these are imaginary CPUs).


And what about real processors? What about the AMD FX being selected as the best CPU for overclocking? What about AMD owners reaching 5.0-5.2 GHz with ease, while Intel owners report difficulties at 4.8? What about low voltages? What about the well-known thermal issues with Ivy Bridge?

whyso said:

FX is a good product at a great price, but in the end people are going to care about what THEY can get out of the chip, not what the chip is supposedly capable of under extreme conditions.


Do you still not understand that people who buy FX chips use them for both work and gaming? Did you see the benchmarks I gave you? What do you believe the NAS, C-ray, BZIP, 3DMAX, database search, JTR... benchmarks are for?
April 8, 2013 3:27:33 PM

juanrga said:
Shrish said:
if u want to overclock go for i5-3570k else go with i5-3470. intel cpu's are reliable


AMD chips are so reliable that they are used in servers and in top supercomputers such as Titan and Jaguar. AMD chips also have better overclocking capabilities: the world record for overclocking was set by an AMD FX-8350, with all eight cores above 8 GHz. No Intel chip can achieve that level of overclocking due to a low-quality design.

Please don't tell me that AMD chips are reliable because they overclock well. What matters most for a supercomputer is performance and reliability at the clock speed it will be run at.



juanrga said:
[quote trimmed; the full exchange is reproduced above]


Anyone running a professional application is going to be leery of overclocking (not semi-professional, but real professional). Gamers will overclock, but people running NAS or database workloads will more often than not run the chip at stock (and if they are running the chip at full load, then power consumption comes into play). At stock we are using 75 watts more than the i7 and getting similar performance (within 5%). If that runs 24/7, then assuming electricity costs 11 cents a kilowatt-hour, 0.075 * 24 * 365 * 0.11 = $72 a year in additional electricity from the CPU alone (ignoring further losses from the power supply, since efficiency is a percentage of output, and the additional cost of cooling the room). Now look at the overclocked power usage (generally about 10% faster than the i7) and suppose power is twice the price (as it is in many countries). I use this argument to illustrate why anyone running professional applications 24/7 probably wouldn't overclock, and cares about cost over the lifetime of the machine. For the casual professional (not someone with a rendering project running 24/7) the 8350 is a good deal, but if power consumption comes into play then things change.
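The $72 figure above can be reproduced directly; the 75 W delta and the $0.11/kWh rate are this post's own assumptions:

```python
# Yearly electricity cost of a constant extra power draw.
def annual_cost(extra_watts, price_per_kwh):
    """Dollars per year for `extra_watts` drawn 24/7."""
    kwh_per_year = extra_watts / 1000 * 24 * 365
    return kwh_per_year * price_per_kwh

print(f"${annual_cost(75, 0.11):.2f} per year")   # the ~$72/year case above
print(f"${annual_cost(75, 0.22):.2f} per year")   # where power costs twice as much
```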





"Do you not still understand that people who buy FX chips use them for both work and gaming. Did you see the benchmarks I gave to you? What do you believe NAS, C-ray, NAS, BZIP, 3DMAX, database search, JTR... benchmarks are for?"

They are going to care about extractable performance.


Intel is a penny-pincher who likes to nickel-and-dime their buyers with motherboard changes every couple of years.

I have said multiple times that FX overclocks well. To put it in perspective, ask what clocks the average air-cooling enthusiast is going to get; that is what matters most. No one is getting 5.2 GHz with ease; the 8350 tops out at about 4.8 with an air cooler.

They can probably get about 4.7-4.8 GHz on a good chip under air (from a stock 4.0 GHz, turbo 4.1); that is an overclock of 17% (4.8/4.1). The 3770K can get around 4.4-4.5 GHz on a good chip (from a 3.8 GHz turbo, 3.5 base) on four cores. That is an overclock of 16-18%, roughly the same. The FX can hit higher clocks, but the relative gain both chips get from overclocking is roughly the same (FX does not scale as well with frequency because the cache runs at a constant speed).
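Put as arithmetic (the clock figures are the ones given above):

```python
# Overclock headroom relative to the stock turbo clock, in percent.
def oc_gain_pct(oc_ghz, turbo_ghz):
    return (oc_ghz / turbo_ghz - 1) * 100

fx = oc_gain_pct(4.8, 4.1)        # FX-8350: 4.8 GHz from a 4.1 GHz turbo
i7_low = oc_gain_pct(4.4, 3.8)    # i7-3770K: 4.4 GHz from a 3.8 GHz turbo
i7_high = oc_gain_pct(4.5, 3.8)
print(f"FX-8350 ~{fx:.0f}%, i7-3770K ~{i7_low:.0f}-{i7_high:.0f}%")
```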


They got their i7 chip to 4.9 GHz (obviously an outlier).
http://techreport.com/review/22833/ivy-bridge-on-air-th...

Tom's Hardware review (bolding is mine):

Quote:
Using a 1.375 V CPU voltage and a 1.175 V northbridge voltage, I was able to get FX-8350 running stably at 4.8 GHz under full load. In the screen capture above, I'm running a single-threaded test to spin the chip up, but the highlighted maximum temperature is where our benchmark suite peaked.


Quote:
The FX-8350 wanted to go even faster, but the key here is a voltage setting low enough that you avoid hitting 70 degrees Celsius. At that point, the thermal monitor starts cycling cores to throttle down (evidenced in the image above), keeping the chip from getting any hotter and negatively impacting performance. So long as I didn’t trigger any threaded workloads, I was even able to run benchmarks as high as 5.125 GHz (requiring a 1.4375 V CPU voltage and 1.2 V northbridge setting).


Their system builder (not all chips are created equal --FX requires good cooling).

Quote:
This is the first time anyone at Tom's Hardware has tried his hand at overclocking a retail FX-8350. And, after reading Chris' experience taking his sample from AMD up above 5 GHz, I was looking forward to something similar. It turns out that I was being far too ambitious, though. Xigmatek's Loki doesn't have the headroom to keep the 125 W processor cool beyond its stock clock rates. Beyond performance, thermals are probably AMD's biggest disadvantage in this comparison. We really would need to spend a lot more on cooling to achieve any sort of meaningful overclock.

Regardless of the processor or northbridge voltages we used, we couldn't exceed 4.63 GHz. "Fair enough," I first thought. "If I disable Turbo Core and lock the chip in at 4.6 GHz, I should still see a reasonable speed-up." But a Prime95-induced load quickly demonstrated instability as the FX-8350 shot up over 80 degrees.

It seems as though I had underestimated the FX's ability to generate copious heat, and failed to budget enough for cooling. Even at the stock 1.35 V setting, and with the clock rate dialed in to the processor's peak Turbo Core frequency of 4.3 GHz, Prime95 caused the chip to falter. Simply nudging clock rate, without touching the voltage, results in a significant temperature increase. For example, operating at 4 GHz yields a maximum 60-degree reading, but 4.2 GHz sees that number jump to 70 degrees. Interestingly, I didn't see any throttling, as Chris did when his sample crested 70 degrees. Here's the thing, though: while his Tj. Max was reported as 70 degrees, the retail processors are capped at 90, though the chip is clearly unstable well before it gets that hot.

The best I could achieve with this build's heat sink was 4.33 GHz, forced by dropping the voltage to 1.3375 V, turning off Turbo Core, and increasing the multiplier. Prime95 didn't crash, and the temperature stayed under 75 degrees. We're hesitant to call this a bad sample when the cooler is seemingly barely adequate. Should we choose an FX in the future, we'll need to cut back elsewhere on our budget to leave more room for a higher-end air or closed-loop liquid solution.


Anandtech review

Quote:
AMD's FX architecture was designed for very high clock speeds. With Piledriver we're able to see some of that expressed in overclocking headroom. All of these chips should be good for close to 5GHz depending on your luck of the draw and cooling. For all of these overclocking tests I used AMD's branded closed loop liquid cooler which debuted back with the original FX launch. I didn't have enough time to go through every chip so I picked the FX-8350 and FX-4300 to show the range of overclocks that may be possible. In my case the FX-4300 hit 5GHz with minimal effort, while the FX-8350 topped out at 4.8GHz (I could hit 5GHz but it wasn't stable through all of our tests). Both of these overclocks were achieved with no more than 10% additional core voltage and by simple multiplier adjustments (hooray for unlocked everything). The increase in performance is substantial:


Tech report

Quote:
When you're overclocking a CPU that starts out at 125W, you're gonna need some decent cooling. AMD recommends the big-ass FX water cooler we used to overclock the FX-8150, but being incredibly lazy, I figured the Thermaltake Frio OCK pictured above, which was already mounted on the CPU, ought to suffice. After all, the radiator is just as large as the water cooler's, and the thing is rated to dissipate up to 240W. Also, I swear to you, there is plenty of room—more than an inch of clearance—between the CPU fan and the video card, even though it doesn't look like it in the picture above. Turns out the Frio OCK kept CPU temperatures in the mid 50° C range, even at full tilt, so I think it did its job well enough.

Trouble is, I didn't quite get the results I'd hoped. As usual, I logged my attempts at various settings as I went, and I've reproduced my notes below. I tested stability using a multithreaded Prime95 torture test. Notice that I took a very simple approach, only raising the voltage for the CPU itself, not for the VRMs or anything else. Perhaps that was the reason my attempts went like so:

4.8GHz, 1.475V - reboot
4.7GHz, 1.4875V - lock
4.6GHz, 1.525V - errors on multiple threads
4.6GHz, 1.5375V - errors with temps ~55C
4.6GHZ, 1.5375V, Turbo fan - stable with temps ~53.5C, eventually locked
4.6GHZ, 1.5375V, manual fan, 100% duty cycle at 50C - lock
4.6GHZ, 1.55V, manual fan, 100% duty cycle at 50C - crashes, temps ~54.6C
4.4GHz, 1.55V - ok
4.5GHz, 1.55V - ok, ~57C, 305W
4.5GHz, 1.475V - errors
4.5GHz, 1.525V - errors
4.5GHz, 1.5375V - OK, ~56C
At the end of the process, I could only squeeze an additional 500MHz out of the FX-8350 at 1.5375V, one notch down from the max voltage exposed in the Overdrive utility. AMD told reviewers to expect something closer to 5GHz, so apparently either I've failed or this particular chip just isn't very cooperative.

I disabled Turbo Core for my initial overclocking attempts, but once I'd established a solid base clock, I was able to grab a little more speed by creating a Turbo Core profile that ranged up to 4.8GHz at 1.55V. Here's how a pair of our benchmarks ran on the overclocked FX-8350.


From openbenchmarking

Quote:
In the end, we were able to take the FX-8350 up to a stable 4.7GHz. Unfortunately, due to time constraints and an incompatibility with AMD OverDrive and our test-bed’s motherboard, we don’t have accurate temperature data to share at this point. But considering how easy it was to take our CPU to 4.7GHz, we suspect that higher clocks will easily be possible with more exotic cooling and more aggressive voltage tweaking.


Under air, clocks top out at about 4.8 GHz on a good 8350 and 4.4-4.5 GHz on a good 3770K. The percentage overclocks are similar.

Intel's stupid decision was probably made by the marketing/accounting team to maximize profits. Or possibly the engineers were told to bring costs down to $x per CPU, and cutting the heat-transfer material was the cheapest and easiest way to do that (considering that, as a percentage of the market, few people overclock). Anyway, to say the Ivy Bridge "architecture" is poor is incorrect; rather, the Ivy Bridge "implementation" is poor.

Edit:
"You cannot pretend to select a few biased tests {*} showing a 15% difference and say that the "fx8350 lagged behind", whereas say that the i7 was "very, very close" on tests where the FX was 30-70% faster."

What biased tests? On two of the three tests the FX isn't 30-70% faster; it was pretty much margin of error (<5%). On the other test the FX was significantly faster because, for some unusual reason, Hyper-Threading wasn't being used (which is unusual, but a fair victory for the 8350). I'm saying that we must look at all the tests. You are essentially showing me three tests from a sample where the FX is basically tied with or beating the i7-3770K. What about the other tests in that review? Where is the link?
April 8, 2013 7:42:18 PM

whyso said:

[quote trimmed; whyso's full post is reproduced above]


I wonder if any of this bears some resemblance to what was being discussed.

Who told you that people working and running NAS or database searches will be overclocking?

You post another biased review from ALS. Once again he is not comparing like with like. Moreover, I notice how he starts the power graphs' baseline at 50W instead of 0W, giving the false impression that the difference between Ivy and Piledriver is greater than it really is. This trick is well known; google "misleading graphics".

The FX consumes more at full load, but it finishes the work sooner and then falls to lower consumption than the Intel. The point is that the difference in total system power consumption for Vishera was less than 20W. And desktop computers spend 90% of their time at idle or low load, which means that on a yearly basis the cost difference is tiny: from cents to a few dollars.
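To put the 90%-idle argument in numbers (a sketch: the 20 W load delta is from this post, while the 5 W idle delta and the 90/10 duty cycle are illustrative assumptions, not measurements):

```python
# Yearly cost of a power-consumption delta weighted by duty cycle.
def yearly_cost(idle_delta_w, load_delta_w, idle_fraction, price_per_kwh=0.11):
    avg_delta_w = idle_delta_w * idle_fraction + load_delta_w * (1 - idle_fraction)
    return avg_delta_w / 1000 * 24 * 365 * price_per_kwh

# A desktop idle 90% of the time: a few dollars a year.
print(f"${yearly_cost(5, 20, 0.90):.2f}")
# The 24/7 full-load worst case from the earlier post (75 W delta):
print(f"${yearly_cost(5, 75, 0.0):.2f}")
```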

This was already pointed out before, by several posters, but good myths don't die.

I also find the excessively high consumption figures those guys measured interesting. No need to repeat why one expects bizarre values from ALS, but the other review does not give enough info to know what they measured and how.

I can assure you that most overclockers are not using air. I think a list of overclockers and their settings was given before; only a minority used air.

In any case, I am not impressed by the i7 overclocked on air with an EXPENSIVE aftermarket cooler. It is evident that a golden chip was sent to the review site. In the list of overclockers given before, you can see people running FX chips up to 5.1 GHz with the STOCK COOLER.

Who said you that marketing depts. take decisions about profits?

Who said you that "design" = "architecture"?
April 9, 2013 7:17:08 AM

juanrga said:
whyso said:

Anyone running a professional application is going to be leery of overclocking (not semi-professional but real professional). Gamers will overclock, but people working and running NAS or database searches are more often than not going to run the chip at stock (and if they are going to be running the chip at full load, then power consumption is going to come into play). At stock we are using 75 watts more than the i7 and getting similar performance (within 5%). If that is running 24/7, then assuming electricity is 11 cents a kilowatt-hour, that is 0.075 * 24 * 365 * 0.11 = $72 a year in additional electricity consumption from the CPU alone, ignoring additional losses from the power supply (efficiency is a percentage of output) and additional costs to cool the case (AC). Now look at the overclocked power usage (generally going to be about 10% faster than the i7) and suppose power is twice the price (as it is in many countries). I use this argument to illustrate why anyone who is going to run professional applications 24/7 probably wouldn't overclock and would be concerned about cost over the lifetime of the machine. For the casual professional (not someone who has a rendering project on 24/7) the 8350 is a good deal and buy, but if power consumption comes into play then things change.
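The running-cost arithmetic in that quote is easy to reproduce and play with. The 75 W delta and $0.11/kWh rate are the quote's own numbers; the other scenarios are just parameter variations, not claims from any review:

```python
def annual_cost_usd(extra_watts: float, usd_per_kwh: float,
                    hours_per_day: float = 24.0) -> float:
    """Yearly cost, in dollars, of drawing extra_watts for hours_per_day every day."""
    return extra_watts / 1000.0 * hours_per_day * 365.0 * usd_per_kwh

print(annual_cost_usd(75, 0.11))        # ~$72/year, matching the quote's figure
print(annual_cost_usd(75, 0.22))        # electricity at twice the price
print(annual_cost_usd(75, 0.11, 4.0))   # a 4-hours-a-day desktop instead of 24/7
```

The last line is the counterpoint made elsewhere in the thread: for a machine that isn't loaded 24/7, the same wattage delta shrinks to a few dollars a year.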





"Do you not still understand that people who buy FX chips use them for both work and gaming. Did you see the benchmarks I gave to you? What do you believe NAS, C-ray, NAS, BZIP, 3DMAX, database search, JTR... benchmarks are for?"

They are going to care about extractable performance.


Intel is a penny pincher who likes to nickel-and-dime their buyers with motherboard changes every couple of years.

I said multiple times that the FX overclocks well. To put it in perspective, what clocks is the average air-cooling enthusiast going to get? That is what matters most. No one is getting 5.2 GHz with ease. The 8350 tops out at about 4.8 with an air cooler.

They can probably get about 4.7-4.8 on a good chip under air (from a stock 4.0, turbo 4.1); that is an overclock of 17% (4.8/4.1). The 3770k can get around 4.4-4.5 GHz on a good chip (from a 3.8 turbo, 3.5 regular) on four cores. That is an overclock of 16-18%, which is roughly the same. The FX can hit higher absolute clocks, but the relative gain both chips get from overclocking is roughly the same (and the FX does not scale as well with frequency because the cache runs at a constant speed).
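The percentage comparison above can be checked in a couple of lines. The clock figures (4.8 GHz from a 4.1 GHz turbo for the FX, 4.5 GHz from a 3.8 GHz turbo for the i7) come from the post itself:

```python
def headroom_pct(oc_ghz: float, turbo_ghz: float) -> float:
    """Overclock headroom relative to the stock turbo clock, in percent."""
    return (oc_ghz / turbo_ghz - 1.0) * 100.0

print(f"FX-8350:  {headroom_pct(4.8, 4.1):.0f}%")   # ~17%
print(f"i7-3770K: {headroom_pct(4.5, 3.8):.0f}%")   # ~18%
```

Measured against each chip's own turbo clock, the two overclocks are within about a point of each other, which is the post's point: higher absolute clocks on the FX, similar relative headroom.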


They got their i7 chip to 4.9 ghz (obviously an outlier).
http://techreport.com/review/22833/ivy-bridge-on-air-th...

Tom's Hardware review (bolding is mine)

Quote:
Using a 1.375 V CPU voltage and a 1.175 V northbridge voltage, I was able to get FX-8350 running stably at 4.8 GHz under full load. In the screen capture above, I'm running a single-threaded test to spin the chip up, but the highlighted maximum temperature is where our benchmark suite peaked.


Quote:
The FX-8350 wanted to go even faster, but the key here is a voltage setting low enough that you avoid hitting 70 degrees Celsius. At that point, the thermal monitor starts cycling cores to throttle down (evidenced in the image above), keeping the chip from getting any hotter and negatively impacting performance. So long as I didn’t trigger any threaded workloads, I was even able to run benchmarks as high as 5.125 GHz (requiring a 1.4375 V CPU voltage and 1.2 V northbridge setting).


Their system builder (not all chips are created equal --FX requires good cooling).

Quote:
This is the first time anyone at Tom's Hardware has tried his hand at overclocking a retail FX-8350. And, after reading Chris' experience taking his sample from AMD up above 5 GHz, I was looking forward to something similar. It turns out that I was being far too ambitious, though. Xigmatek's Loki doesn't have the headroom to keep the 125 W processor cool beyond its stock clock rates. Beyond performance, thermals are probably AMD's biggest disadvantage in this comparison. We really would need to spend a lot more on cooling to achieve any sort of meaningful overclock.

Regardless of the processor or northbridge voltages we used, we couldn't exceed 4.63 GHz. "Fair enough," I first thought. "If I disable Turbo Core and lock the chip in at 4.6 GHz, I should still see a reasonable speed-up." But a Prime95-induced load quickly demonstrated instability as the FX-8350 shot up over 80 degrees.

It seems as though I had underestimated the FX's ability to generate copious heat, and failed to budget enough for cooling. Even at the stock 1.35 V setting, and with the clock rate dialed in to the processor's peak Turbo Core frequency of 4.3 GHz, Prime95 caused the chip to falter. Simply nudging clock rate, without touching the voltage, results in a significant temperature increase. For example, operating at 4 GHz yields a maximum 60-degree reading, but 4.2 GHz sees that number jump to 70 degrees. Interestingly, I didn't see any throttling, as Chris did when his sample crested 70 degrees. Here's the thing, though: while his Tj. Max was reported as 70 degrees, the retail processors are capped at 90, though the chip is clearly unstable well before it gets that hot.

The best I could achieve with this build's heat sink was 4.33 GHz, forced by dropping the voltage to 1.3375 V, turning off Turbo Core, and increasing the multiplier. Prime95 didn't crash, and the temperature stayed under 75 degrees. We're hesitant to call this a bad sample when the cooler is seemingly barely adequate. Should we choose an FX in the future, we'll need to cut back elsewhere on our budget to leave more room for a higher-end air or closed-loop liquid solution.


Anandtech review

Quote:
AMD's FX architecture was designed for very high clock speeds. With Piledriver we're able to see some of that expressed in overclocking headroom. All of these chips should be good for close to 5GHz depending on your luck of the draw and cooling. For all of these overclocking tests I used AMD's branded closed loop liquid cooler which debuted back with the original FX launch. I didn't have enough time to go through every chip so I picked the FX-8350 and FX-4300 to show the range of overclocks that may be possible. In my case the FX-4300 hit 5GHz with minimal effort, while the FX-8350 topped out at 4.8GHz (I could hit 5GHz but it wasn't stable through all of our tests). Both of these overclocks were achieved with no more than 10% additional core voltage and by simple multiplier adjustments (hooray for unlocked everything). The increase in performance is substantial:


Tech report

Quote:
When you're overclocking a CPU that starts out at 125W, you're gonna need some decent cooling. AMD recommends the big-ass FX water cooler we used to overclock the FX-8150, but being incredibly lazy, I figured the Thermaltake Frio OCK pictured above, which was already mounted on the CPU, ought to suffice. After all, the radiator is just as large as the water cooler's, and the thing is rated to dissipate up to 240W. Also, I swear to you, there is plenty of room—more than an inch of clearance—between the CPU fan and the video card, even though it doesn't look like it in the picture above. Turns out the Frio OCK kept CPU temperatures in the mid 50° C range, even at full tilt, so I think it did its job well enough.

Trouble is, I didn't quite get the results I'd hoped. As usual, I logged my attempts at various settings as I went, and I've reproduced my notes below. I tested stability using a multithreaded Prime95 torture test. Notice that I took a very simple approach, only raising the voltage for the CPU itself, not for the VRMs or anything else. Perhaps that was the reason my attempts went like so:

4.8GHz, 1.475V - reboot
4.7GHz, 1.4875V - lock
4.6GHz, 1.525V - errors on multiple threads
4.6GHz, 1.5375V - errors with temps ~55C
4.6GHZ, 1.5375V, Turbo fan - stable with temps ~53.5C, eventually locked
4.6GHZ, 1.5375V, manual fan, 100% duty cycle at 50C - lock
4.6GHZ, 1.55V, manual fan, 100% duty cycle at 50C - crashes, temps ~54.6C
4.4GHz, 1.55V - ok
4.5GHz, 1.55V - ok, ~57C, 305W
4.5GHz, 1.475V - errors
4.5GHz, 1.525V - errors
4.5GHz, 1.5375V - OK, ~56C
At the end of the process, I could only squeeze an additional 500MHz out of the FX-8350 at 1.5375V, one notch down from the max voltage exposed in the Overdrive utility. AMD told reviewers to expect something closer to 5GHz, so apparently either I've failed or this particular chip just isn't very cooperative.

I disabled Turbo Core for my initial overclocking attempts, but once I'd established a solid base clock, I was able to grab a little more speed by creating a Turbo Core profile that ranged up to 4.8GHz at 1.55V. Here's how a pair of our benchmarks ran on the overclocked FX-8350.


From openbenchmarking

Quote:
In the end, we were able to take the FX-8350 up to a stable 4.7GHz. Unfortunately, due to time constraints and an incompatibility with AMD OverDrive and our test-bed’s motherboard, we don’t have accurate temperature data to share at this point. But considering how easy it was to take our CPU to 4.7GHz, we suspect that higher clocks will easily be possible with more exotic cooling and more aggressive voltage tweaking.


Under air, clocks top out at about 4.8 GHz on a good 8350 and 4.4-4.5 GHz on a good 3770k. The percentage overclocks are similar.

Intel's stupid decision was probably made by the marketing/accounting team to maximize profits. Or possibly the engineers were told to bring the cost down to $x per CPU, and cutting the heat-transfer material was the cheapest and easiest way to do that (considering that, as a percentage of the market, few people overclock). Anyway, to say the Ivy Bridge "architecture" is poor is incorrect; rather, the Ivy Bridge "implementation" is poor.

Edit:
"You cannot pretend to select a few biased tests {*} showing a 15% difference and say that the "fx8350 lagged behind", whereas say that the i7 was "very, very close" on tests where the FX was 30-70% faster."

What biased tests? The FX wasn't 30-70% faster on two of the three tests; those were pretty much within margin of error (<5%). On the other test the FX was significantly faster because, for some unusual reason, hyperthreading wasn't being used (unusual, but a fair victory for the 8350). I'm saying that we must look at all the tests. You are essentially showing me three tests from a sample where the FX is basically tied with or beating the i7-3770k. What about the other tests in that review? Where is the link?


I wonder if any of this bears any resemblance to what was being discussed.

Who told you that people working and running NAS or database searches will be overclocking?

You post another biased review from ALS. Once again he is not comparing like with like. Moreover, I notice how he puts the graph baselines at 50W instead of 0W, giving a false impression that the difference between Ivy and Piledriver is greater than it really is. This trick is well known; google "misleading graphs".

The FX consumes more at full load, but finishes the work sooner and drops back to lower power draw than the Intel. The point is that the difference in total system power consumption for Vishera was less than 20W. And desktop computers spend 90% of their time at idle or at low loads, which means that on a yearly basis the extra cost is tiny: from cents to a few dollars.

This was already pointed out before, and by several posters, but good myths don't die.

I also find the excessively high consumption figures those guys measured interesting. No need to repeat why one expects bizarre values from ALS, but the other review doesn't give enough info about what they measured and how.

I can assure you that most overclockers are not using air. I think a list of overclockers and their settings was given before. Only a minority used air.

In any case I am not impressed with the i7 overclocked on air using an EXPENSIVE aftermarket cooler. It is evident that was a golden chip sent to the review site. In the list of overclockers given before you can see people running the FX up to 5.1GHz with the STOCK COOLER.

Who told you that marketing depts. make decisions about profits?

Who told you that "design" = "architecture"?


I'm not sure who ALS is.

In the x264 Anandtech test the two finished in roughly the same time (pass 1 Intel was ahead, pass 2 AMD was ahead). The 8350 used far more power than the i7. Look at the task energy from Tech Report.

I mean overclockers on average. How many people in this forum overclock on air vs. water? Most of the threads about building a computer are using air; rarely do you see water.

You are not going to get 5.1 GHz on an 8350 on the stock cooler.

I don't know where you are seeing 50 watts as baseline.

Those "excessively" high power consumptions are consistent throughout pretty much all the reviews I can find of the 8350 on the internet.

Marketing (by which I mostly mean the people in charge of deciding how to sell the chip and maximize profit; I guess this is finance as well) makes many, many decisions about how the product is going to be sold to maximize profits.

I don't think this thread is going anywhere so I'm going to stop.
April 9, 2013 10:33:07 AM

whyso said:

Those "excessively" high power consumptions are consistent throughout pretty much all the reviews I can find of the 8350 on the internet.


IDLE delta in Watts (FX-8350 vs i7-3770k)
==========================

Anandtech: 14.7
Tech Report: 22 (Abnormally high)
CPU Boss: 17
Toms: 16
Legit: 11
Xbits: 0
Bit Tech (*): -4

Variation found in reviews: 22W - (-4W) = 26W


LOAD delta in Watts (FX-8350 vs i7-3770k)
==========================

Anandtech: 75.4
Tech Report: 96 (Abnormally high)
CPU Boss: 54
Toms: 88
Legit: 56
Xbits: 87
Bit Tech (*): 47

Variation found in reviews: 96W - 47W = 49W


TYPICAL delta in Watts (FX-8350 vs i7-3770k)
============================

CPU Boss: 45

(*) Note: "the AMD chips were tested in an ATX motherboard, while the Intel LGA1155 chips were tested in a micro-ATX board. This difference can account for up to 20W, as we found in our Energy Efficient Hardware feature."

As shown, depending on the hardware, measurement methodology, and specific task, you can find up to a 50W difference in claimed power consumption figures.
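For what it's worth, a quick script over the delta figures listed above confirms the spread (note that the idle variation, 22W - (-4W), works out to 26W):

```python
# Power-consumption deltas (FX-8350 minus i7-3770K), in watts,
# copied from the per-review lists above.
idle_deltas = {"Anandtech": 14.7, "Tech Report": 22, "CPU Boss": 17,
               "Toms": 16, "Legit": 11, "Xbits": 0, "Bit Tech": -4}
load_deltas = {"Anandtech": 75.4, "Tech Report": 96, "CPU Boss": 54,
               "Toms": 88, "Legit": 56, "Xbits": 87, "Bit Tech": 47}

for name, deltas in (("idle", idle_deltas), ("load", load_deltas)):
    spread = max(deltas.values()) - min(deltas.values())
    print(f"{name}: spread of {spread:.0f} W across {len(deltas)} reviews")
```

A spread that large between reviews testing the same two chips is the whole argument here: motherboard choice and measurement methodology move the numbers almost as much as the chips themselves do.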

Moreover, as ALS writes in his review:

Quote:
Note that idle power consumption can be competitive, but will obviously vary depending on the motherboard used (the Crosshair Formula V is hardly the lowest power AM3+ board available)


In fact they chose one of the highest power boards possible for AMD:



And take a look at the testing methodology of the Tech Report review with the abnormally high figures: not only did they choose the power-hungry Crosshair for the AMD, but also a specific MSI motherboard for the Intel i5/i7, which curiously is the lowest-power board.



Therefore, take this also into consideration when reading their power consumption graphs...
April 9, 2013 3:03:48 PM

pankakes123 said:
juanrga said:
Moreover, at the time of writing this, the FX is clearly faster (even beating the i7) under Linux {*}, and future games will be optimized for AMD because the upcoming next-gen consoles will be using AMD chips.


Yep, I think this is important. The PS4 is running on Piledriver architecture and Radeon graphics; AMD could easily pull ahead in gaming in the future because games will be optimized for their hardware.


Please keep in mind how PS4 games are going to be coded and what the actual engines and textures will be like.
April 9, 2013 3:13:41 PM

I'm just going to say this: both processors have their differences. One is better for video editing (AMD), and one is better for gaming (Intel). One is better in physical build and coding (Intel), and one is almost as good but just doesn't break the bulb (AMD). One has lower power consumption (Intel) and one has higher (AMD).

I'm not bad-mouthing either one. I am going by my research for my overall and final point on these two. The obvious better buy, if they were the same price, is Intel; but Intel's physical build quality, coding, and power consumption are what bring up the price (and also what give it the lead on AMD). I would really only choose AMD if I wasn't a big gamer, had only a couple of games, and did more video (GFX) editing and such.

However, since I'm a huge gamer and I want the best for my build, I'm spending that extra $100 or so to get the i7-3770K, which is what benefits me and my build (don't forget the kick-ass fan that comes with it). You can go for an i5-3570K for about the same price as the AMD but get that better physical build and coding (even though there are a couple of capabilities it doesn't have compared to the 8350 or the 3770K).

It's really about what you go for and want to do. Overall, Intel is better for HARDCORE gaming; AMD is good at it too, but has the advantage in the video and GFX department.

If you would like to go against what I am saying, I wouldn't mind replying. ^_^


EDIT: I forgot to mention: this all assumes you have enough money for things like this. I won't criticize you for getting an AMD because of your budget; that's fine. AMD is good value even if you get one of the not-so-high-end processors. But if you go for AMD with all your life, get everything from them, and think they're ever so amazing, I will eventually start to criticize (no offence).

For example: I may eventually build another small, cheap PC that I can do video editing with, so I could use it to record my gaming and then edit it to my needs. Then I would get an AMD, since it fulfills that purpose.
April 9, 2013 4:00:05 PM

Your assumptions are all false:

1.) Intel build quality is poorer, their chips are not SOI...the reason they cost more is because of onboard graphics that no one uses.

2.) The difference in power consumption over the course of a year is equivalent to turning on an additional 40W light bulb in your home.

3.) The Gaming myth has been debunked already...games like Crysis 3, Planetside 2, Bioshock Infinite, Metro 2033, Tomb Raider, and others are all within margin for error difference between the i5-3570k and the FX 8350, and the FX 8350 even beats the i7-3770k in some games. Skyrim is the only outlier, so don't bother to cite it as an example...

4.) For the extra $130 difference between the FX8350 and i7-3770k I can buy a H100i cooling system, and still come out cheaper than the i7-3770k and it's better stock cooler.

5.) If Intel had superior build quality why does AMD hold EVERY world record for overclocking, where build quality really comes directly into play? They hold them by 1+ GHz by the way (highest record is 8.76 GHz, where intel is 7.18 GHz), not some trivial margin of 100 MHz or something like that. AMD has the world record for highest overclock with all cores active by the way as well (8 cores on the FX8350 @ 8.176 GHz)
April 9, 2013 4:27:49 PM

8350rocks said:
Your assumptions are all false:

1.) Intel build quality is poorer, their chips are not SOI...the reason they cost more is because of onboard graphics that no one uses.

2.) The difference in power consumption over the course of a year is equivalent to turning on an additional 40W light bulb in your home.

3.) The Gaming myth has been debunked already...games like Crysis 3, Planetside 2, Bioshock Infinite, Metro 2033, Tomb Raider, and others are all within margin for error difference between the i5-3570k and the FX 8350, and the FX 8350 even beats the i7-3770k in some games. Skyrim is the only outlier, so don't bother to cite it as an example...

4.) For the extra $130 difference between the FX8350 and i7-3770k I can buy a H100i cooling system, and still come out cheaper than the i7-3770k and it's better stock cooler.

5.) If Intel had superior build quality why does AMD hold EVERY world record for overclocking, where build quality really comes directly into play? They hold them by 1+ GHz by the way (highest record is 8.76 GHz, where intel is 7.18 GHz), not some trivial margin of 100 MHz or something like that. AMD has the world record for highest overclock with all cores active by the way as well (8 cores on the FX8350 @ 8.176 GHz)


1. Onboard graphics can be used to speed up certain tasks (OpenCL acceleration). To say the quality is poorer is wrong. They are using 22nm trigate.

2. Very true. But if you are running a system 24/7 rendering/encoding, it will add up.

3. Yep. With the exception of older games, newer games will use four threads and be pretty much equal across the line. Future games should show no bias.

4. Yep, though overclocked 8350 uses a shitton of power. You will probably need a better PSU.

5. I don't think Intel cares about this, given how few people, and what tiny portion of their market, run their computers at those speeds. Intel made a decision to prioritize power consumption over frequency (rightly so) because more people care about power consumption than about some overclock they will never attempt. This is why Intel is kicking AMD's butt in mobile and AMD has nothing that can touch a quad-core i7 in a laptop.

Please don't ever use CPU Boss. That site is pathetic. Look at their single-thread performance ranking.

http://cpuboss.com/cpus/Desktop-CPUs-best-Single-Thread...

1. 3570S
7. Pentium G2130
11. 3770k

Does that make any sense? The site is horribly coded as well.

Good point about the motherboards.
April 9, 2013 4:48:53 PM

whyso said:


1. Onboard graphics can be used to speed up certain tasks (OpenCL acceleration). To say the quality is poorer is wrong. They are using 22nm trigate.


5. I don't think intel cares about this because how many people and what portion of their market run their computers at that speed. Intel made a decision to prioritize power consumption over frequency (rightly so) because more people care about power consumption vs some overclock that they don't care about. This is why intel is kicking amd's butt in mobile and amd has nothing that can touch an i7 quad in a laptop.



Onboard graphics cannot be used in conjunction with the CPU when a discrete card is present. AMD has the same capability on their iGPU (with the exception of a Crossfire setup, though Intel offers no such option). Yes, their tri-gate technology is interesting, but the wafer quality they use is only bulk. Additionally, even Intel has recently conceded that their tri-gate process is running out of room and that they will have to convert to SOI soon to shrink their process any smaller. They would have with Haswell if the cost to switch hadn't been so high to begin with.

I don't disagree that Intel doesn't care about that segment, or they clearly would have paid more attention to it. As far as mobile goes, AMD is making headway again in the laptop segment... (though only creeping up in market share by fractions of a percent). I expect them to make a bigger splash there with the introduction of the Kaveri architecture, though.

Also, for smaller devices, their microarchitecture is doing quite well, and I expect big things from Temash in the handheld/tablet market. I think they do as well.