
AMD FX Single Thread Performance Boost?

Tags:
  • CPUs
  • Performance
  • Processors
October 22, 2012 1:03:00 AM

Hey I was wondering if someone wanted to try an experiment for me.
I'm thinking about upgrading my CPU to an FX processor (once the Piledriver ones come out), but I want to know something about them first.
I heard that if you disable one core on each "module" you can increase single-threaded performance significantly (because Windows doesn't know how to schedule tasks across the modules).
I also heard that because the heat output is reduced by the disabled cores, the processor can be overclocked significantly higher.
It's rumored that you can get a 20-30% increase in gaming performance from doing this.
If any of you guys have an FX processor, do you feel like doing an experiment? This is something that seems really interesting, especially since it could put these CPUs in competition with Intel.
I haven't found any results online that confirm or deny these rumors. Anyone want to try it out?

October 22, 2012 1:46:04 AM

I forget exactly who it was, but I think someone here has already done something similar to that (testing and benchmarking an 8150 with 4 cores disabled).

In any case, it does help, but not exactly because of Windows' inability to correctly utilize the cores. That's part of it, but the biggest reason is that disabling 4 cores effectively turns it into a true quad core, instead of being a quasi-8 core.

That way, the single remaining core per module gets exclusive use of the module's execution resources and cache, without having to share them with the other integer core. Much more efficient that way.

That's also why the module design is flawed. Two integer cores in each module sharing the resources that should be dedicated to a single core is just very inefficient.
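If anyone wants to try something like this without touching the BIOS, pinning a process to one logical CPU per module approximates the same setup. A minimal Linux sketch; the helper name is made up, and it assumes the usual FX layout where logical CPUs 2k and 2k+1 share a module:

```python
import os

def one_core_per_module(n_logical=8):
    """Pick the first logical CPU of each two-core module
    (assumes CPUs 2k and 2k+1 share a module, as on FX-8xxx chips)."""
    return {cpu for cpu in range(n_logical) if cpu % 2 == 0}

mask = one_core_per_module(8)  # {0, 2, 4, 6}

# Pin the current process to one core per module (Linux only; on
# Windows something like psutil's cpu_affinity would be the equivalent).
if hasattr(os, "sched_setaffinity"):
    usable = mask & os.sched_getaffinity(0)
    if usable:
        os.sched_setaffinity(0, usable)
```

Unlike disabling cores in the BIOS, this only affects the pinned process, so it's an easy way to benchmark the "true quad core" behavior.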
October 22, 2012 1:47:50 AM

Then it leads me to think: "Why the hell would you buy an 8 core CPU and turn off 4 cores."

What a waste.....
October 22, 2012 4:51:40 AM

Sure, you can, and it will help a little, but it seems pointless. I don't see the point of buying an "8 core" CPU just to disable 4 cores; it's a waste.
October 22, 2012 6:45:28 AM

kajunchicken said:
Hey I was wondering if someone wanted to try an experiment for me.
I'm thinking about upgrading my CPU to an FX processor (once the Piledriver ones come out), but I want to know something about them first.
I heard that if you disable one core on each "module" you can increase single-threaded performance significantly (because Windows doesn't know how to schedule tasks across the modules).
I also heard that because the heat output is reduced by the disabled cores, the processor can be overclocked significantly higher.
It's rumored that you can get a 20-30% increase in gaming performance from doing this.
If any of you guys have an FX processor, do you feel like doing an experiment? This is something that seems really interesting, especially since it could put these CPUs in competition with Intel.
I haven't found any results online that confirm or deny these rumors. Anyone want to try it out?


It doesn't work; the compilers don't allow a single core to function as a standalone core, so resources are still somewhat bottlenecked. The only thing it helps with is overall power usage, but nothing stellar.

Wait for Piledriver/Vishera, which is more efficient at the same core counts: roughly a 7% IPC increase and around 15% better overall performance, which is good all things considered.
October 22, 2012 6:55:32 AM

amuffin said:
Then it leads me to think: "Why the hell would you buy an 8 core CPU and turn off 4 cores."

What a waste.....



I just think it's more the fact that not many programs are optimized for multiple cores. Realistically, AMD's module system IMO looks better on paper than Intel's Hyper-Threading; it's just the execution that didn't come out right. The more cores a program uses, the more likely an FX chip will do better. Sadly for gamers, most games are single- or dual-core optimized, because budget/low-end gamers make up a large majority of the market.
October 22, 2012 7:04:20 AM

Well, most games today are console ports... :(
October 22, 2012 7:35:12 AM

Too bad the whole industry is going downhill right now.....:( 
October 22, 2012 2:06:17 PM

Yea, the reason to disable the cores is because Windows 7 SUCKS at prioritizing the threads. For example, if you have two threads running, both of them will run on the same module instead of sending one thread to one module and sending another thread to another module. It doesn't make sense but Windows 8 supposedly fixes this problem somewhat. I don't understand what you guys are saying about buying an 8 core. The ONLY time you would see the performance benefit of 4 cores vs. 8 is when you are running synthetic benchmarks. Otherwise, there is no reason to have 8 cores seeing as Battlefield 3, the best multithreaded game, can only utilize 6. It makes more sense to suck as much performance as you can from each core rather than weakening them and having more cores.
October 22, 2012 2:08:50 PM

Tried this; it gives like 1-2% better single-threaded performance, and it's not worth it when you can OC some to push it more.
October 22, 2012 4:00:31 PM

kajunchicken said:
Yea, the reason to disable the cores is because Windows 7 SUCKS at prioritizing the threads. For example, if you have two threads running, both of them will run on the same module instead of sending one thread to one module and sending another thread to another module. It doesn't make sense but Windows 8 supposedly fixes this problem somewhat. I don't understand what you guys are saying about buying an 8 core. The ONLY time you would see the performance benefit of 4 cores vs. 8 is when you are running synthetic benchmarks. Otherwise, there is no reason to have 8 cores seeing as Battlefield 3, the best multithreaded game, can only utilize 6. It makes more sense to suck as much performance as you can from each core rather than weakening them and having more cores.


Lol, gotta love when people blame the software for AMD's trashy performance. Bulldozer's crappy performance has very little to do with Windows; the fact is the module microarchitecture sucks and is slow and inefficient compared to Intel. The fact is that the IPC of Bulldozer is horrible. Also, everything you said about Windows 8 fixing the problem is a load of crap. Neither the Windows 7 hotfixes nor Windows 8 is going to fix Bulldozer's crappy performance, because it has nothing to do with the software; it's a hardware problem. The only way Bulldozer and its crappy module design are going to get better is when AMD actually fixes the IPC; until then it's going to continue to be way behind Intel.
October 22, 2012 4:02:45 PM

gamerkila57 said:
Tried this; it gives like 1-2% better single-threaded performance, and it's not worth it when you can OC some to push it more.


Again, hotfixes and software aren't going to make up for the hardware's crappy design. That's why it's only a 1-2% difference when you use the hotfixes.
October 22, 2012 4:30:32 PM

Windows 8 is supposed to have a better scheduler that will improve its usage of Bulldozer, I read somewhere.

Well, I don't think the design is flawed or crappy. They obviously thought it through when they made it.

But in terms of how programs and OSes work today, you could say it's a very inefficient way of doing it.

I've heard Linux treats Bulldozer much better than Windows, but I haven't seen anyone test that either.
October 22, 2012 4:42:31 PM

NoUserBar said:
Windows 8 is supposed to have a better scheduler that will improve its usage of Bulldozer, I read somewhere.

Well, I don't think the design is flawed or crappy. They obviously thought it through when they made it.

But in terms of how programs and OSes work today, you could say it's a very inefficient way of doing it.

I've heard Linux treats Bulldozer much better than Windows, but I haven't seen anyone test that either.


Keep dreaming. While it may help a little, it won't be a drastic improvement; it will probably only be a very small percent, just like the "hotfixes." Again, it's a hardware problem, not a software problem, and until AMD fixes the IPC, the Bulldozer module architecture will continue to be far behind Intel. BTW, what good is the software when the hardware sucks and can't use it to its fullest?
October 23, 2012 12:40:52 AM

rds1220 said:
Keep dreaming. While it may help a little, it won't be a drastic improvement; it will probably only be a very small percent, just like the "hotfixes." Again, it's a hardware problem, not a software problem, and until AMD fixes the IPC, the Bulldozer module architecture will continue to be far behind Intel. BTW, what good is the software when the hardware sucks and can't use it to its fullest?

I really enjoy when AMD haters ignorantly claim things that they obviously know very little about. There have been benchmarks showing that Windows 8 improves ALL multicore processors, even Intel's, by 5% to 10%. While this is not major, it is still an improvement. Please don't continue to go on forums just to blatantly make false claims and start an argument. It is people like you who ruin a perfectly good conversation and turn it into a preschoolers' playground fight. Please learn all of the facts from now on.
October 23, 2012 2:34:31 AM

kajunchicken said:
I really enjoy when AMD haters ignorantly claim things that they obviously know very little about. There have been benchmarks showing that Windows 8 improves ALL multicore processors, even Intel's, by 5% to 10%. While this is not major, it is still an improvement. Please don't continue to go on forums just to blatantly make false claims and start an argument. It is people like you who ruin a perfectly good conversation and turn it into a preschoolers' playground fight. Please learn all of the facts from now on.


Lol, you have no clue. The Windows 7 hotfixes did nothing to improve performance, which is the reason they were withdrawn only a short time after being released. I never said that Windows 8 wouldn't do anything; I said it would improve things a little, but Windows 8 isn't going to make Bulldozer or Piledriver suddenly a great CPU, and it isn't going to magically make AMD CPUs competitive with Intel CPUs. The only way AMD will become competitive is when they fix the crappy IPC that cripples Bulldozer's performance.

BTW, AMD hater? Try realist. I call it like I see it, and the fact is AMD CPUs are crap right now.
October 23, 2012 2:40:06 AM

rds1220 said:
Lol, you have no clue. The Windows 7 hotfixes did nothing to improve performance, which is the reason they were withdrawn only a short time after being released. I never said that it wouldn't do anything; I said it would improve things a little, but Windows 8 isn't going to make Bulldozer or Piledriver suddenly a great CPU, and it isn't going to magically make AMD CPUs competitive with Intel CPUs. The only way AMD will become competitive is when they fix the crappy IPC that cripples Bulldozer's performance.

BTW, AMD hater? Try realist. I call it like I see it, and the fact is AMD CPUs are crap right now.



Your information is correct; however, it's not like Bulldozer is so "crippled" that the average end user will notice any real-world slowdowns, lol. I get it, though.
October 23, 2012 2:55:08 AM

rage33 said:
Your information is correct; however, it's not like Bulldozer is so "crippled" that the average end user will notice any real-world slowdowns, lol. I get it, though.


No it's not, but people aren't stupid either. I know a number of people who bought Bulldozer CPUs only to get rid of them because of the lack of performance, not to mention that at certain resolutions Bulldozer can and will bottleneck a high-end GPU. Someone buying a Bulldozer for Word and Excel wouldn't notice a difference, but in gaming you can, and anyway, how many people are buying an FX-8100 for Office and checking email?
October 23, 2012 4:00:41 AM

rds1220 said:
No it's not, but people aren't stupid either. I know a number of people who bought Bulldozer CPUs only to get rid of them because of the lack of performance, not to mention that at certain resolutions Bulldozer can and will bottleneck a high-end GPU. Someone buying a Bulldozer for Word and Excel wouldn't notice a difference, but in gaming you can, and anyway, how many people are buying an FX-8100 for Office and checking email?



Haha, fair point. How many gamers are really gaming above 1920x1080? I haven't heard of any resolution issues before; if that's correct, that is very unfortunate.
October 23, 2012 6:17:32 AM

No need to talk of Zambezi with Vishera out now.
October 23, 2012 10:42:56 AM

Do your friends happen to be running triple or quad GPUs? Because if they aren't, there is no way that AMD's processors are crippling them. And if you are judging them by this standard, the i3 is crap and the i5 is crap, because they ALL would bottleneck a GPU-heavy configuration. You don't seem to understand that AMD isn't trying to compete with the i7, because they know they can't. They are trying to compete with low-end Intel processors, which they do easily. As a budget processor AMD is king; this is not opinion, it is fact. So obviously, if your friends were planning on gaming with such a powerful system, why would they use a budget processor? It is unfortunate that AMD can't keep up with Intel, though, because Intel's Ivy Bridge is an 8-core chip that they disabled two of the cores on. They wanted AMD to catch up so Intel could pull a rabbit out of its hat with an 8-core i7. Looks like that will never happen.
October 23, 2012 1:49:03 PM

Once again you are wrong. It is a fact, proven in benchmark after benchmark, that at high resolutions Bulldozer will bottleneck a high-end GPU. On the second part you are wrong again. AMD is no longer great for a budget build, and no, they are not king if you are on a budget. You can get an i3 or a Pentium G and it will run circles around AMD's CPUs and APUs. They'll even stomp the 8-core Bulldozer in pretty much all games. There's no point in getting a Bulldozer when you can get a Pentium G or an i3 for cheaper that outperforms it. Sorry, but again you are wrong.
October 23, 2012 2:12:39 PM

I legitimately don't understand your argument. You're saying that a Pentium or an i3 won't bottleneck a high-end GPU but an FX processor would. I beg you to find ANY reputable evidence that shows a single GPU being bottlenecked by an FX processor. You also don't take into account the FX processors' friendliness to power users. When's the last time you overclocked a Pentium? Also, there are PLENTY of benchmarks that show Bulldozer beating out similarly priced CPUs. For example: http://www.tomshardware.com/reviews/fx-4170-core-i3-322...
Please stop taking the side of Intel without backing it up with facts. I understand that Intel EASILY beats AMD at higher price points, but on the budget side of the world, AMD wins. It'll be OK, rich Intel fanboy; AMD isn't taking over the world.
October 23, 2012 2:31:22 PM

kajunchicken said:
I legitimately don't understand your argument. You're saying that a Pentium or an i3 won't bottleneck a high-end GPU but an FX processor would. I beg you to find ANY reputable evidence that shows a single GPU being bottlenecked by an FX processor. You also don't take into account the FX processors' friendliness to power users. When's the last time you overclocked a Pentium? Also, there are PLENTY of benchmarks that show Bulldozer beating out similarly priced CPUs. For example: http://www.tomshardware.com/reviews/fx-4170-core-i3-322...
Please stop taking the side of Intel without backing it up with facts. I understand that Intel EASILY beats AMD at higher price points, but on the budget side of the world, AMD wins. It'll be OK, rich Intel fanboy; AMD isn't taking over the world.


Yes, Bulldozer will bottleneck a high-end video card; I really don't see how that is so hard to understand. And no, an i3 won't bottleneck something like a GTX 670. The main reason is Bulldozer's crappy, slow IPC. Intel CPUs can do more work, faster and more efficiently, than AMD's Bulldozer or APUs. On your second part you are right: you have to overclock Bulldozer to nearly 5 GHz to get it anywhere close to competing with Sandy Bridge, let alone Ivy Bridge. No, you can't overclock an i3, but at stock, clock for clock, the i3 stomps Bulldozer, even the great 8xxx Bulldozers. No one said Bulldozer can't beat Intel, but it's few and far between, and when it does, it's barely. It's easy to cherry-pick a few benchmarks; how about you look at the vast majority out there that show Bulldozer for what it is: absolute crap.

LOL, all I can do is laugh at the last part. Must be because you're throwing out the Intel fanboy insults.
October 23, 2012 2:38:12 PM

kajunchicken said:
Yea, the reason to disable the cores is because Windows 7 SUCKS at prioritizing the threads. For example, if you have two threads running, both of them will run on the same module instead of sending one thread to one module and sending another thread to another module.


Windows loads cores sequentially unless the HTT CPUID bit is set, indicating an HTT core. On Intel's implementation, the odd-numbered cores (1, 3, 5, 7) are the HTT cores. If the HTT bit is set, Windows will load those cores last.

As AMD chose not to reuse the already existing HTT bit, Windows has no clue that the second BD core carries a ~20% performance penalty. Simply setting that CPUID field would solve ALL of BD's scheduling problems.
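To make the fill-order difference concrete, here's a toy model of it. This is only a simplification of the idea, not Microsoft's actual scheduler, and the core numbering follows the Intel layout described above:

```python
def schedule(n_threads, n_logical=8, htt_flag=True):
    """Return the logical CPUs the first n_threads land on.
    With htt_flag set, odd-numbered (sibling) cores are filled last;
    without it, cores are filled strictly in order."""
    if htt_flag:
        order = list(range(0, n_logical, 2)) + list(range(1, n_logical, 2))
    else:
        order = list(range(n_logical))
    return order[:n_threads]

# Intel-style (HTT bit set): two threads land on separate physical cores.
print(schedule(2, htt_flag=True))   # [0, 2]
# Bulldozer (bit unset): both threads pile onto the same module.
print(schedule(2, htt_flag=False))  # [0, 1]
```

Under this model, two threads on an FX chip share one module's resources, while on an HTT-aware layout they each get a module to themselves.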
October 23, 2012 2:44:03 PM

gamerk316 said:
Windows loads cores sequentially unless the HTT CPUID bit is set, indicating an HTT core. On Intel's implementation, the odd-numbered cores (1, 3, 5, 7) are the HTT cores. If the HTT bit is set, Windows will load those cores last.

As AMD chose not to reuse the already existing HTT bit, Windows has no clue that the second BD core carries a ~20% performance penalty. Simply setting that CPUID field would solve ALL of BD's scheduling problems.


Yes, exactly, which is why disabling four cores helps a little bit, not much, but a little. You pretty much turn it into a "real" quad-core CPU.
October 23, 2012 2:46:13 PM

kajunchicken said:
I beg you to find ANY reputable evidence that shows that a single GPU could be bottle necked by an FX processor.


Done:

[benchmark chart] CPU bottleneck.

It's not until you hit 2560x1600 that you run into a GPU bottleneck.

[benchmark chart] CPU bottleneck.

The situation gets worse when you look at frame latency rather than FPS:

http://techreport.com/review/23246/inside-the-second-ga...

So please, stop saying BD isn't bottlenecking GPUs. It is.

Quote:
You also don't take into account FX processor's power user friendliness.

[meme image]
October 23, 2012 2:47:21 PM

These threads always end up in a "my shiz is better than yours" retort and memes... I think they need to remove these threads before they get out of hand.
October 23, 2012 2:54:19 PM

Yes, thank you. The ones I had saved on Photobucket were deleted, so they weren't hosted anymore.
October 23, 2012 6:50:37 PM

I don't understand what you're saying... At frame rates that high, the CPU is bottlenecking the GPU in ALL of those situations. It may bottleneck less on Intel processors, but it's still a bottleneck. I also notice that none of those Intel processors are in the same price segment as the AMD ones. Since when did you need more than 30 FPS? Does your eye have a magical ability to detect more frames than mine? At 80 FPS your monitor can't even keep up...

I also appreciate the immature response of a meme. I'm not mad, I'm just arguing that your facts are irrelevant. You'd be right if we lived in a world where all that mattered was synthetic benchmarks, but in the real world, an FX processor wouldn't pose so much of a bottleneck that you couldn't get playable FPS.

I'm still disappointed that this thread turned into a completely irrelevant argument about whether AMD is better than Intel. All I can say is, it will be a sad day when AMD goes out of business and computer buyers have to pay even more for their processors.
October 23, 2012 7:05:53 PM

Quote:
I beg you to find ANY reputable evidence that shows that a single GPU could be bottle necked by an FX processor.


I simply responded. And the frame-latency point is perfectly valid, regardless of FPS, since it affects how many frames are actually drawn to the screen (versus created). Any value over 16.67 ms indicates dropped frames, regardless of how many frames are being created in total.

And yes, any time you increase the power of the processor and FPS increases, you had a CPU bottleneck. Though with how GPUs work now, it's getting harder to distinguish (e.g., even though there's a clear CPU bottleneck, a more powerful GPU would still increase FPS due to IPC improvements).
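The 16.67 ms cutoff above is just 1000 ms divided by a 60 Hz refresh: any single frame that takes longer misses a refresh, even when the average FPS looks healthy. A quick sketch of that analysis; the function and the sample numbers are made up for illustration:

```python
REFRESH_MS = 1000 / 60  # ~16.67 ms budget per frame at 60 Hz

def frame_stats(frame_times_ms):
    """Average FPS plus the count of frames that miss a 60 Hz refresh."""
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    missed = sum(1 for t in frame_times_ms if t > REFRESH_MS)
    return round(avg_fps, 1), missed

# Same ~80 FPS average, very different smoothness:
smooth = [12.5] * 8              # steady frame times
spiky = [5.0] * 6 + [35.0] * 2   # two long frames hidden in the average
print(frame_stats(smooth))  # (80.0, 0)
print(frame_stats(spiky))   # (80.0, 2)
```

That's why frame-latency charts can show stutter that an FPS average completely hides.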
October 23, 2012 7:12:29 PM

kajunchicken said:
At frames that high, the CPU is bottlenecking the GPU in ALL of the situations.


That doesn't even make any sense.

Since when did high frame rates actually produce and introduce a bottleneck?

At resolutions that high, games are mostly GPU bound anyway, but if you have a slow CPU, it'll still rear its ugly head.

Now, to your point that anything over X FPS doesn't matter: at 61 FPS or more, I agree with you, but that's the argument every AMD fanboy always makes, lol. Whether or not it's true that it doesn't matter, it's just another way for AMD fanboys to feel better.

Having said that last bit, it should be well known by now that I'm 100% neutral, but the arguments are always fun to watch, lol.
October 23, 2012 8:04:03 PM

kajunchicken said:
I don't understand what you're saying... At frame rates that high, the CPU is bottlenecking the GPU in ALL of those situations. It may bottleneck less on Intel processors, but it's still a bottleneck. I also notice that none of those Intel processors are in the same price segment as the AMD ones. Since when did you need more than 30 FPS? Does your eye have a magical ability to detect more frames than mine? At 80 FPS your monitor can't even keep up...

I also appreciate the immature response of a meme. I'm not mad, I'm just arguing that your facts are irrelevant. You'd be right if we lived in a world where all that mattered was synthetic benchmarks, but in the real world, an FX processor wouldn't pose so much of a bottleneck that you couldn't get playable FPS.

I'm still disappointed that this thread turned into a completely irrelevant argument about whether AMD is better than Intel. All I can say is, it will be a sad day when AMD goes out of business and computer buyers have to pay even more for their processors.


Everything you said is complete crap. First off, the minimum playable frame rate is 50 to 60 FPS; 30 FPS would be considered unplayable by most. So the rest of your argument there is pointless and irrelevant; nice try, though, lowering the number to make your point. Next, you aren't going to convince me that you can't tell the difference between 50 FPS and 100+ FPS, especially in first-person shooters. Moving, turning to shoot, and shooting are a lot faster and smoother at higher FPS. I don't know about some people, but I can definitely tell the difference.
And lastly, no one said Bulldozer can't get playable FPS in games; in many cases people will point out that AMD processors do give playable performance. With that said, I'm not going to recommend a lower-performing CPU for gaming because you'd prefer we ignore the lack of performance and GPU bottlenecks. Your argument is basically that we should recommend lower-performing processors of the same price for a specific task because benchmarks don't favor your favorite brand? That's what I call fanboyism.
October 23, 2012 9:19:51 PM

Just so you know, movies play at 24 FPS. Do you know why? Because the human eye perceives this as fluid motion. Therefore, anything above 30 frames per second is unnecessary.

Secondly, I KNOW that Intel's Ivy Bridge is better. There is zero competition between Intel and AMD once you are willing to spend more than $200.

I still don't know what you're saying about FX's bottleneck.

http://media.bestofmicro.com/Z/4/357664/original/battle...

This clearly shows no preference for Intel's Ivy Bridge or Sandy Bridge, or AMD's Bulldozer or Piledriver.

This is a game that is well multithreaded, unlike your wisely chosen Skyrim, which is basically a console port and uses only one core.

At some point, you have to stop bending the facts in your favor and realize that it really doesn't matter for gaming which processor you get. Intel or AMD, you get better single-core performance from Intel and better multicore performance from AMD (until you get to the i7s, where Intel wins hands down).

I'd also like to tell you that no processor can justify a price over about $200, because cheaper ones can easily keep up with any current GPU. I would also like to point out that for October, AMD made its way back into the recommended processors at the $125 price point, tying with the i3. So yes, they are good budget processors, and no, they won't provide the same gaming performance as a higher-end i5, especially in CPU-heavy games.

I also like the first line, "Everything you said is complete crap," especially since you went on to agree with what I was saying. Very good argument skills. I tip my hat to that.
October 23, 2012 9:53:17 PM

kajunchicken said:
I still don't know what you're saying about FX's bottleneck.

http://media.bestofmicro.com/Z/4/357664/original/battle...

This clearly shows no preference for Intel's Ivy Bridge or Sandy Bridge, or AMD's Bulldozer or Piledriver.

This is a game that is well multithreaded, unlike your wisely chosen Skyrim, which is basically a console port and uses only one core.


You complain about someone else cherry picking their bench and then, you go right ahead and do the same, lol.

That's the one gaming bench out of that whole article that shows they're even. AND that's not even the multiplayer bench (which they explained they wouldn't run because it would be inaccurate).

I say that because that bench in particular isn't highly threaded. Yes, that basically makes your point for you, to some extent, but not for the reason that you said. Multithreading had nothing to do with that bench.

And CPUs over $200 are worth it for many people (myself included), just maybe not for gaming alone. If gaming is your only goal and you're on a tight budget, AMD is perfect. Otherwise, if you have the money for Intel, there's not much reason to go with AMD.

AMD isn't "crap", but I'll gladly pay the premium for better performance. Not fanboyism, just a want for the best. I would buy AMD, if they were better, even if it was more expensive than Intel...
October 23, 2012 10:06:10 PM

gamerk316 said:
Quote:
I beg you to find ANY reputable evidence that shows that a single GPU could be bottle necked by an FX processor.


I simply responded. And the frame-latency point is perfectly valid, regardless of FPS, since it affects how many frames are actually drawn to the screen (versus created). Any value over 16.67 ms indicates dropped frames, regardless of how many frames are being created in total.

And yes, any time you increase the power of the processor and FPS increases, you had a CPU bottleneck. Though with how GPUs work now, it's getting harder to distinguish (e.g., even though there's a clear CPU bottleneck, a more powerful GPU would still increase FPS due to IPC improvements).

You do realize that their "latency" is just seconds per frame, right?

16.67 ms/frame = 60 FPS.

All they are looking at is how often it drops below 60 FPS. That could be 59 FPS; omg, the horror, I lost a single frame. Intel owns AMD because 60 FPS >>>>> 59 FPS.

It's nothing new at all, just another way of saying the exact same thing. LATENCY = FPS.

But sure, a game that contains Intel software somehow runs better on Intel hardware. Quite a concept; must mean that AMD sucks because it's AMD, and it has nothing to do with the software code.
October 23, 2012 11:40:13 PM

DJDeCiBeL said:
If gaming is your only goal, and you're on a tight budget, AMD is perfect. Otherwise, if you have the money for Intel, there's not much reason to go with AMD.

AMD isn't "crap", but I'll gladly pay the premium for better performance. Not fanboyism, just a want for the best. I would buy AMD, if they were better, even if it was more expensive than Intel...


Thank you. This is basically what I've been trying to say the whole time. Can everyone finally agree with this statement so we can stop arguing like AMD is the Democrats and Intel is the Republicans?