Closed

Are any bull CPUs worth buying?

May 10, 2012 8:32:56 PM

I've always been partial to AMD and I like the prices of bulldozer CPUs (and associated motherboards), but the performance lags notably behind Intel. It seems in the few years I've been away from computing hardware AMD has lost its way a little. Are there any bulldozer chips that represent particularly good value?

And what would a 6 core bulldozer compare to in the intel architecture? From benchmarks it would seem they compare to (and slightly beat) Intel quad cores.


May 10, 2012 8:41:18 PM

The 6-core Bulldozers actually tend to fall behind Intel's current quad-core offerings, even when the software is heavily threaded. The FX 8120 and 8150 are competitive with the current i5 and i7 quad-core CPUs in certain heavily threaded tasks that can take advantage of the higher core count. For less threaded tasks, the quad-core Intel offerings vastly outperform AMD's 8-core CPUs. For most day-to-day tasks the Bulldozer CPUs fall way behind Intel, as there are very few programs designed to use more than 4 cores; a lot of programs still use only 1 or 2.

I suppose the FX 4100 is probably the best value overall if you really need a quad core on a very limited budget, say only $120 for a CPU. In certain heavily threaded tasks it can outperform the similarly priced Core i3 2100, which is a dual core with Hyper-Threading. The 8-core models may also be worthwhile if you have software that takes particular advantage of the high core count.
May 10, 2012 8:45:42 PM

I've heard from people I know who bought Bulldozer, and they told me the CPU is very disappointing compared to the Intel CPUs.

I'd say go for the Intel 2600K (personally), but the Bulldozer is a lot cheaper than the Intel; if your budget won't allow the Intel, don't forget you'd need to buy a new mobo as well for the Intel.

So go for the AMD if you don't have a lot in your budget.
May 10, 2012 8:49:01 PM

The least "failish" of the bunch would have to be the 8120, if you're willing to do some mild OC before melting the MoBo, haha. Price-wise, it's not such a bad purchase if you already have an AM3+ MoBo and something slower than a Phenom II 965 with no OC.

Cheers!
May 10, 2012 9:00:46 PM

I'm thinking my next build should involve a nice socket 1155 motherboard and the cheapest Celeron processor I can put on it. You can get a 2.6GHz Celeron for 36 euros, so it won't cost much as a disposable stepping stone to a 2500K. It's a shame though; I've always liked AMD.
May 10, 2012 9:34:15 PM

You could get a Pentium G860 or something in that ballpark if you can.

It has some good gaming capabilities and performs just a tad slower than an i3.

Cheers!
May 10, 2012 11:55:52 PM

There's nothing wrong with going AMD. What it comes down to is the user experience. At the end of the day, you're not gonna care if the Intel CPU got 100 fps and your AMD CPU got 80 fps. You'll save money and at the same time won't be able to tell a difference. This is assuming it's for gaming. There may be areas where Intel CPUs have a noticeable advantage, but again, you gotta take into account your own personal user experience. Personally, I'd rather save money and just have an overall quality machine.
May 11, 2012 12:03:11 AM

For gaming: Does it really matter what the framerate is, if the game play is smooth?

For MultiMedia: Does it really matter if it gets lower benchmark scores, if it browses the internet and plays media just as well as more expensive processors?
May 11, 2012 12:27:46 AM

What the framerate is determines how smooth it is. And I disagree with the above poster: you CAN tell the difference between 80 and 100fps.
I don't see any reason to buy Bulldozer when in virtually every task the Intel equivalent is cheaper and more powerful.
May 11, 2012 12:32:47 AM

Terry1212 said:
At the end of the day, you're not gonna care if the Intel CPU got 100 fps and your AMD CPU got 80 fps. You'll save money and at the same time won't be able to tell a difference.

Since Intel has faster chips than AMD at nearly every price point, AMD is a gray area even for cost-cutting unless you want to use their IGP which is twice as fast as Intel's best.
May 11, 2012 9:07:46 PM

welshmousepk said:
What the framerate is determines how smooth it is. And I disagree with the above poster: you CAN tell the difference between 80 and 100fps.
I don't see any reason to buy Bulldozer when in virtually every task the Intel equivalent is cheaper and more powerful.

On a 60Hz monitor, there is no difference between 80 and 100fps... the output framerate is still only 60fps, as that's all the monitor can actually display. The only time the difference between 80 and 100fps actually matters is on a 120Hz monitor. So, the "above poster" is completely accurate.
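A rough way to sanity-check this claim is to simulate which rendered frames a fixed-refresh display ever scans out. This is a minimal sketch, not a model of any real display pipeline; the `displayed_frames` helper and its timing assumptions are illustrative only:

```python
# Sketch: which rendered frames actually appear on a 60 Hz display?
# Simplified model: at each refresh, the display scans out the most
# recently completed frame (tearing and vsync queues not modeled).

def displayed_frames(render_fps, refresh_hz=60, duration_s=1.0):
    """Return the distinct rendered frames that ever reach the screen."""
    frame_time = 1.0 / render_fps
    shown = set()
    for r in range(int(duration_s * refresh_hz)):
        t = r / refresh_hz                 # time of this refresh
        shown.add(int(t / frame_time))     # latest finished frame index
    return shown

print(len(displayed_frames(100)))  # 60: only 60 of 100 frames are shown
print(len(displayed_frames(80)))   # 60: likewise, 60 of 80
```

Whether the card renders 80 or 100 frames in a second, only 60 distinct frames reach a 60Hz screen; the extra frames mainly change which frames get shown, not how many.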
May 11, 2012 9:43:13 PM

Tell that to my eyes/hands, which can very much tell/feel a difference between 90-120FPS and 280-300FPS on CS:S. 60hz monitor. Your eyes may not be able to "see it", but the "feel" is definitely there.
May 11, 2012 9:57:55 PM

Raidur said:
Tell that to my eyes/hands, which can very much tell/feel a difference between 90-120FPS and 280-300FPS on CS:S. 60hz monitor. Your eyes may not be able to "see it", but the "feel" is definitely there.


You can't tell the difference using that metric.

Frames per Second don't tell the whole story here.

Read this up: http://techreport.com/articles.x/21516

Cheers!
May 11, 2012 10:45:57 PM

Yuka said:
You can't tell the difference using that metric.

Frames per Second don't tell the whole story here.

Read this up: http://techreport.com/articles.x/21516

Agreed, I would pick stutter-free 40-60fps over stuttery 40-300fps any day.
May 11, 2012 10:59:03 PM

Raidur said:
Tell that to my eyes/hands, which can very much tell/feel a difference between 90-120FPS and 280-300FPS on CS:S. 60hz monitor. Your eyes may not be able to "see it", but the "feel" is definitely there.

Regardless of the FPS being reported, a 60hz display has a maximum output of 60fps. Period. This means every frame over that 60fps barrier is omitted from display. Due to how monitors work, it's 100% impossible to see/feel any difference between framerates once you exceed 60fps. The reported FPS is simply what the graphics card is rendering. It has no relation at all to what is being displayed. You simply want to believe it "feels" smoother and convince yourself it's true.....even though it's not.
May 11, 2012 11:49:53 PM

Terry1212 said:
There's nothing wrong with going AMD. What it comes down to is the user experience. At the end of the day, you're not gonna care if the Intel CPU got 100 fps and your AMD CPU got 80 fps. You'll save money and at the same time won't be able to tell a difference. This is assuming it's for gaming. There may be areas where Intel CPUs have a noticeable advantage, but again, you gotta take into account your own personal user experience. Personally, I'd rather save money and just have an overall quality machine.


You are absolutely correct. Beyond a certain point, you just won't notice the difference. But that's where aging comes into play. While today's games will run at (for argument's sake) 80 fps on your build, tomorrow's may run at ~30 fps then ~15 fps and so on. Said Intel processor may get 100 fps today, and later on a still-playable ~50 fps. Without upgrading your system, you will be limiting the amount of time that your system is good for recent games. (But the money you save is for upgrades, right?! :D  )
May 12, 2012 12:43:05 AM

sykozis said:
Regardless of the FPS being reported, a 60hz display has a maximum output of 60fps. Period.

Yes and no.

This is assuming the graphics card is able to render the vast majority of frames in under 20ms to maintain a nearly perfect illusion of continuous movement. Ridiculously high FPS at least indicate that the brute force to pull this off is present but as the TechReport article points out, some obscure factors sometimes cause some frames to take much longer than average to complete and cause perceivable stuttering even on high-end GPUs and SLI configurations... and Radeons seem to be the worst offenders there.
May 12, 2012 1:19:38 AM

sykozis said:
Regardless of the FPS being reported, a 60hz display has a maximum output of 60fps. Period. This means every frame over that 60fps barrier is omitted from display. Due to how monitors work, it's 100% impossible to see/feel any difference between framerates once you exceed 60fps. The reported FPS is simply what the graphics card is rendering. It has no relation at all to what is being displayed. You simply want to believe it "feels" smoother and convince yourself it's true.....even though it's not.


Incorrect. Higher framerate = faster rendering, which equals more control.
This is why console games that require precision control (Forza, for instance) render the video at 30fps but run the physics at 240fps.

Yes, it won't look any smoother beyond 60, but it will feel smoother because you have more individual frames in which to make your input and control the outcome.

Or do you think console devs are stupid and just making stuff up?
Why do you think pro gamers also go for the highest FPS possible, even on 60Hz monitors?

I absolutely assure you I can tell a difference between 60 and 120fps on a 60Hz monitor.
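For what it's worth, the console pattern described above (video at ~30fps, physics at a higher fixed rate) is usually implemented as a fixed-timestep accumulator loop. Here is a minimal sketch of that pattern, with illustrative names; no particular engine's API is assumed:

```python
# Sketch of a fixed-timestep loop: the simulation advances in fixed
# 1/240 s steps no matter how fast frames render. Names are
# illustrative; no particular engine's API is assumed.

PHYSICS_DT = 1.0 / 240  # fixed simulation step (240 Hz)

def run(frame_times):
    """frame_times: seconds each rendered frame took.
    Returns how many physics steps ran; this tracks wall-clock time,
    not the number of frames drawn."""
    accumulator = 0.0
    steps = 0
    for dt in frame_times:
        accumulator += dt
        while accumulator >= PHYSICS_DT:
            steps += 1              # advance physics by one fixed step
            accumulator -= PHYSICS_DT
        # a renderer would draw here, interpolating by
        # accumulator / PHYSICS_DT for smooth motion
    return steps

# One second of 30 fps frames still yields roughly 240 physics updates:
print(run([1.0 / 30] * 30))
```

The point of the pattern is that input and physics run on their own fixed clock, decoupled from however fast (or slow) frames happen to render.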
May 12, 2012 1:52:50 AM

welshmousepk said:
I absolutely assure you I can tell a difference between 60 and 120fps on a 60Hz monitor.

Run Fraps and I bet you will find out that the reason you can "tell a difference" is that your setup regularly fails to render some frames in under 16.67ms with vsync on, and what you are actually noticing is the stutter when this happens, not the actual extra FPS itself.
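The frame-time view behind this argument (and the TechReport article linked earlier) can be illustrated with made-up numbers: an average FPS figure can look better on a trace that actually stutters. Both traces below are hypothetical:

```python
# Sketch: why average FPS hides stutter. Two hypothetical one-second
# traces of per-frame render times in milliseconds (data is made up).

smooth  = [16.7] * 60                       # steady ~60 fps
stutter = [10.0] * 57 + [60.0, 60.0, 60.0]  # mostly fast + three hitches

def avg_fps(frame_ms):
    return 1000.0 * len(frame_ms) / sum(frame_ms)

def worst_frame(frame_ms):
    return max(frame_ms)

print(round(avg_fps(smooth)), worst_frame(smooth))    # 60 "fps", 16.7 ms
print(round(avg_fps(stutter)), worst_frame(stutter))  # 80 "fps", 60.0 ms
```

The second trace posts the higher average, but its three 60ms frames are exactly the kind of hitch the frame-time approach measures and a plain FPS counter hides.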
May 12, 2012 2:11:16 AM

Nope, it just feels generally smoother due to the higher framerate.

I'm not just making stuff up, console devs regularly render physics engines at a higher framerate than video. Why would they do this if it couldn't possibly make it smoother?

A good way to test this is to switch your monitor to 24hz mode. If you still get 60fps, it will look choppy but feel MUCH smoother than the same game at 30 frames per second on a 60hz monitor.
May 12, 2012 2:52:08 AM

I would say there are no Bulldozers worth buying especially if you're planning on gaming.
May 12, 2012 6:31:07 AM

welshmousepk said:
Incorrect. Higher framerate = faster rendering, which equals more control.
This is why console games that require precision control (Forza, for instance) render the video at 30fps but run the physics at 240fps.

Or do you think console devs are stupid and just making stuff up?
Why do you think pro gamers also go for the highest FPS possible, even on 60Hz monitors?


Unless OP plans on being a pro gamer, considering a BD purchase is a rational decision.
May 12, 2012 6:45:56 AM

anxiousinfusion said:
Unless OP plans on being a pro gamer, considering a BD purchase is a rational decision.


I still don't understand this logic.

BD = less performance for the same money.

Regardless of whether you'll use the performance, why wouldn't you want it?

Don't get me wrong, I'm an AMD fan who used AMD chips in nearly every build until SB came out. But SB has since totally wiped the floor with everything AMD has released.
When it was Phenom 955 vs i5 750, AMD had the clear advantage on price. But with SB you stand to save maybe 40 dollars for a considerably lower-performing chip.
May 12, 2012 7:58:15 PM

welshmousepk said:
I still don't understand this logic.

BD = less performance for the same money.

Regardless of whether you'll use the performance, why wouldn't you want it?


Exactly what I was thinking. Which is why I don't understand why some people are so hellbent on buying the Bulldozer anyway.
May 12, 2012 8:17:44 PM

rds1220 said:
Exactly what I was thinking. Which is why I don't understand why some people are so hellbent on buying the Bulldozer anyway.

Someone has to contribute to AMD's welfare plan. Since we need AMD to stay afloat to avoid ending up with an x86 CPU monopoly, people should cut AMD fanboys some slack. If they still want to buy AMD despite benchmarks saying Intel currently has the best bang-per-buck at nearly every price point, let them.
May 12, 2012 10:03:44 PM

rds1220 said:
Exactly what I was thinking. Which is why I don't understand why some people are so hellbent on buying the Bulldozer anyway.


Not at all, there are some key areas in computing where BD actually excels. If you work with Linux and Server applications, you'll look at what AMD has to offer with another perspective. That is, for the right price. The 8150 is actually overpriced IMO, but the 8120 is a good purchase for a lot of scenarios.

People going nuts saying "BD is crap" are most probably just thinking about Windows and games. There's a lot more to computing than games and Windows.

Cheers!
May 12, 2012 10:13:47 PM

Yeah, and I wouldn't use it for a server either. If I remember right, after the failure of the first release people said it was designed for servers, blah blah blah, and that it would make up for its bad desktop performance in the server field. Then Bulldozer was tested in the server field too, and it failed there as well, getting beaten by Intel's Xeons.
May 12, 2012 10:24:38 PM

This is just my opinion based on what I've read and seen online, but it seems the users who bought a BD-series CPU were very happy with it (specifically the 8150); people rated it well on Newegg if I'm not mistaken. The critics and online reviewers were not thrilled, and I can understand why: most software applications do not use all the cores, as a bunch of people have already stated.

My advice would be to decide your price point and what you plan to use the computer for. If you are a high-end gamer, I've heard the CPU gets bottlenecked during gaming at intense levels, so I wouldn't suggest it. Personally I would stick to a quad-core CPU right now; if you can get a good price on an Intel chip, go for it, because they are obviously superior. But I have always been an AMD fan and have never been disappointed for what I use my computer for (moderate gaming, web browsing, multimedia purposes).

At this point in time the new AMD Piledriver CPUs are coming out very soon, I believe, so it might be a good idea to wait and see how they fare.
May 12, 2012 10:51:14 PM

rds1220 said:
Yeah, and I wouldn't use it for a server either. If I remember right, after the failure of the first release people said it was designed for servers, blah blah blah, and that it would make up for its bad desktop performance in the server field. Then Bulldozer was tested in the server field too, and it failed there as well, getting beaten by Intel's Xeons.


A lot of water has passed under the bridge by now. On Linux, BD starts to look better now. Well, that is if you can re-compile your program :p 

I wonder if on Windows we'll see more software fully supporting BD now that Piledriver is around the corner (Trinity).

Cheers!
May 13, 2012 3:50:44 AM

welshmousepk said:
Incorrect. Higher framerate = faster rendering, which equals more control.
This is why console games that require precision control (Forza, for instance) render the video at 30fps but run the physics at 240fps.

Yes, it won't look any smoother beyond 60, but it will feel smoother because you have more individual frames in which to make your input and control the outcome.

Or do you think console devs are stupid and just making stuff up?
Why do you think pro gamers also go for the highest FPS possible, even on 60Hz monitors?

I absolutely assure you I can tell a difference between 60 and 120fps on a 60Hz monitor.


Science says you can't.
I call it a form of the placebo effect: you know it's doing 120fps, so you think it's doing better than when it's doing 60fps.
May 13, 2012 4:12:27 AM

heero yuy said:
Science says you can't.
I call it a form of the placebo effect: you know it's doing 120fps, so you think it's doing better than when it's doing 60fps.


What science, exactly?

You seem to be using the 'I'm not listening' defense.

Why would devs purposely aim for higher than 60fps if it weren't possible? Why would almost every game cap the framerate in-engine at 90fps or higher?

Human reaction times are MUCH quicker than 1/60th of a second. Allowing more frames per second = more fractions of a second in which input can be made. THAT is basic science, and I would say it supports me fairly conclusively.

I've had this same argument so many times with people that think that what you see is the limit of human perception, but I assure you our brains can process information much more quickly than the perception of our eyes.
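The responsiveness half of this argument can at least be put into numbers. Under the simplifying assumption that input is sampled once per rendered frame (real pipelines add driver and display latency on top), the worst-case wait before the game can react shrinks with frame rate, regardless of the monitor's refresh:

```python
# Sketch: worst-case delay between an input event and the first frame
# that can reflect it, at different render rates. Simplified model:
# input is sampled once per frame; driver/display latency ignored.

def worst_case_input_delay_ms(render_fps):
    # An input arriving just after a frame starts is only picked up
    # by the next frame, so the worst case is one full frame interval.
    return 1000.0 / render_fps

for fps in (30, 60, 120, 300):
    print(fps, "fps ->", round(worst_case_input_delay_ms(fps), 2), "ms")
# 60 fps -> 16.67 ms; 120 fps -> 8.33 ms; 300 fps -> 3.33 ms
```

This is why a higher simulation/input rate can matter even when the display never shows more than 60 images a second; whether the difference is actually perceptible is exactly what the thread is arguing about.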
May 13, 2012 4:36:46 AM

welshmousepk said:
Why would devs purposely aim for higher than 60fps if it weren't possible?

Devs don't "aim" for more than 60fps, they aim for playable rates on their recommended hardware configuration which is likely less than 60fps. Higher-end hardware just happens to be able to push rates beyond 60fps and few devs are bothering to set arbitrary upper limits.

welshmousepk said:
I've had this same argument so many times with people that think that what you see is the limit of human perception, but I assure you our brains can process information much more quickly than the perception of our eyes.

Running the game logic at 100+ iterations per second to improve game responsiveness is independent of the graphics side of things.

As far as perception is concerned, more than 60fps video is moot when your display device cannot physically display more than 60fps since you cannot perceive images that never make it on the screen.
May 14, 2012 12:30:06 AM

Yuka said:
Not at all, there are some key areas in computing where BD actually excels. If you work with Linux and Server applications, you'll look at what AMD has to offer with another perspective. That is, for the right price. The 8150 is actually overpriced IMO, but the 8120 is a good purchase for a lot of scenarios.

People going nuts saying "BD is crap" are most probably just thinking about Windows and games. There's a lot more to computing than games and Windows.

Cheers!


^ That's absolutely spot on. Bulldozer does acceptably for Windows gaming if you pair it with a good GPU, but it isn't a standout. It does well in highly multithreaded Windows tasks such as video editing and file compression, generally outperforming the Intel quad-cores in its price range. However, no desktop Bulldozer is really in the same league as the six-core Intel desktop chips, either in terms of price or performance.

Now in servers and on Linux, Bulldozer shines. AMD doesn't charge an arm and a leg for multi-CPU-capable Opterons like Intel does for Xeons. The cheapest not-heavily-crippled dual-CPU-capable Xeon is the ~$400 Xeon E5-2620 (6 cores, 2.0/2.5 GHz). For that amount of money, you can get TWO 8-core Opteron 4280s running at 2.8/3.5 GHz and have a little money left over. The price gap gets even bigger with 4-socket machines, as AMD will sell you an 8-core Opteron 6212 running at 2.6/3.2 GHz for $266. The 12-core units start at $370 and you can get 16-core units for about $550. The least expensive 4-way Xeon is the 1.87 GHz six-core Westmere-based Xeon E7-4807 with no Turbo and an $890 price tag. Imagine an i7-980 running at half the clock speed with no Turbo and that's roughly how an E7-4807 would perform. The decent 4-way E7s start at well over a grand and go all the way up to $4400 apiece. Ouch. Also, mere mortals can buy quad-socket Opteron boards, fill them with those relatively inexpensive CPUs, and stuff them into very large cases. Just look at the "distributed computing" forum at [H]; it's full of guys with Tyan S8812s and quad Supermicro G34 setups. They can't do that with Xeon MP boards, as those are super expensive, super rare, and super proprietary. You pretty much have to buy a Xeon MP server to use Xeon MPs, and they are five-figure machines when fully populated with CPUs.

Bulldozer also does much better on Linux than on Windows. Phoronix has a bunch of benchmarks showing that Bulldozer runs very nicely on Linux, due to the Linux kernel being tweaked to schedule threads appropriately for the CPU (one thread per module first, then two threads per module) and GCC 4.6/4.7 supporting a lot of Bulldozer optimizations. Michael isn't very keen to pit AMD vs. Intel CPUs directly against each other in his comparisons (lest he tee off somebody who sends him parts to review), but you can find plenty of people who do on the OpenBenchmarking.org site that uses the Phoronix benchmark software.

rds1220 said:
Yeah, and I wouldn't use it for a server either. If I remember right, after the failure of the first release people said it was designed for servers, blah blah blah, and that it would make up for its bad desktop performance in the server field. Then Bulldozer was tested in the server field too, and it failed there as well, getting beaten by Intel's Xeons.


That mainly came from Anandtech's reviews of the Interlagos Opterons. I would take those reviews with a big grain of salt, as they said over and over again that their tests were far from ideal: they used outdated OSes (vAPUS), they had program scaling issues (most of the DB tests), and they didn't have the time to properly tune for the new Opterons (vAPUS again). They ran a good number of programs that nobody would ever run on a server, such as Cinebench, which they noted was specifically tuned for Intel CPUs (it used the Intel-optimized OpenMP threading library) and didn't even support AMD's NUMA, which has been around for Opterons for a whopping nine years. You'll notice there was a complete lack of HPC benchmarks in either of those tests, which is a market AMD does very well in. You'd really wonder what four 16-core Opteron 6200s could do with something like GROMACS compiled with GCC 4.7...
May 14, 2012 3:47:15 AM

InvalidError said:
Devs don't "aim" for more than 60fps, they aim for playable rates on their recommended hardware configuration which is likely less than 60fps. Higher-end hardware just happens to be able to push rates beyond 60fps and few devs are bothering to set arbitrary upper limits.


Running the game logic at 100+ iterations per second to improve game responsiveness is independent of the graphics side of things.

As far as perception is concerned, more than 60fps video is moot when your display device cannot physically display more than 60fps since you cannot perceive images that never make it on the screen.


Give up. Some people are ignorant enough to believe that all those extra non-displayed frames actually make a difference....
May 14, 2012 1:55:02 PM

^ they do make a difference
they cook your graphics card U.U
May 14, 2012 2:32:47 PM

Ags1 said:
I've always been partial to AMD and I like the prices of bulldozer CPUs (and associated motherboards), but the performance lags notably behind Intel. It seems in the few years I've been away from computing hardware AMD has lost its way a little. Are there any bulldozer chips that represent particularly good value?

And what would a 6 core bulldozer compare to in the intel architecture? From benchmarks it would seem they compare to (and slightly beat) Intel quad cores.



I believe in the saying: you get what you pay for. You already knew that Intel CPUs outperform AMD CPUs, so why stick with AMD? A piece of advice: just save a little more of your budget and buy an Intel CPU. It's worth it.
May 14, 2012 2:41:29 PM

sykozis said:
For gaming: Does it really matter what the framerate is, if the game play is smooth?

For MultiMedia: Does it really matter if it gets lower benchmark scores, if it browses the internet and plays media just as well as more expensive processors?



Frame rate does matter. You don't want your game lagging because of a low frame rate while you're in the middle of a game. And not only that: even if you have a high frame rate, if your monitor has a slow response time you will notice some horizontal lines (tearing) during fast action scenes like rapid fire and explosions in games.
May 14, 2012 2:51:26 PM

MU_Engineer said:
^ That's absolutely spot on. Bulldozer does acceptably for Windows gaming if you pair it with a good GPU, but it isn't a standout. [snip]


+2 to this. It's nice to see a thread without the full Intel circlejerk whenever an AMD processor is mentioned...

Clearly this Zambezi persecution is a construct of Intel fans who just need something to go and spam chat boards with. This is just baiting and trolling of the highest order.


rds1220 said:
Exactly what I was thinking. Which is why I don't understand why some people are so hellbent on buying the Bulldozer anyway.


I was going to answer this, but then I looked at the avatar, the handle, and the specs, and decided it was rather not worth the effort. Why don't you use that fast Sandy of yours to google up something more logical to argue about, instead of resorting to the old rhetoric of the Intel crowd, which is rather stale by now? In actual fact, why are you even bothering, if not for the baiting?

InvalidError said:
Someone has to contribute to AMD's welfare plan. Since we need AMD to stay afloat to avoid ending up with an x86 CPU monopoly, people should cut AMD fanboys some slack. If they still want to buy AMD despite benchmarks saying Intel currently has the best bang-per-buck at nearly every price point, let them.


Is that not exactly what you have done in buying your Intel? Oh, the irony of one fan calling those who buy the competitor's products "fanboys". Now you are not only a troll but a hypocrite at the same time.
May 14, 2012 3:39:35 PM

Ags1 said:
I've always been partial to AMD and I like the prices of bulldozer CPUs (and associated motherboards), but the performance lags notably behind Intel. It seems in the few years I've been away from computing hardware AMD has lost its way a little. Are there any bulldozer chips that represent particularly good value?

And what would a 6 core bulldozer compare to in the intel architecture? From benchmarks it would seem they compare to (and slightly beat) Intel quad cores.


There is a case for the AMD Zambezi processors, but it largely depends on your needs. It is a fact that the Intel Sandy Bridge/Ivy Bridge chips perform better, but a Zambezi FX 8120 will cost as little as $160 and offers comparable performance to the i5 processors at a reduced price, if you are willing to trade off the so-called performance differential, which is rather unnoticeable.


May 14, 2012 3:48:33 PM

sarinaide said:
Is that not exactly what you have done in buying your Intel? Oh, the irony of one fan calling those who buy the competitor's products "fanboys".

I'm not an AMD fanboy, the only AMD CPU I have ever bought is in my laptop, all my other PCs are Intel builds all the way back to the 8088.
May 14, 2012 3:57:40 PM

anxiousinfusion said:
You are absolutely correct. Beyond a certain point, you just won't notice the difference. But that's where aging comes into play. While today's games will run at (for argument's sake) 80 fps on your build, tomorrow's may run at ~30 fps then ~15 fps and so on. Said Intel processor may get 100 fps today, and later on a still-playable ~50 fps. Without upgrading your system, you will be limiting the amount of time that your system is good for recent games. (But the money you save is for upgrades, right?! :D  )


This is assuming that games keep using 4 cores the majority of the time. If, say, the next generation of games standardizes on 6 cores, then the AMD processors "might" perform better. It's a gamble either way.
May 14, 2012 4:04:40 PM

welshmousepk said:
I still don't understand this logic.

BD = less performance for the same money.

Regardless of whether you'll use the performance, why wouldn't you want it?

Don't get me wrong, I'm an AMD fan who used AMD chips in nearly every build until SB came out. But SB has since totally wiped the floor with everything AMD has released.
When it was Phenom 955 vs i5 750, AMD had the clear advantage on price. But with SB you stand to save maybe 40 dollars for a considerably lower-performing chip.


Less performance for the same cost is a bit of a stretch. At my local store, an FX 8120 plus a Sabertooth 990FXA motherboard is $295. A 2500K plus a Sabertooth Z77 is $370, and a 3570K plus a Sabertooth Z77 is $380. The Bulldozer platform is more than 20% less expensive than the Intel offering, so saying that Bulldozer gives the same performance for the same cost is a lie.
a b à CPUs
May 14, 2012 4:11:37 PM

InvalidError said:
I'm not an AMD fanboy, the only AMD CPU I have ever bought is in my laptop, all my other PCs are Intel builds all the way back to the 8088.


Errr... that's the point: trashing AMD users as "fanboys" when an AMD chip can do everything an Intel CPU can, if only a little slower, while buying only Intel products yourself, is rather naive and hypocritical. We buy products to suit our needs; long-standing AMD users will tend to stick with AMD as their platform of choice unless some particular requirement necessitates a change, and in most instances there isn't one, since even in the price/performance bracket Intel doesn't offer a significant gap over similarly priced rival chips.

You have an Intel setup and you are happy with it (I suppose), and we are all happy for you, but making tactless remarks about AMD users being fanboys is not a contribution; it is trolling.
a c 131 à CPUs
May 14, 2012 5:09:05 PM

sarinaide said:
You have a intel setup and you are happy with it(suppose) and we are all happy for you, but to make tactless remarks about AMD users as fanboys is not a contribution, it is trolling.

What is tactless about calling people who buy AMD, even though Intel has better overall performance at most price points, fanboys for buying based on arbitrary preference rather than cold hard facts? I personally don't think "fanboy" has an intrinsically negative connotation, and if you Google it, most of the definitions you find agree: there is nothing inherently pejorative about it.

Intel fanboys need AMD's fanboys to keep AMD alive so Intel can't afford to stagnate too much.
a c 146 à CPUs
May 14, 2012 5:16:44 PM

sarinaide said:
I was going to answer this, then I looked at the avatar handle and the specs and decided it was not worth the effort. Why don't you use that fast Sandy of yours to Google up something more logical to argue about, instead of resorting to the old rhetoric of the Intel crowd, which is rather stale by now. In actual fact, why are you even bothering, if not to bait?



Is that not what you have done when buying your Intel? Oh, the irony of a fan of one brand calling those who buy the competitor's products "fanboys". Now you are not only a troll, but also a hypocrite.


So I'm a fanboy why... because I state the facts and post benchmarks that show how badly Bulldozer performs? You're a joke. Wake up and look at the facts: Bulldozer is a failure.
a b à CPUs
May 14, 2012 5:20:49 PM

rds1220 said:
So I'm a fanboy why... because I state the facts and post benchmarks that show how badly Bulldozer performs? You're a joke. Wake up and look at the facts: Bulldozer is a failure.


You need context for the facts you put on the table. While I agree BD is not very good in a Windows environment against Sandy and Ivy, it is a good alternative when priced correctly.

Cheers!

EDIT: Spelling :p 
a b à CPUs
May 14, 2012 7:24:30 PM

welshmousepk said:

Human reaction times are MUCH quicker than 1/60th of a second. Allowing more frames per second= more fractions of a second in which input can be made. THAT is basic science, and I would say it supports me fairly conclusively.

Utter rubbish: http://www.humanbenchmark.com/tests/reactiontime/index....

Average reaction time is 215 ms, or roughly 1/5 of a second; the fastest times are ~110 ms, or about 1/10th of a second.
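To put those reaction times next to frame times (illustrative arithmetic only, using the figures from that test):

```python
# Frame time vs. human reaction time at common refresh rates
avg_reaction_ms = 215   # average from the linked test
fast_reaction_ms = 110  # roughly the fastest recorded

for fps in (60, 120):
    frame_ms = 1000 / fps  # milliseconds per frame
    print(f"{fps} fps = {frame_ms:.1f} ms/frame, "
          f"~{avg_reaction_ms / frame_ms:.0f} frames per average reaction")
```

At 60 fps, roughly 13 whole frames go by during an average human reaction, so a single frame is far shorter than any reaction time, not the other way around.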

As for the topic, it amazes me how many people bought a 4100, think it's comparable to the 2500K, and then call ALL of BD crap. The 4100 is half a Bulldozer; yes, games generally don't use more than 4 cores, but disabling half of my 8120 feels clunky.

The 4100 is not a true BD, it's just a cheap CPU, that's it; you can't even attempt to unlock it like the cheap Phenom II 550s.

As MU and others have said, when a program is patched, it is sometimes patched for newer technology, e.g. Skyrim.



[pre-patch benchmark results]

[post-patch benchmark results]

Picked up the game this weekend. I can't find the v-sync setting, but running FRAPS it locks the fps at 60, at 1920x1200 ultra with FXAA, 8xAA, 16xAF.

30 fps? Don't make me laugh; this thing never dropped below 50 at the top of a mountain looking as far as I could see, and in a dungeon it never fell off 60. "Benchmark results never lie"... wrong. "BD bottlenecks games"... at 40-60% maximum core usage and 20% overall? Yeah, right.
a b à CPUs
May 14, 2012 8:30:43 PM

rds1220 said:
So I'm a fanboy why...because I state the facts and post benchmarks that prove how bad the Bulldozer is. You're a joke wake up and look at the facts the Bulldozer is a failure.


You only have to read your posts to see why: you see no fault in your own product and focus only on AMD's faults. If you cannot see anything in a competitor's products except an opportunity to tear them down, that makes you a fanboy (it was the reference to Bulldozer that lured you here). And the fact that you have taken such umbrage at being called a fanboy is a rather implicit way of suggesting that you are one.


a c 146 à CPUs
May 14, 2012 10:59:48 PM

sarinaide said:
You only have to read your posts to see why: you see no fault in your own product and focus only on AMD's faults. If you cannot see anything in a competitor's products except an opportunity to tear them down, that makes you a fanboy (it was the reference to Bulldozer that lured you here). And the fact that you have taken such umbrage at being called a fanboy is a rather implicit way of suggesting that you are one.


That's funny, because I've openly criticized Ivy Bridge for not giving much more performance and running hotter than Sandy Bridge. Maybe someone needs to learn to read?
a b à CPUs
May 15, 2012 5:06:23 AM

Bulldozer is great for apps that can utilize 8 threads efficiently.
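As a rough illustration of what "can utilize 8 threads" means, here is a sketch of an embarrassingly parallel workload, the kind of job where an 8-thread chip can stretch its legs. `hash_block` is a made-up stand-in for real per-chunk work (encoding, compression, tile rendering), not any actual benchmark:

```python
from multiprocessing import Pool

def hash_block(n):
    # Stand-in for CPU-bound per-chunk work
    total = 0
    for i in range(n):
        total = (total * 31 + i) % 1_000_003
    return total

if __name__ == "__main__":
    with Pool(8) as pool:  # one worker per hardware thread
        results = pool.map(hash_block, [50_000] * 8)
    print(len(results))  # prints 8
```

When the work splits cleanly like this, eight slower cores can keep pace with four faster ones; when it doesn't, per-core speed wins, which is the whole Bulldozer-vs-Sandy story in miniature.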

To you AMD fanboy gamers running resolutions under ~2560x1600...

Take note: this is with first-generation Core i-series CPUs (i7-920).




