
Why do people say the FX-8350 is a four-core processor?

February 2, 2013 3:36:10 AM

Reading through the architecture of AMD's Bulldozer and Piledriver processors, it is clear that each processor module has the hardware to handle 2 simultaneous threads on their own pipelines, except in the rare case of a 256-bit floating-point calculation, where the two FPU units have to be combined to perform the operation. Every other operation has a full path through the CPU on its own hardware, and 2 threads can pass through each module without contention.

As for sharing L2 cache, if you give each pair 2MB to share instead of 1MB of dedicated cache each, there is no performance hit. Intel shares L3 cache, too.

It feels like everyone is missing these facts, but maybe it's me. Why would someone say that AMD does not use "real" cores?
February 2, 2013 3:43:48 AM

I never heard of the FX-8350 being referred to as a 4 core CPU until this thread.
February 2, 2013 4:30:11 AM

This sounds similar to what I heard about the FX-8150 when it first came out: while it was technically an eight-core CPU, the cores were all actually paired up inside "monolithic" modules. AMD probably decided to put an additional mini core inside each core, probably with the hope of boosting performance (which it apparently did).

February 2, 2013 4:46:41 AM

Because AMD's cores aren't like Intel's cores. That does not mean it's not a real 8-core.
February 2, 2013 4:51:15 AM

I class them as quads as well,
quads with a weird Hyper-Threading-type function, but quads nonetheless.
It's been said before, but when coding takes advantage of it, that's when you'll see BD and PD start to perform.
Moto
February 2, 2013 4:52:12 AM

People say that, mostly Intel fanboys. I am a regular Intel user and I say it is actually an 8-core CPU, the best one in its price/performance range. You chose well. The force is strong with you.
February 2, 2013 4:56:00 AM

If that was aimed at me, Austen, I'm no Intel fan :)
Moto
February 2, 2013 5:01:08 AM

Even I think it's better to call them quad modules with 2 cores in each module for SMT.
February 2, 2013 5:06:22 AM

Motopsychojdn said:
If that was aimed at me, Austen, I'm no Intel fan :)
Moto

No, not at all. I was referring to people who usually support anything that Intel comes up with. Not you.
February 2, 2013 5:25:01 AM

Motopsychojdn said:
I class them as quads as well,
quads with a weird Hyper-Threading-type function, but quads nonetheless.
It's been said before, but when coding takes advantage of it, that's when you'll see BD and PD start to perform.
Moto


I agree 100%. I would call it a quad core with something similar to, but not the same as, Hyper-Threading.
February 2, 2013 5:50:43 AM

Do you guys know what Hyper-Threading is? It throws two threads at one processor, and if the processor has time to work on the extra thread while waiting for I/O or something, it executes the extra thread. Otherwise it gets kicked back until the first thread finishes. It requires software support in the OS to work properly.

AMD has two distinct processors in each module that can work on two threads simultaneously. Yes, there is only one floating-point scheduler, but it has two 128-bit FPU processing units, so it can execute two floating-point operations at the same time, too.
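To put rough numbers on the difference, here is a toy Python model (my own sketch, not from AMD or Intel; the 0.8 second-thread factor is just the ballpark figure quoted later in this thread) of how total throughput changes depending on whether threads are spread one per module or packed two per module:

```python
def throughput(threads, modules=4, second_thread=0.8, spread=True):
    """Toy throughput model for a 4-module Bulldozer-style CPU.

    The first thread on a module counts as 1.0 core's worth of work; a
    second thread on the same module adds only `second_thread` (roughly
    the 80% figure people quote for BD/PD; Hyper-Threading's second
    thread would add far less).
    spread=True  -> one thread per module before doubling up
    spread=False -> pack two threads per module before using the next one
    """
    assert 0 <= threads <= 2 * modules
    if spread:
        first = min(threads, modules)        # threads that get a module to themselves
        second = max(0, threads - modules)   # threads that double up on a module
    else:
        second = threads // 2                # fully packed modules contribute a second thread
        first = second + threads % 2         # one "first" thread per occupied module
    return first * 1.0 + second * second_thread

# Four threads spread across four modules beat four threads packed into two:
print(throughput(4, spread=True))   # -> 4.0
print(throughput(4, spread=False))  # -> 3.6
print(throughput(8))                # -> 7.2 (with 8 threads every module runs two either way)
```

It is only a back-of-the-envelope model, but it captures why scheduling matters so much more on these chips than on a plain quad core.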




February 2, 2013 5:55:39 AM

austenwhd said:
..... you chose well. The force is strong with you.


Me?

I have an overclocked i5 in my system, actually! :) 
February 2, 2013 6:01:17 AM

As I see it:
1) 4 modules, each with 2 cores
2) Poor thread assignment in Win7 (updated, yet correct for Intel) overloads the first module with 2 threads instead of moving a thread to the next module
3) In total, somewhere between a 4-core hyperthreaded CPU and a true 8-core, leaning toward apps over games, as multithreaded apps are more likely to use max/unlimited threads over a specific number.
Shared L2 effect debatable
February 2, 2013 6:04:58 AM

austenwhd said:
No, not at all, i was referring to people who usually support anything that Intel comes up with. Not you.

No worries man :) 
** It requires software support in the OS to work properly**
Kinda what I said, no?

Moto
February 2, 2013 6:35:22 AM

twelve25 said:
Me?

I have an overclocked i5 in my system, actually! :) 



Well, if the other posters are to be believed, the force has only grown stronger with you.
February 2, 2013 6:43:32 AM

abbadon_34 said:
As I see it:
1) 4 modules, each with 2 cores
2) Poor thread assignment in Win7 (updated, yet correct for Intel) overloads the first module with 2 threads instead of moving a thread to the next module
3) In total, somewhere between a 4-core hyperthreaded CPU and a true 8-core, leaning toward apps over games, as multithreaded apps are more likely to use max/unlimited threads over a specific number.
Shared L2 effect debatable


4 modules x 2 cores = 8 cores?
The module can execute two threads simultaneously, so this should not be an issue. Hyperthreading cannot execute two threads at the same time, period. It's not even close to the same.



February 2, 2013 6:47:20 AM

Motopsychojdn said:

** It requires software support in the OS to work properly**


You said AMD would perform better with software support. It might, but not for the same reasons Hyperthreading needs software. Hyperthreading is, again, forcing two threads down one processor unit. Most of the time, this extra thread gets kicked back out, and the OS needs to recognize that and work around it. AMD can actually process two threads at the same time, so there is no need for software support.

What could happen is that if Windows sent the first 1-4 threads each to a separate module, those cores would have access to the full shared cache and would likely perform a little better.



February 2, 2013 6:52:24 AM

See #2. Execution of two threads on a single module is not *usually* as efficient as two threads on two separate modules. AFAIK Windows assigns two separate threads to the same module with AMD, vs. two threads to two cores on Intel, with hyperthreading coming in once the cores are used up.

I am by no means an expert, and if I am wrong I am HAPPY to hear it, to justify an upgrade to an 8320 instead of an i5-3770k.

And I am using an AMD Phenom II.
February 2, 2013 6:55:39 AM

They have 8 integer cores. The problem with that is that things like games don't use integer cores a lot. It has only 4 FPUs (floating-point units), and that is what games use. So it has 8 full integer cores, but each pair of cores has to share 1 FPU.
February 2, 2013 6:58:10 AM

It has 4 FPU schedulers, each with two 128-bit FPU processing units (FMACs). It can run a total of 8 FPU threads, albeit with some possible overhead vs. having 8 FPU schedulers.
February 2, 2013 7:04:10 AM

abbadon_34 said:
See #2. Execution of two threads on a single module is not *usually* as efficient as two threads on two separate modules. AFAIK Windows assigns two separate threads to the same module with AMD, vs. two threads to two cores on Intel, with hyperthreading coming in once the cores are used up.


You are correct. I was looking for something to prove it, and this article seems to: http://techreport.com/review/21865/a-quick-look-at-bull...

If you use CPU masks to trick Windows into thinking it has only 1 core in each of 2 modules, you get 20% more performance vs. using two threads in one module. This means something in the architecture is a bottleneck, but I still have a hard time saying it isn't a dual-core module. It's 2 cores in one module, bottlenecked by some kind of architectural constraints.
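For anyone who wants to try the masking trick themselves, here's a rough Python sketch of the same idea (mine, not from the article; Linux-only, and it assumes the common FX enumeration where adjacent logical CPU ids such as (0,1), (2,3) are the pairs sharing a module, which you should verify with `lscpu -e` on your own box):

```python
import os

def one_core_per_module(cpu_ids, cores_per_module=2):
    """Keep every other logical CPU in id order, e.g. {0, 2, 4, 6} out of 0-7.

    Assumes adjacent logical CPU ids are the pairs sharing a module,
    the usual enumeration on FX chips (check `lscpu -e` to be sure).
    """
    return set(sorted(cpu_ids)[::cores_per_module])

# Pin this process so the scheduler can only give it one core per module.
# sched_setaffinity/getaffinity are Linux-only, hence the guard.
if hasattr(os, "sched_setaffinity"):
    mask = one_core_per_module(os.sched_getaffinity(0))
    os.sched_setaffinity(0, mask)
    print(sorted(os.sched_getaffinity(0)))
```

The article above used Windows CPU masks; this is just the Linux equivalent of the same experiment.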
February 2, 2013 7:18:46 AM

twelve25 said:
It throws two threads at one processor, and if the processor has time to work on the extra thread while waiting for I/O or something, it executes the extra thread. Otherwise it gets kicked back until the first thread finishes. It requires software support in the OS to work properly.

AMD has two distinct processors in each module that can work on two threads simultaneously. Yes, there is only one floating-point scheduler, but it has two 128-bit FPU processing units, so it can execute two floating-point operations at the same time, too.

True. Thread for thread, AMD's SMT through modules is nearly 4 times as effective as HT. That's simply because while HT's second parallel thread can hypothetically be seen as adding at most 20% of a core, a BD module has one core at 100% and its sister core at 75-80%, with the performance hit coming from the shared FPU scheduler and decoder.
February 2, 2013 9:28:11 AM

iam2thecrowe said:
They have 8 integer cores. The problem with that is that things like games don't use integer cores a lot. It has only 4 FPUs (floating-point units), and that is what games use. So it has 8 full integer cores, but each pair of cores has to share 1 FPU.


OK, this is happening again. Intel and other companies advertise CPUs as good or bad for gaming; trust me, I have discussed this before on other threads, and they are not! A simple Core 2 Duo can usually get you through any extremely demanding game, even these days, with an efficient enough GPU, period. If you are buying a CPU only for playing games (don't tell me it's for the iGPU, that is plainly stupid), buy an Ivy Bridge i3 with any recent DX11-compatible Nvidia GTX, and you are safe for at least the next 5 years.


IMO (this is the first time I am using this!), most people, i.e. 99.99% of people, buy a PC for all sorts of stuff: browsing, chatting, downloading, watching Blu-ray rips (from pirate sites), doing video encoding (to upload to pirate sites), etc. I won't care whether I have an 8-core AMD or a similarly priced (or pricier) 4-core Intel; they are both efficient and capable of any or all of this at once. Before these multithreading days, modern computers were meant for multitasking, and any 4-core CPU these days (recent ones are always preferable) is more than capable of doing that.

You don't buy a high-end PC to play games. You buy an 8-core, 4GHz+, 32GB RAM, Z77 motherboard, GTX 670 configuration to do high-end HD video editing, 3D rendering, etc., where nothing is ever good enough.

Multithreading is a term and technology introduced by Intel for Intel; that doesn't make it an industry standard.
February 2, 2013 4:14:00 PM

satyamdubey said:
True. Thread for thread, AMD's SMT through modules is nearly 4 times as effective as HT. That's simply because while HT's second parallel thread can hypothetically be seen as adding at most 20% of a core, a BD module has one core at 100% and its sister core at 75-80%, with the performance hit coming from the shared FPU scheduler and decoder.


It would be interesting to see whether the "sister" core runs at 80% of the "main" core, or whether they both just run at 90% when they have to share tasks. I mean, is there really a slower core in each pair, or do they just slow down when sharing paths into the module and cache? If you could somehow send a single thread through and test core 1 and core 2 individually, that would give the answer, but I am not sure how to do it.
February 2, 2013 4:19:32 PM

austenwhd said:
OK, this is happening again. Intel and other companies advertise CPUs as good or bad for gaming; trust me, I have discussed this before on other threads, and they are not! A simple Core 2 Duo can usually get you through any extremely demanding game, even these days, with an efficient enough GPU, period. If you are buying a CPU only for playing games (don't tell me it's for the iGPU, that is plainly stupid), buy an Ivy Bridge i3 with any recent DX11-compatible Nvidia GTX, and you are safe for at least the next 5 years.



You don't buy a high-end PC to play games. You buy an 8-core, 4GHz+, 32GB RAM, Z77 motherboard, GTX 670 configuration to do high-end HD video editing, 3D rendering, etc., where nothing is ever good enough.



C'mon, man, you don't really believe this, do you? I'd venture to say 4 out of 5 high-end PC self-builds are for games, by computer nerds like us. And as for an i3 or Core 2 Duo being enough to handle a high-end Nvidia card, I'd have to suppose you've never tried it. It is not! BF3 uses 85% of all four cores on my Sandy Bridge i5 during multiplayer. There are countless benchmarks showing slower and dual-core processors not being able to hit vsync in many titles.





February 2, 2013 6:04:06 PM

twelve25 said:
C'mon, man, you don't really believe this, do you? I'd venture to say 4 out of 5 high-end PC self-builds are for games, by computer nerds like us. And as for an i3 or Core 2 Duo being enough to handle a high-end Nvidia card, I'd have to suppose you've never tried it. It is not! BF3 uses 85% of all four cores on my Sandy Bridge i5 during multiplayer. There are countless benchmarks showing slower and dual-core processors not being able to hit vsync in many titles.


I used to work in a cyber cafe, and we had Core 2 Duos in all the systems. Our customers used to play the latest games on those PCs, even Xbox and PlayStation games on emulators. There is a thing called 'lower settings': playing at 30fps (which is good enough), using fewer textures or details, just enough to make the motion and graphics smooth. And when I say latest, I mean whatever was on the net was played and done.

Spending $500 on a gaming console (only gaming, really!) is not sensible, even for the most hardcore gaming fan. If you have money to throw around and no concerns about TDP (there goes global warming), I think buying a -

Intel Xeon E5-2690 @ 2.90GHz @ $2,068.99 paired with
GeForce GTX 680 @ $399.99

Source: http://www.cpubenchmark.net

will see you till the end of days for PCs.

But that is not logical, and here comes my point:

AMD FX-8350 @ $189.99 - the most expensive PC CPU from AMD (with at least four cores), even if it has 4 physical cores each paired with a pair of logical cores, is a better performer in games and 3D and whatnot than...

Intel Core i5-3470 @ $188.79 - the cheapest PC CPU from Intel with 4 physical cores, which sits below the AMD 8320, 8150, 8140...

The AMD 8120 is above it, and it is priced at $152.00! Can you see my point? You spend $189 and get the fastest AMD 'PC' CPU (ranked at 40), or you can buy a mid-range Intel i5-3470 for around the same price, which ranks 110.

To make it more interesting, let me tell you that the closest any CPU (above rank 40, and sadly only more expensive) comes to the 8350 is in fact the Intel Xeon E3-1240 V2 @ $274.99 (rank 32) or the more mainstream Intel Core i7-3770 @ $289.99 (rank 31), and they both feature... wait for it... four physical cores, each paired with a logical core.

So, logically, if I have to choose between the Intel Core i7-3770 @ $289.99 and the AMD FX-8350 @ $189.99, I will opt for the 8350, because then I will have enough money to actually buy a decent enough DX11-compatible GPU, and maybe throw in a few GB of new RAM, too; I am good at bargaining.
February 2, 2013 6:46:44 PM

You "used" to work in a cyber cafe? I used to play the latest games on a Core 2 when Core 2 was the latest processor.

I'm not even sure what you are arguing anymore; it doesn't seem to be the topic of this thread.

Passmark is nice for a general picture, but you have to look at tests of the actual programs you'll be using. Not very many real workloads match perfectly with a fully synthetic test that can hit all cores equally.

February 2, 2013 7:24:40 PM

twelve25, I 'used' to work in my cyber cafe, which is now run by someone else. It still runs on C2Ds, and it still rocks any game we throw at it; let me repeat, 'any game'. We have HD 6000-something GPUs in all the PCs, so that took care of the graphics aspect. Games' dependency on the CPU is greatly exaggerated. Even if a game can take advantage of 4 cores or more, that doesn't mean a 2-core CPU cannot run it, just like 64-bit After Effects can utilize 16 cores (or all that exist) but runs just fine on a C2D.

If you read my entire reply (which I accept is too long), you will realize that I tried to explain why an i7 does not always trump an AMD CPU; the price factor is important (almost the decider). Don't be tricked by my status as a youngster on this site. I am 24, but I have been around PCs since the days of command-prompt OSes and Pentium 3 CPUs. I know what I know. I trained in computer animation and video editing, and since I quested to increase my knowledge about performance PCs, I am quite obsessed with them, and I have studied this topic extensively.

Passmark might be my only reference, but I posted its stats here as 'reference only'; you have to start someplace, don't you? Now I kind of miss my C2D E7500. When I went to the official Adobe site, people in the discussion forum used to say 'you need at least a quad core with at least a GTX card to render video at more than 1080p'. I rendered 2 hours of footage @ 50 fps at a resolution of 3000-something by 2000-something without a GPU... you don't want reference sites? Here is a fact from my own experience: the C2D was kind of the WinXP of CPUs, the best even after all these years.
February 2, 2013 10:12:12 PM

Your argument seems to be based on the idea that turning down settings for FPS is acceptable.

Just because a system can play a game on lower settings doesn't mean that it's the proper way to play. It's true that too many people over-spend on processors compared to their GPU, but that doesn't make the higher-performing processors worthless.
February 2, 2013 10:38:45 PM

30fps is in NO WAY PLAYABLE for me. Even on Far Cry 3 fully maxed, I can still feel it when I hit 50fps. I hate playing at lower than 60fps and I hate lowering the graphics. A Core 2 Duo might be enough for you. However, even the setup I have now (AMD 6300, 7970 GHz Edition) is not enough for me; I still fall below 60fps in FC3 and I hate it. I'll probably buy one more 7970 :-). Everyone is different.
February 2, 2013 11:17:36 PM



look at the top part, not the "cpu" ... one fetch and one decode. The end result:

Quote:
The firm claims a Bulldozer module can achieve 80% of the performance of two complete cores of the same capability.
http://techreport.com/review/19514/amd-bulldozer-archit...


It's not a full 2 cores per module; performance suffers, so it's not true "dual core" performance. Heck, in order for Windows to schedule it better, Windows now sees it as a 4-core, 8-thread CPU.
February 3, 2013 4:25:25 AM

stantheman123 said:
30fps is in NO WAY PLAYABLE for me. Even on Far Cry 3 fully maxed, I can still feel it when I hit 50fps. I hate playing at lower than 60fps and I hate lowering the graphics. A Core 2 Duo might be enough for you. However, even the setup I have now (AMD 6300, 7970 GHz Edition) is not enough for me; I still fall below 60fps in FC3 and I hate it. I'll probably buy one more 7970 :-). Everyone is different.


Let's see. For as long as I can remember, 30fps (or 29.97fps) has been the standard in video, film-making, and graphics, and that applies to game graphics too, because TVs and monitors, whether CRT, LCD, LED, or plasma, refresh at 75Hz max. As long as that is the standard, nothing above 60fps is detectable by human eyes; those who say they play a game at 120fps are fooling themselves. If 30fps doesn't work for you in games, and the jitter and 'low' quality are so unbearable to you, then you should probably stop watching movies and animation altogether, because they are all still shot, edited, and played at 30fps on TV and 24fps in theaters. Companies advertise these grand figures that we have accepted in order to stay in the fray, to be cool; please don't fall for that.

stantheman123, I have an Ivy Bridge i5, and I play absolutely no games at all, not even Pocket Tanks or Mario 3D, but I know many who do play all those high-end games you all talk about (yeah, I confess, I don't even remember their names). I know people who do that professionally, even those who design them, and some who still play them on a Core 2 Duo. Look at games' minimum requirements, please. Not everybody buys a PC to play games. Some people actually work on them.
February 3, 2013 4:41:33 AM

austenwhd said:
Let's see. For as long as I can remember, 30fps (or 29.97fps) has been the standard in video, film-making, and graphics, and that applies to game graphics too, because TVs and monitors, whether CRT, LCD, LED, or plasma, refresh at 75Hz max. As long as that is the standard, nothing above 60fps is detectable by human eyes; those who say they play a game at 120fps are fooling themselves. If 30fps doesn't work for you in games, and the jitter and 'low' quality are so unbearable to you, then you should probably stop watching movies and animation altogether, because they are all still shot, edited, and played at 30fps on TV and 24fps in theaters. Companies advertise these grand figures that we have accepted in order to stay in the fray, to be cool; please don't fall for that.

stantheman123, I have an Ivy Bridge i5, and I play absolutely no games at all, not even Pocket Tanks or Mario 3D, but I know many who do play all those high-end games you all talk about (yeah, I confess, I don't even remember their names). I know people who do that professionally, even those who design them, and some who still play them on a Core 2 Duo. Look at games' minimum requirements, please. Not everybody buys a PC to play games. Some people actually work on them.


That's a complete myth, and it started from old 34fps movie reels: the rough minimum number of frames per second required for the human brain to see a set of images as fluid motion.

The human eye can see WAY more than 30, 60, or 75 frames per second. Saying otherwise is simply wrong.
In FPS games I can easily tell the difference between less than 60 FPS and more than 60 FPS. Shooting, shooting while moving, and moving in general are a lot smoother and faster with higher FPS.

February 3, 2013 4:49:24 AM

austenwhd said:
Let's see. For as long as I can remember, 30fps (or 29.97fps) has been the standard in video, film-making, and graphics, and that applies to game graphics too, because TVs and monitors, whether CRT, LCD, LED, or plasma, refresh at 75Hz max. As long as that is the standard, nothing above 60fps is detectable by human eyes; those who say they play a game at 120fps are fooling themselves. If 30fps doesn't work for you in games, and the jitter and 'low' quality are so unbearable to you, then you should probably stop watching movies and animation altogether, because they are all still shot, edited, and played at 30fps on TV and 24fps in theaters. Companies advertise these grand figures that we have accepted in order to stay in the fray, to be cool; please don't fall for that.

stantheman123, I have an Ivy Bridge i5, and I play absolutely no games at all, not even Pocket Tanks or Mario 3D, but I know many who do play all those high-end games you all talk about (yeah, I confess, I don't even remember their names). I know people who do that professionally, even those who design them, and some who still play them on a Core 2 Duo. Look at games' minimum requirements, please. Not everybody buys a PC to play games. Some people actually work on them.


30fps feels like utter crap to me. I notice the difference between 50 and 60fps. The difference between 30 and 60fps is like day and night for ME. I don't watch movies/animation, so your point is invalid. I'm a gamer, always have been, always will be. If I can't play a game at over 50fps on max settings, or at least high, I would rather NOT play it at all, because I believe games are meant to be enjoyed on max details so you can appreciate how the game looks.

Is some Core 2 Duo and a 6xxx-series GPU going to get me 60fps+ at 1920x1080, fully maxed + MSAA? No.
February 3, 2013 5:49:42 AM

stantheman123 said:
30fps feels like utter crap to me. I notice the difference between 50 and 60fps. The difference between 30 and 60fps is like day and night for ME. I don't watch movies/animation, so your point is invalid. I'm a gamer, always have been, always will be. If I can't play a game at over 50fps on max settings, or at least high, I would rather NOT play it at all, because I believe games are meant to be enjoyed on max details so you can appreciate how the game looks.

Is some Core 2 Duo and a 6xxx-series GPU going to get me 60fps+ at 1920x1080, fully maxed + MSAA? No.



"Is some Core 2 Duo and a 6xxx-series GPU going to get me 60fps+ at 1920x1080, fully maxed + MSAA? No." Yeah, right, just like I said. No, a C2D and a 6xxx can't do that, but a C2D and a GTX 6xx can. I am saying 'bare minimum' here, remember that.

Remember those classic 8-bit 2D games of the '80s? It was not always about FPS or 3D; games used to be played to be enjoyed, not to show off. 30fps, even by industry standards, is at least acceptable. I know a lot (I mean, more than 100) of people who play on a C2D at a reduced 30fps and are just as happy. You must be talking about absolutely (read: ridiculously) demanding games. Well, can I have your PC configuration, please?

rds1220, nice to have you on the internet. How is your 'myth' of '34fps' doing? Your comment is rather interesting, almost informative, and I won't comment on things you got wrong there. Movies since the black-and-white days could have been shot at 60fps; it was possible, but the price of film was/is too high, so they shot at 24fps instead, the minimum fps needed to create the illusion of motion. Now, I do believe 48fps and 60fps are the next standard in motion graphics, but that is still in the future, as movies and animation are still, almost 99% of the time, shot/created and edited/designed at a bare minimum of 24fps for movies and 30fps for games (because otherwise rendering double the frames would cost twice as much, plus the time spent). The Hobbit and The Adventures of Tintin were shot at 48fps. Avatar 2 will be shot at 48fps, maybe 60fps!

When playing either medium, the hardware simply duplicates the already-present frames and produces images at higher frame rates. Still, that does not make it true 60fps. Games use better algorithms/programming for replicating frames than DVD or Blu-ray; graphics cards do that for you, not the CPU, not the iGPU, and the motion looks super fluid, almost lifelike.

At a refresh rate of 75Hz on a TV/monitor, a human eye cannot detect any difference in anything that goes above 60fps; try Wikipedia for that.
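Whatever you make of the perception debate, the frame-time arithmetic behind all of these numbers is simple; here's a quick sketch (mine, just restating the math, not any poster's claim):

```python
def frame_time_ms(fps):
    """Milliseconds the renderer gets to produce each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (24, 30, 48, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 24 fps -> 41.7 ms, 30 fps -> 33.3 ms, 48 fps -> 20.8 ms,
# 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```

Going from 30 to 60fps exactly halves the time budget per frame, which is the difference the posters here are arguing about feeling.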
February 3, 2013 6:03:15 AM

30fps might be OK for you and your friends, but for me it is UNACCEPTABLE. I hate playing under 60fps; I notice it so much I want to cut myself. A Core 2 Duo would not suffice for me. Like I said, even my current setup is not enough for me, so how is a Core 2 Duo going to do it? Yes, I am talking about demanding games. Far Cry 3 fully maxed out WITH MSAA gets around 30-40fps, and I hate that. It makes me feel slow and weak. And yet I have one of the fastest GPUs, on par with the 680. But I can fix this: I'll just buy one more 7970.
A Core 2 Duo would not suffice for me; it would just bottleneck me. And when you're an enthusiast who doesn't want the bare minimum, who wants the best, you want hardware that will give you the fps you want.

In all honesty I WISH I could play at 30fps, so I didn't have to spend thousands of dollars just to get 60fps in games like Far Cry 3 and Crysis 3.

Now I have to buy one more 7970

http://www.newegg.com/Product/Product.aspx?Item=N82E168...
and might as well get the 8 core since I'm doing Crossfire

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

$600 to get 60fps :D
February 3, 2013 6:50:10 AM

stantheman123, this conversation started with twelve25 asking why the AMD 8350 is called a four-core CPU while the company claims it is an 8-core, to which many 'relevant' replies came, and I think it was concluded that it has four physical cores with two logical ones each, blah, blah, blah... to which I tried to demonstrate how high-end Intel CPUs use the same kind of technology and get away with it. Then I discussed the possibility of people buying an AMD 8xxx instead of an Intel i7: almost the same performance (read: config) at almost a $100 difference.

Now I am concentrating on that $100 that people pay for a brand name. Instead, I will buy a $100 GPU, and combined together that can beat 99% of Intel i7s in gaming and whatever... so then I talked about the over-advertising of CPUs as 'good for gaming', and that brought me to talk about C2D performance combined with a 'decent enough' GPU giving 'decent enough' performance. I never said anything against buying a $1000 gaming PC; go on, buy one, but I guess price brings ambition down to real grounds here.

Phew. Now, stantheman123, I see you have an AMD 6300 @ 4.5GHz (good one!). I never asked anyone to throw out their 4/6/8 cores and buy a C2D instead; I don't know where all that is coming from. I am saying that games 'can be played' even on lower-performance CPUs at lower settings. Those who cannot afford an 8-core will have to be content with what their 2 cores have to offer, and those who have the 8-core will keep looking down at the others, feeling pity. Why? Do games really need 6 or 8 cores? No game I know asks for a minimum 4-core CPU, and even you admitted your 6-core lags; time for a GPU upgrade, not a CPU.

You get what you pay for. We all agree to that, right? So, back to where I started: I say make the smarter choice and go for an AMD 8xxx if you're going to pay anything below $200; rest assured you are getting the best CPU in your budget. Someday in the next 10 years, maybe, they will make 8-core-intensive games, though that's unlikely, because companies are moving toward GPUs.

Anything above $200 seems a bit excessive to me if gaming is all you are going to do with a CPU. Instead, buy any four-core (literally any!) but make sure to buy the latest GPU. That's all.
February 3, 2013 7:04:53 AM

stantheman123 said:

In all honesty I WISH I could play at 30fps, so I didn't have to spend thousands of dollars just to get 60fps in games like Far Cry 3 and Crysis 3.

Now I have to buy one more 7970

http://www.newegg.com/Product/Product.aspx?Item=N82E168...
and might as well get the 8 core since I'm doing Crossfire

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

$600 to get 60fps :D


Seriously, your CPU is fine; buy a faster GPU instead and save $200.
February 3, 2013 7:08:04 AM

austenwhd said:
Seriously, your CPU is fine; buy a faster GPU instead and save $200.


If only I could get a faster GPU. Since the 7970/680 are the fastest GPUs out, either I buy one more 7970 and deal with Crossfire drivers and heat (oh yes), or I wait for the GTX 7xx or AMD 8xxx.

Yeah, true, my CPU should be fine with two 7970s, thanks!

Also, I do agree that a Core 2 Duo is enough for gaming.

I do agree that pretty much any quad core under $200 is enough for gaming: the 4300, Phenom X4, 4100, and you've got the 6-cores like the 6300 and the 8xxx, of course, too.

I did notice, though, that Far Cry 3 was using all my 6 cores, but it really does not matter, since the game runs perfectly on a quad.
February 3, 2013 7:20:10 AM

If you are really going to spend $400 on a GPU, then rest assured that with your CPU and a GPU at that price (oh yes, buy as recent a one as possible), you can play any game you want. Some games do utilize 8 cores, but that does not mean they really need all 8 cores, at 100%, at all times, to run.
February 3, 2013 1:22:56 PM

Coming back to the topic, twelve25: usually the BIOS shows 8 cores for the 8350, and you can disable some of them and enable others; that is how real they are. On Intel i7 CPUs, which present 8 threads, only 4 cores are visible! So as far as a 'real 8-core' is concerned, the FX-8350 wins, big time.

This should have been part of my earlier post, where I compared prices and all, but... never mind. For all I understand, if the number of cores is all you need, buy an AMD Opteron 6272 at $564.99, with 8 modules and 16 cores.
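If anyone wants to see how their own OS, rather than the BIOS, classifies these chips, here is a stdlib-only Python sketch (mine; the /proc/cpuinfo part assumes Linux, and note that how the kernel counts FX module siblings has varied across versions, so I'm not promising which answer you'll get on a given box):

```python
import os

def cpu_topology(cpuinfo_path="/proc/cpuinfo"):
    """Return (logical CPUs, 'cpu cores' per package) as the OS reports them.

    On a Hyper-Threaded quad i7 you would typically see 8 logical CPUs but
    'cpu cores : 4'; comparing the two numbers on an FX chip shows whether
    your kernel treats module siblings as full cores or as HT-style threads.
    """
    logical = os.cpu_count()
    cores = None
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("cpu cores"):
                    cores = int(line.split(":")[1])
                    break
    except OSError:
        pass  # not Linux, or /proc unavailable
    return logical, cores

print(cpu_topology())
```

Running it on both an FX and an i7 side by side makes the "how many cores is it really" argument concrete rather than rhetorical.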
February 3, 2013 1:49:30 PM

Isn't debating the number of cores on an AMD chip like arguing about how to cut a moldy pizza? Whether you cut it into 4 or 8 slices, it's going to taste bad anyhow...
February 3, 2013 1:54:30 PM

I am not a performance-at-whatever-price freak, so I say an 8-core AMD is a good choice for a budget under $200. I will go as far as saying the AMD FX-8350 is "the world's fastest high-end CPU in its price range". Go figure.

But then I can only say as much.
December 11, 2013 1:47:59 PM

The FX-8350 and 8320 use 4 real modules, and every module has 2 cores, so there are effectively 8 cores that communicate with each other. The i5-4670K quad core, on the other hand, is better in tasks that use only one core, because its cores aren't separated into 2 equal pieces like the 8350's. So you are paying more for nothing if you are only interested in gaming.
December 11, 2013 8:52:19 PM

First of all, wow, this thread is still active. Secondly, again, for gaming, what about an AIO configuration? Why are people on Tom's Hardware so stuck on gaming PCs? Don't the high-end category systems also include those of 3D editors, desktop publishers, or web developers? Anybody looking for heavily threaded performance can actually go for an AMD FX instead of an i5 for those tasks, under an extremely tight budget.

Besides, now that I think of it, Intel is building CPUs for general use up until the i5, just like NVIDIA is doing with their GTX GPUs. They are not best in class because of performance, but because of the non-availability of software support for the alternatives.