
Intel and AMD performance delta

February 1, 2013 1:21:51 AM

I have always just looked at numbers and graphs that told me which brand and which CPU at a given time is better. But I'd love to know exactly how and why the gaps between the architectures exist (not that I'm saying they should be the same). I'm not looking for a long explanation in this thread (though one is welcome). I am, however, looking for someplace that does have exhaustive explanations and reading material available. After years and years of just accepting numbers and rattling them off in arguments to prove a point, it feels kind of shallow, and I want to know what makes those numbers.
February 1, 2013 2:51:11 AM

Well, right now AMD's timing on their CPUs is off (they tried blaming software for a while), while Intel gets it right. Also, AMD shares floating-point units, basically CPU resources, between cores, while Intel doesn't.
February 1, 2013 2:54:15 AM

I'm not actually completely sure what it all means in detail, but from a rough standpoint it's this: AMD tried being cheaper, it didn't work out, and now they are behind and don't want to spend the money and time trying to catch up. (And by cheaper I mean it in a good way, so the CPUs were more affordable.)
February 1, 2013 3:51:14 AM

There are a couple of articles on Tom's where I got that information. I can get them if you want.
February 1, 2013 4:06:34 AM

So far the FX-8350 beats the i5 in gaming or ties it, and it's definitely better at video editing and recording gameplay. For $200 the FX-8350 is THE best CPU to get right now. If you're a gamer, get the 8350. I wouldn't settle for 4 cores; that's the bare minimum you should have these days. A lot of benchmarks are biased and flat-out wrong. Intel only beats AMD if you're playing games at low resolutions like 800x600, which no one uses.
February 1, 2013 4:14:28 AM

And 4 cores is what most games will use (most engines aren't optimized to use more, blame consoles; with any program, it has to be written to use that many cores, and editing/modeling/development programs are mostly what use more than 4).
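Whether extra cores help at all comes down to how much of the engine's work can actually run in parallel. As a rough illustration (not a measurement of any real game), Amdahl's law puts an upper bound on the speedup; the 60% parallel fraction below is an assumed, illustrative number:

```python
# Amdahl's law: upper bound on speedup from n cores when a fraction p
# of the workload is parallelizable. The p value used below is an
# illustrative assumption, not a measurement of any real game engine.
def amdahl_speedup(p: float, n: int) -> float:
    """Maximum speedup with n cores if fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# Suppose a game engine is 60% parallelizable:
print(round(amdahl_speedup(0.6, 4), 3))  # 1.818 -> 4 cores give ~1.8x
print(round(amdahl_speedup(0.6, 8), 3))  # 2.105 -> doubling to 8 adds only ~16% more
```

This is why "more cores" stops paying off once the serial part of the frame dominates, regardless of which vendor made the chip.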
February 1, 2013 4:23:18 AM

Uh, you have a lot to learn about this generation of CPUs. Here's proof the 8350 is better for gaming than an i5:
http://www.youtube.com/watch?v=4et7kDGSRfc
Civilization uses 6 cores, a lot of games use more than 4 cores, and newer games coming out are going to use more than 4. Multithreaded is the best to get right now and for the future.
February 1, 2013 4:40:18 AM

I love how Tom's Hardware articles and benchmarks go completely against what that YouTube video shows. (Well, from both sources there are one or two tests that stick out, but about 90% of the video claims AMD wins, while about 90% of Tom's Hardware's results claim Intel wins.)
February 1, 2013 5:05:35 AM

In the real world the difference is 1-10 fps, and both are easily achieving over 60 fps. So yeah... just go for the cheaper option. Also, some games do use more than 4 cores.

Here is Far Cry 3 on my 6-core AMD:

http://imgur.com/N3gCEmV

I can also take some screenshots of Battlefield 3 and Skyrim CPU core usage if you like.
February 1, 2013 5:10:47 AM

Oh, 30% usage. Great. Anyway, I could bring up things like how AMD loses by a lot in single-threaded workloads and uses way more power overall, but you can't argue with price. Although with some of these the difference is $20 (or $120).
February 1, 2013 5:14:45 AM

Yeah, I know about power and I know about the single-core performance. However, power is a trivial matter; some care, some don't. I personally couldn't care less about it.
Also, games use more than 2 cores, so single-core performance doesn't really matter in gaming. In the end, both CPUs will get you a very nice gaming experience, and IMO you can't go wrong with either. And yes, the difference between an 8350 and an i5 3570K is, I believe, only $20; however, the difference between a 6300 and an i5 3570K is $100, and the 6300 will perform the same as the 8350 in gaming, since it's the same CPU with one module not active.

I personally wouldn't pay a $100 premium for 1-10 fps when I'm getting over 60 fps. This is just me though; everyone is different.
February 1, 2013 5:16:54 AM

the905 said:
Uh, you have a lot to learn about this generation of CPUs. Here's proof the 8350 is better for gaming than an i5:
http://www.youtube.com/watch?v=4et7kDGSRfc
Civilization uses 6 cores, a lot of games use more than 4 cores, and newer games coming out are going to use more than 4. Multithreaded is the best to get right now and for the future.


You quote one youtube video that's been bouncing around for a while... and it's your only "proof."

I could find you other youtube videos that do the exact same thing that "prove" that the i5 is better.

Just saying... blind fanboyism isn't a way to convince someone, especially not with your spelling and grammar the way they are.
February 1, 2013 5:45:25 AM

stantheman123 said:
Yeah, I know about power and I know about the single-core performance. However, power is a trivial matter; some care, some don't. I personally couldn't care less about it.
Also, games use more than 2 cores, so single-core performance doesn't really matter in gaming. In the end, both CPUs will get you a very nice gaming experience, and IMO you can't go wrong with either. And yes, the difference between an 8350 and an i5 3570K is, I believe, only $20; however, the difference between a 6300 and an i5 3570K is $100, and the 6300 will perform the same as the 8350 in gaming, since it's the same CPU with one module not active.

I personally wouldn't pay a $100 premium for 1-10 fps when I'm getting over 60 fps. This is just me though; everyone is different.

It's two fewer cores, lower clocks, and less cache. I think the difference between a 3570K and an FX-6300 would be more than 10 fps (especially when overclocking), but yes, $100 is a lot to pay. Still, at higher resolutions, and taking into account performance years from now, it doesn't seem so bad.
February 1, 2013 5:56:20 AM

Where's jaguar when ya need him? :p 
February 1, 2013 6:15:46 AM

zycuda said:
It's two fewer cores, lower clocks, and less cache. I think the difference between a 3570K and an FX-6300 would be more than 10 fps (especially when overclocking), but yes, $100 is a lot to pay. Still, at higher resolutions, and taking into account performance years from now, it doesn't seem so bad.



In years? I upgrade very often. I don't keep a CPU for years; I'm an enthusiast. And the higher the resolution, the less CPU-intensive the game becomes.

http://media.bestofmicro.com/Y/S/357652/original/skyrim...

http://media.bestofmicro.com/Z/3/357663/original/world%...

http://media.bestofmicro.com/Z/4/357664/original/battle...

those are all 2560x1440

here is 1920x1080

http://media.bestofmicro.com/X/V/357619/original/battle...

http://media.bestofmicro.com/Z/3/357663/original/world%...

http://media.bestofmicro.com/Y/R/357651/original/skyrim...

http://images.anandtech.com/graphs/graph6396/51124.png

http://images.anandtech.com/graphs/graph6396/51138.png

http://images.anandtech.com/graphs/graph6396/51139.png

http://images.anandtech.com/graphs/graph6396/51140.png

The biggest gap is in Skyrim, and even there the FX still achieves over 60 fps.

February 1, 2013 6:20:36 AM

zycuda said:
Oh, 30% usage. Great. Anyway, I could bring up things like how AMD loses by a lot in single-threaded workloads and uses way more power overall, but you can't argue with price. Although with some of these the difference is $20 (or $120).


AMD doesn't use WAY more power, not even close. Running an 8350 as opposed to, say, a 3570K will cost you $7 or $8 more a year. The i5 is $40 more expensive, so in 3 years the 8350 will have cost you $24 extra, and that's still cheaper than the i5. You really don't know what you're talking about; they even explained this in the video. But that's too much for you to handle; you don't like the truth, do you. What an Intel fanboy.
February 1, 2013 6:27:34 AM

Also, the i5 would probably cost about $5 a year to run, so maybe in about 6 or 7 years you'll save money on an i5... yeah, just get the AMD 8350. Killer deal.
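For what it's worth, the break-even arithmetic these two posts are arguing about is easy to check. Every dollar figure below is one of the posters' own rough claims (a ~$40 price gap and a $2-3/year difference in electricity), not measured data:

```python
# Sketch of the price-vs-power break-even argument above.
# All figures are the posters' rough claims, not measurements.
price_gap = 40.0          # claimed i5-3570K premium over the FX-8350, in dollars
fx_power_per_year = 7.5   # claimed yearly extra electricity cost of the FX-8350
i5_power_per_year = 5.0   # claimed yearly electricity cost of the i5

extra_per_year = fx_power_per_year - i5_power_per_year  # 2.5 dollars/year

# Years until the i5's lower running cost pays back its higher purchase price:
break_even_years = price_gap / extra_per_year
print(break_even_years)  # 16.0
```

On those claimed numbers, the payback period works out to around 16 years, so the purchase-price gap dominates the electricity cost either way.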
February 1, 2013 6:31:28 AM

stantheman123 said:
No, it's about 10 fps. I swapped out mobo+CPU with my friend for a couple of days; he had a Core i7 2600K, and games were about 5-10 fps higher.

If you Google FX-6300 benchmarks you can see that it's a 20+ fps difference... look at TechRadar's and AnandTech's benchmarks (not saying you didn't get 5, it just doesn't seem likely).
February 1, 2013 6:35:52 AM

zycuda said:
If you Google FX-6300 benchmarks you can see that it's a 20+ fps difference... look at TechRadar's and AnandTech's benchmarks (not saying you didn't get 5, it just doesn't seem likely).


Those benchmarks are at very low resolutions that I don't play at. I play at 1920x1080. The only game that showed maybe a 20 fps advantage was Skyrim, and my 6300 was still getting 60+ fps all the time anyway. I don't really like benchmarks; I'm more of a real-world performance guy. The difference at 1920x1080 in 95% of games is around 10 fps. Only in games like Skyrim and StarCraft do the Intels shine, and even there the FX still gets over 60 fps, so yeah.
February 1, 2013 6:36:59 AM

the905 said:
AMD doesn't use WAY more power, not even close. Running an 8350 as opposed to, say, a 3570K will cost you $7 or $8 more a year. The i5 is $40 more expensive, so in 3 years the 8350 will have cost you $24 extra, and that's still cheaper than the i5. You really don't know what you're talking about; they even explained this in the video. But that's too much for you to handle; you don't like the truth, do you. What an Intel fanboy.

He was talking about good multi-core usage, yet they were only using 30%. Oh, and go ahead and ignore all the facts and links I've posted and go off on power usage, why don't you.
February 1, 2013 6:47:37 AM


The FX-8350, when we were comparing the 6300? If you want to compare the $20 difference between the 8350 and the 3570K, the 3570K wins there too.
February 1, 2013 6:54:19 AM

zycuda said:
He was talking about good multi-core usage, yet they were only using 30%. Oh, and go ahead and ignore all the facts and links I've posted and go off on power usage, why don't you.




http://i.imgur.com/9ZP51AW.jpg

You're looking at the wrong thing; look at the lines in the black box. The overall CPU usage went down because I alt-tabbed out of the game.

The AMD 6300 performs the same in gaming as the 8350; the 8350 just has more cores and might get a bit more fps due to its higher turbo.

Nvm

Anyway, I'm done talking here if you don't understand the basics.
February 1, 2013 7:02:00 AM

stantheman123 said:
http://i.imgur.com/9ZP51AW.jpg

You're looking at the wrong thing; look at the lines in the black box. The overall CPU usage went down because I alt-tabbed out of the game.

The AMD 6300 performs the same in gaming..

Anyway, I'm done talking here if you don't understand the basics.

Oh boy, up to 50%. Look at gaming tests; the 6300 doesn't perform the same either (maybe you don't get the basics of a cut-down CPU performing worse). And then what? Run away...
February 1, 2013 7:04:47 AM

zycuda said:
Oh boy, up to 50%. Look at gaming tests; the 6300 doesn't perform the same either (maybe you don't get the basics of a cut-down CPU performing worse). And then what? Run away...


What gaming tests? Like I said, I've used an i7 2600K and an AMD 6300; I've used both. Have you? I know the difference; I've felt the difference. The difference is around 10 fps, and I don't care about those 800x600 benchmarks.

I never said AMD was faster; I just think AMD's CPUs are good for gaming at a lower price point. That's all! See ya.
February 1, 2013 7:14:46 AM

stantheman123 said:
What gaming tests? Like I said, I've used an i7 2600K and an AMD 6300. Have you? The difference is around 10 fps, and I don't care about those 800x600 benchmarks. See ya, apprentice. One day you will learn, don't worry :-)

Those 1680x1050 benchmarks, you mean?
February 1, 2013 7:24:36 AM

stantheman123 said:
What gaming tests? Like I said, I've used an i7 2600K and an AMD 6300; I've used both. Have you? I know the difference; I've felt the difference. The difference is around 10 fps, and I don't care about those 800x600 benchmarks.

I never said AMD was faster; I just think AMD's CPUs are good for gaming at a lower price point. That's all! See ya.

I didn't say Intels were cheaper; my point is they are a bit better for a bit more ($20 over an 8350). And there's more than just gaming fps at stock that makes a 3570K worth the price over an FX-6300: more overclockability, less power, better single-core performance, PCIe 3.0.
February 1, 2013 7:26:05 AM

Are these benchmarks (pictures) with the CPUs both overclocked or at stock...?

EDIT: The AnandTech ones; I know Tom's posts clock speeds. I have a funny feeling the 3570K > 8350 when both are overclocked.
February 1, 2013 7:28:20 AM

zycuda said:
I didn't say Intels were cheaper; my point is they are a bit better for a bit more ($20 over an 8350). And there's more than just gaming fps at stock that makes a 3570K worth the price over an FX-6300: more overclockability, less power, better single-core performance, PCIe 3.0.


You have your opinion; I have mine. Let's just agree on that.

I personally don't think $100 is worth it to go from a 6300 to an i5 3570K. You think it is.

To each their own.

Good day.
February 1, 2013 7:42:42 AM

stantheman123 said:
You have your opinion; I have mine. Let's just agree on that.

I personally don't think $100 is worth it to go from a 6300 to an i5 3570K. You think it is.

To each their own.

Good day.

Yes, to each their own. One last thing though: you said you upgrade very often (I'm going to guess $400-700 a year spent; correct me if I'm wrong), but I would think an extra $100 isn't that much if you upgrade a lot already. That's how I look at it, anyway. You already bought your 6300, so I'll say it again: to each their own, and good day likewise.
February 1, 2013 7:53:53 AM

zycuda said:
Yes, to each their own. One last thing though: you said you upgrade very often (I'm going to guess $400-700 a year spent; correct me if I'm wrong), but I would think an extra $100 isn't that much if you upgrade a lot already. That's how I look at it, anyway. You already bought your 6300, so I'll say it again: to each their own, and good day likewise.


Well, I upgrade every time a new GPU/CPU comes out. I could have easily afforded a 3930K if I wanted to. However, I also felt like trying/supporting AMD this round. Next I'll get Steamroller; after Steamroller, it depends, maybe Intel again or AMD, depending on how I feel. I do see your point: why not spend an extra $100? Well, first, I didn't want to buy a dead motherboard with no upgrade path; second, I actually felt like trying AMD myself. If Haswell were out and were, say, 20-30% better than Piledriver, I would have spent the extra money because I'd deem it worthy. Anyway, to each their own.

See ya, mate.
February 1, 2013 8:01:26 AM

I imagine the OP checking this thread tomorrow, amazed at the number of replies.
February 1, 2013 5:22:42 PM

the905 said:
Uh, you have a lot to learn about this generation of CPUs. Here's proof the 8350 is better for gaming than an i5:
http://www.youtube.com/watch?v=4et7kDGSRfc
Civilization uses 6 cores, a lot of games use more than 4 cores, and newer games coming out are going to use more than 4. Multithreaded is the best to get right now and for the future.



I didn't look at the video so I'll take your word for it.

Click the following link, which compares the 8-core FX-8150 @ 3.6GHz vs. the dual-core Intel Core i3-2100 @ 3.3GHz.

http://www.anandtech.com/bench/Product/434?vs=677

Yes... the FX-8150 is better than the i3-2100... the extra 6 cores and 300MHz higher clock speed allow it to defeat the i3-2100 by 2.4 frames per second.

I would have liked to compare the more current FX-8350 and i3-3220 CPUs, but there are no CIV 5 benchmarks.

MOAR is better!!!!!
February 2, 2013 5:04:39 AM

zycuda said:
He was talking about good multi-core usage, yet they were only using 30%. Oh, and go ahead and ignore all the facts and links I've posted and go off on power usage, why don't you.


That didn't make any sense.