
AMD and Intel CPU Research Questions

September 17, 2012 5:00:03 PM

I am saving $2000 to build my first gaming rig sometime between January and June of next year. I will use it mainly for gaming, but I will also use it to listen to FLAC files. I am not brand specific, because I am more focused on reliability. I have heard a lot of AMD vs. Intel debates and am trying to avoid that in this thread. My previous desktop PC platforms for gaming used Intel's Pentium II, Pentium III, and Pentium 4. My previous Dell XPS laptop used the Intel Core 2 Duo, and my current HP Pavilion laptop uses the Intel Core i7-2670QM. I have looked at both the AMD FX-8150 (3.60 GHz with a Turbo Boost to 4.20 GHz) and the Intel Core i7-3770K (3.50 GHz with a Max Turbo of 3.90 GHz). Most people are saying Intel is the better choice, but from my current perspective the FX-8150 seems to be more powerful than the i7-3770K. Can someone explain to me what factors are involved in choosing either an AMD or Intel gaming CPU? How does one perform better than the other?
September 17, 2012 5:24:10 PM

Just read this, and read the updated one when you are actually buying the PC: http://www.tomshardware.co.uk/gaming-cpu-review-overclo...
The AMD FX series CPUs are just not good for gaming (and all the others are dated). Also note that the i7s are no better than i5s for gaming (except the six-core i7s, which cost a fortune for a small performance increase in some games).
September 17, 2012 6:00:02 PM

I'm not sure where you saw that the 8150 is better, but it gets beaten by the 3770K in everything except Photoshop. You can gauge performance by benchmarks; nothing to explain but to show real-world performance. http://www.tomshardware.com/reviews/ivy-bridge-benchmar...
September 17, 2012 7:52:02 PM

Since newer chips will be out by then, it's best to decide then. Given your requirements, there isn't any reason to get the AMD CPU, since you'd mostly be gaming. Intel's CPUs are currently much better for gaming than AMD's. The AMD chips may be cheaper and slightly faster in some other benchmarks, but for gaming it's not worth it.
September 18, 2012 12:44:33 AM

By then Intel's Haswell CPUs should be out. Not sure what the performance increase will be vs. Ivy Bridge, though. However, Ivy Bridge is already more powerful than the Phenom II / FX and the upcoming Piledriver CPUs, unless Piledriver somehow increases performance by 29% beyond Phenom II / FX to equal the Ivy Bridge CPUs.

I suppose the main objective of Haswell is to decrease power consumption further, but I'm sure there will be at least a small performance increase (5%+ ??) over Ivy Bridge. More details about performance increase should come out by Q1 2013.

In terms of overall reliability, it is hard to say which is more reliable... and buying budget components can mean lower reliability, but buying expensive components does not guarantee the components will last forever. It's possible a $300 motherboard will fail in 1 year while a $75 motherboard may last you 5 years or more.

My oldest AMD CPU is the Athlon XP-M 2600 which I used to build a home theater PC back in 2003. It still works, but I will be throwing it out before the end of this month. My oldest Intel CPU is a Pentium M 1.5GHz in my IBM ThinkPad notebook which I bought back in 2003. It still works.

Generally speaking, the more heat a PC generates, the less reliable it can be. AMD CPUs consume a lot of power and can generate a lot of heat. Intel CPUs, on the other hand, consume less power and generally produce less heat. The exception is Ivy Bridge, since the thermal interface within the CPU is actual paste instead of a solder-like (metallic) substance. Heat is not really a problem with an Ivy Bridge CPU unless you decide to overclock.

Not sure how much power AMD's upcoming Piledriver CPUs will consume, but hopefully they can lower it. Ivy Bridge CPUs are generally rated at 77W TDP, while AMD CPUs are 125W or 140W TDP (depending on exactly which one you are talking about). So AMD CPUs can be rather power hungry and less powerful than their Intel counterparts. At least they cost less.
September 18, 2012 11:45:43 AM

k1114 said:
I'm not sure where you saw that the 8150 is better, but it gets beaten by the 3770K in everything except Photoshop. You can gauge performance by benchmarks; nothing to explain but to show real-world performance. http://www.tomshardware.com/reviews/ivy-bridge-benchmar...


Errr, no. Benchmarks are what Intel has used for years to create the impression that they are better... how do you think you get 80% market share? It's funny how synthetics never correlate with real-life results, and to say the least it is not very close at all. Synthetics do, however, accurately reflect Intel CPUs in numbers; they're basically written for Intel.

jaguarskx said:
Unless Piledriver somehow increases performance by 29% beyond Phenom II / FX to equal the Ivy Bridge CPUs.


Curious as to how that 29% number gets thrown around, almost as curious as those suggesting that Piledriver will only level Bloomfield. Thubans basically did that, and overall Zambezi (FX 8XXX) are stronger; again, benchmarks don't correlate well with AMD real-world performance. Some changes, like an integrated IMC that is way faster than that of Phenom II, yet the benches don't correlate; they in fact go backwards. Synthetics are, as the word implies, artificial, manufactured. Suffice to say no Intel chip is 30% faster.
September 18, 2012 5:01:02 PM

I never look at synthetics. Game benchmarks for game performance. How can this be "used to create the impression that they are better" if it is the game itself? The same goes for content creation/productivity benchmarks; they are the actual program showing real-world performance.
September 18, 2012 10:38:38 PM

k1114 said:
I never look at synthetics. Game benchmarks for game performance. How can this be "used to create the impression that they are better" if it is the game itself? The same goes for content creation/productivity benchmarks; they are the actual program showing real-world performance.


It's just an excuse for AMD fanboys to try to rationalize and justify buying inferior hardware. Next he'll be spouting off the ridiculous "Intel pays everyone off" conspiracy garbage.

To the OP: as everyone said, for gaming Intel is far above AMD in terms of sheer CPU power and performance. Benchmarks show that Bulldozer falls behind Intel CPUs in all but the most heavily threaded programs.
September 19, 2012 6:10:47 AM

The naivety is strong in this one. Maybe synthetics were written for the purpose of making the gullible ones feel justified in spending over the top to get similar performance. Which sadly has worked a charm in this case.
September 19, 2012 6:13:07 AM

If you have $2K, go with Intel.
September 19, 2012 10:01:55 AM

An i5 in a topped-out system, for $2K. Easy.
September 19, 2012 10:39:15 AM

k1114 said:
I never look at synthetics. Game benchmarks for game performance. How can this be "used to create the impression that they are better" if it is the game itself? The same goes for content creation/productivity benchmarks; they are the actual program showing real-world performance.

And what if Intel helped to fund development of said games?

for $2000, how much are you putting into graphics?

http://www.tweaktown.com/articles/4438/core_i7_3960x_wi...

would be nice if they included the I5 there.
September 19, 2012 10:46:24 AM

noob2222 said:
And what if Intel helped to fund development of said games?

for $2000, how much are you putting into graphics?

http://www.tweaktown.com/articles/4438/core_i7_3960x_wi...



It is very true that AMD, Nvidia, and Intel have game studio partners (in the case of Nvidia and Intel, many more), and it also makes perfect sense for those partners to make games run more efficiently with the respective setups. But this is a fact overlooked by pure denialism, or forum trollism.

It is also not a secret that Intel started benchmarketing; it's the easiest way to ensure you maintain overall majority market share. How subtle, then, to make your competitor's products appear worse than the older generation.

Using Thubans, FX, and SB every day, I can tell you that an FX chip is far more responsive than a Thuban; the IMC is significantly faster, and BF3 gives me around 10 FPS more. Odd, yes, considering the lunacy in the benchmarks submitted.
September 19, 2012 4:28:53 PM

sarinaide said:
The naivety is strong in this one. Maybe synthetics were written for the purpose of making the gullible ones feel justified in spending over the top to get similar performance. Which sadly has worked a charm in this case.


Oh, you mean kind of like how you're foolish enough to buy into AMD's 8-core scam.
September 19, 2012 5:27:59 PM

http://techreport.com/review/23246/inside-the-second-ga...

Quote:
As you probably expected, the Ivy Bridge-derived processors are near the top in overall gaming performance. Intel has made incremental improvements over the Sandy Bridge equivalents in each price range, from the i5-2400 to the i5-2500K and i7-2600K. The Core i5-3470 offers perhaps the best combination of price and performance on the plot, and the Core i5-3570K offers a little more speed for a bit more money. The value curve turns harsh from there, though. The i7-3770K doesn't offer much of an improvement over the 3570K, yet it costs over a hundred bucks more. The Core i7-3960X offers another minuscule gain over the 3770K, but the premium to get there is over $500.

Ivy Bridge moves the ball forward, but Intel made even more performance progress in the transition from the prior-generation Lynnfield 45-nm processors—such as the Core i5-760 and i7-875K—to the 32-nm Sandy Bridge chips. From Sandy to Ivy, some of the potential speed benefits of the die shrink were absorbed by the reduction of the desktop processor power envelope from 95W to 77W.

Sadly, with Bulldozer, AMD has moved in the opposite direction. The Phenom II X4 980, with four "Stars" cores at 3.7GHz, remains AMD's best gaming processor to date. The FX-8150 is slower than the Phenom II X6 1100T, and the FX-6200 trails the X4 980 by a pretty wide margin. Only the FX-4170 represents an improvement from one generation to the next, and it costs more than the Phenom II X4 850 that it outperforms. Meanwhile, all of the FX processors remain 125W parts.

We don't like pointing out AMD's struggles any more than many of you like reading about them. It's worth reiterating here that the FX processors aren't hopeless for gaming—they just perform similarly to mid-range Intel processors from two generations ago. If you want competence, they may suffice, but if you desire glassy smooth frame delivery, you'd best look elsewhere. Our sense is that AMD desperately needs to improve its per-thread performance—through IPC gains, higher clock speeds, or both—before they'll have a truly desirable CPU to offer PC gamers.


So that's where the "2 generations ago" comes from..
September 19, 2012 9:48:58 PM

^ by only looking at a worst-case scenario and ignoring anything else.
September 19, 2012 11:12:05 PM

^ It's those "worst-case" lags that cause noticeable stuttering in the games tested. The human eye (particularly the rods responsible for black&white peripheral vision) is sensitive to movement, so excessive stuttering makes for p-p--po-po-poor game play, excuse my stutter :p ..

BTW Scott Wasson has a good tech reputation across the web..
September 20, 2012 1:15:48 AM

sarinaide said:

Curious as to how that 29% number gets thrown around, almost as curious as those suggesting that Piledriver will only level Bloomfield. Thubans basically did that, and overall Zambezi (FX 8XXX) are stronger; again, benchmarks don't correlate well with AMD real-world performance. Some changes, like an integrated IMC that is way faster than that of Phenom II, yet the benches don't correlate; they in fact go backwards. Synthetics are, as the word implies, artificial, manufactured. Suffice to say no Intel chip is 30% faster.


Generally speaking, Phenom II and FX are more or less the same. The FX can perform a little better in some specific benchmarks like Photoshop, video encoding and 3D rendering. However, Phenom II is generally a little better in games than the FX. Phenom II more or less performs just as well as Intel's Core 2 Duo/Quad CPUs. Therefore, the FX is also in the same boat as the Core 2 Duo/Quad. Again, there are some benchmarks that the FX will outperform Intel's older CPUs, but the overall average performance is nearly the same.

Intel's Clarkdale/Nehalem CPU cores (1st gen Core i3/i5/i7 CPUs) are on average 10% faster than the Core 2 family. That means 10% faster than Phenom II and FX.

Intel Sandy Bridge CPUs are on average 12% faster than Clarkdale/Nehalem.

Intel Ivy Bridge CPUs are on average 5% faster than Sandy Bridge.

Phenom II = FX = Core 2 Family = 100%

Clarkdale/Nehalem = 10% faster than Core 2 Family = 100% * 1.1 = 110%

Sandy Bridge = 12% faster than Clarkdale/Nehalem = 110% * 1.12 = 123.2%

Ivy Bridge = 5% faster than Sandy Bridge = 123.2 *1.05 = 129.36%

Ivy Bridge = 129.36% of Phenom II / FX / Core 2 Duo/Quad performance, i.e. roughly 29% faster
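For what it's worth, the compounding arithmetic above checks out; a quick sketch (the per-generation uplift figures are the poster's rough averages, not measured data):

```python
# Compound the poster's rough per-generation uplifts on top of a
# Phenom II / FX / Core 2 baseline set to 100%.
def compound(base, uplifts):
    for u in uplifts:
        base *= 1 + u  # each generation multiplies, not adds
    return base

# Nehalem +10%, Sandy Bridge +12%, Ivy Bridge +5%
ivy = compound(100.0, [0.10, 0.12, 0.05])
print(round(ivy, 2))  # → 129.36, i.e. about 29% faster than the baseline
```

Note the result is 129.36% of the baseline's performance, a ~29% uplift; compounding makes it slightly more than the 10 + 12 + 5 = 27% you'd get by simply adding.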
September 20, 2012 6:23:38 AM

I helped out a friend who had issues with his i7 930 build, so I donated a Crosshair V and 1100T to him; the synthetics beat his overclocked 930... I really don't know where you get Phenom II and FX at Core 2 level, but anyway I am not really keen to find out either.

As for stutter spikes, the only two causes I had were a) ping and b) GTX 560 Ti SLI, which stuttered a lot. Skyrim has a 60Hz lock; you need to mod the game to break that lock, and most AMD processors with the right graphics card hit the 60Hz barrier, so it's pointless.

2500K with GTX 450 vs. 2500K with 7970... if CPU dependence were the factor, then the results would be the same; the fact is, games are GPU dependent.
September 20, 2012 9:20:18 PM

sarinaide said:
As for stutter spikes, the only two causes I had were a) ping and b) GTX 560 Ti SLI, which stuttered a lot. Skyrim has a 60Hz lock; you need to mod the game to break that lock, and most AMD processors with the right graphics card hit the 60Hz barrier, so it's pointless.


And to think just a few years ago, AMD fans touted how much "smoother" K8 was than Conroe while gaming :p .. Now that the shoe is on the other foot, suddenly it's "pointless".

Quote:
2500K with GTX 450 vs. 2500K with 7970... if CPU dependence were the factor, then the results would be the same; the fact is, games are GPU dependent.


On the first page of the article it says they used the exact same setup including identical GPUs for each CPU tested, and no the results are nowhere near the same.

BTW, the same article shows similar results when testing Batman: Arkham City and BF3, so no it's not just some peculiarity of Skyrim's internal VSync-type feature..
September 20, 2012 9:37:03 PM

I guess it's whatever game you choose: Intel has the advantage in BF3, Crysis 2, and Batman, but AMD chips do very well in Metro 2033, Mafia II, Dragon Age, and a few others. Which one to believe?

September 20, 2012 10:02:10 PM

jaguarskx said:
Generally speaking, Phenom II and FX are more or less the same. The FX can perform a little better in some specific benchmarks like Photoshop, video encoding and 3D rendering. However, Phenom II is generally a little better in games than the FX. Phenom II more or less performs just as well as Intel's Core 2 Duo/Quad CPUs. Therefore, the FX is also in the same boat as the Core 2 Duo/Quad. Again, there are some benchmarks that the FX will outperform Intel's older CPUs, but the overall average performance is nearly the same.

Intel's Clarkdale/Nehalem CPU cores (1st gen Core i3/i5/i7 CPUs) are on average 10% faster than the Core 2 family. That means 10% faster than Phenom II and FX.

Intel Sandy Bridge CPUs are on average 12% faster than Clarkdale/Nehalem.

Intel Ivy Bridge CPUs are on average 5% faster than Sandy Bridge.

Phenom II = FX = Core 2 Family = 100%

Clarkdale/Nehalem = 10% faster than Core 2 Family = 100% * 1.1 = 110%

Sandy Bridge = 12% faster than Clarkdale/Nehalem = 110% * 1.12 = 123.2%

Ivy Bridge = 5% faster than Sandy Bridge = 123.2 *1.05 = 129.36%

Ivy Bridge = 129.36% of Phenom II / FX / Core 2 Duo/Quad performance, i.e. roughly 29% faster

Your numbers are a bit off. http://ixbtlabs.com/articles3/cpu/intel-ci7-123gen-p3.h...

Ivy is only 12% over Nehalem (10% on the high end with HT).

The problem is that as software gets updated, it's not retested with the old hardware, because 1) it takes too long, or 2) they got rid of it already. Sure, at its release Nehalem was ~10% over Core 2, but what about after the software is updated on a Core 2 Quad?

This is where synthetic tests come in: they don't vary with updates, because there isn't much to tweak, which is also the reason they don't reflect real application/game performance. There are no optimizations to change over time.
September 20, 2012 11:53:48 PM

sarinaide said:
I guess it's whatever game you choose: Intel has the advantage in BF3, Crysis 2, and Batman, but AMD chips do very well in Metro 2033, Mafia II, Dragon Age, and a few others. Which one to believe?


You mean fps or stuttering? AFAIK Scott Wasson's article is the only one investigating stuttering vs. CPU choice. While gameplay is good with 60+ fps average, smoothness is also a consideration, as lags and stutter can be distracting. Hopefully he'll have a later article with more CPUs and games tested..

From AT:



September 21, 2012 6:02:31 AM

After running enough test runs on the 8150: at 3.6GHz the IPC penalties are quite severe, compared to 4GHz where, in Cinebench for example, it jumps over a full point above an i5 and sits just behind an i7. Considering the highest-end Zambezis were intended to be 2 billion transistors deep, running at 4GHz with up to 4.5GHz on boost and with far-reaching overclocking potential, I will do the Nvidia thing and call Zambezi a Half Bulldozer; when you look at how badly the specs were skimped down because GF hadn't perfected the 32nm process, it is very much a plan B.

So if we take an FX 8150 at 4GHz and it gives results between an i5 and an i7, it is not half as bad as made out to be, but it is very far from what it was intended to be, versus the Thubans it directly replaced.

1] IMC: sub-10s in MaxxMem, admittedly with only 4GB of DDR3-1333 (waiting to receive AMD-branded modules to test soon; this was all I had on me). That in itself is very impressive, considering a 4GHz 1100T does about 22s.

2] IPC: considering that a higher clock speed would have mitigated the loss of IPC (compare 3.6 to 4GHz: over a point more in Cinebench and 200 marks more in 3DMark 11), I will honestly say nothing was gained or lost. The 1100T still remains a fantastic processor in light of all this; running my 1100T against my 8150, it seems to be the same thing on aggregate.

3] Power control and efficiency: a Thuban needs 1.4+V to hold stable at 4GHz and 1.280V to hold stable at stock. The FX is far better on voltage stability; while it loses control at high overclocks, the FX is still, day to day, the more efficient chip in that regard. FX running 4.4GHz at 1.315V stable; a 4GHz 1100T needs 1.425V.

Fast forward to Piledriver: the intended clock speeds achieved, the resonant clock mesh as intended, bolted-on instruction sets, improvements to the front end and memory controller (notably fixes to latency, which should help micro-stutters), but still well down on transistor counts.

Overall, Zambezi is not as bad as made out, but not as good as intended... and most definitely far from the 20/30/50% numbers I have heard.

I have only encountered stuttering at extremely high graphics settings with the fastest GPUs on the market, and with multi-GPU setups as well; it starts to get noticeable at GTX 670 / HD 7950 level.
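On the voltage/efficiency point, the usual rule of thumb is that CMOS dynamic power scales roughly with V-squared times frequency, which is why Vcore matters more than the clock bump itself. A rough illustrative sketch (the formula is the standard approximation; the voltages are the Thuban figures quoted above, held at the same frequency to isolate the voltage effect):

```python
# Dynamic CMOS power scales roughly as P ∝ C * V^2 * f; with capacitance C
# folded into the ratio, we can compare two voltage/frequency operating points.
def relative_power(v, f, v_base, f_base):
    return (v / v_base) ** 2 * (f / f_base)

# Thuban-style example: 1.280 V stock Vcore vs. the 1.425 V quoted for 4 GHz,
# frequency held constant to show the voltage term alone.
print(round(relative_power(1.425, 4.0, 1.280, 4.0), 2))  # → 1.24
```

So the voltage increase alone costs roughly 24% more dynamic power, before the frequency increase of an overclock is even counted.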
September 21, 2012 4:28:14 PM

^ Of course, SB and IB can be OC'd a bit as well. But the FX-8150 going from 3.6 to 4.0GHz is only about an 11% speed bump. If it shows a higher increase than that in multithreaded Cinebench, then it's probably due to the front end being better able to keep the pipes filled.

Just out of curiosity, what temps and Vcore does the 8150 run at 4GHz under Prime95 loads? We already know its power draw increases much more rapidly with OC frequency, at load and idle, than just about any other chip on the market, so that suggests a big spike in the Vcore needed to remain stable.
September 21, 2012 6:52:16 PM

Cranking the Vcore isn't what's needed to remain stable; it's just the easy way to fake the numbers.

I'm at 1.344V and running 4.7GHz. Stock turbo Vcore is 1.412V, so I'm undervolted and overclocked, but I have no way to test power draw.

You have to tweak the other voltages, not just crank the Vcore and post stupid numbers because you did a stupid overclock.
September 21, 2012 7:40:08 PM

fazers_on_stun said:
^ Of course, SB and IB can be OC'd a bit as well. But the FX-8150 going from 3.6 to 4.0GHz is only about an 11% speed bump. If it shows a higher increase than that in multithreaded Cinebench, then it's probably due to the front end being better able to keep the pipes filled.

Just out of curiosity, what temps and Vcore does the 8150 run at 4GHz under Prime95 loads? We already know its power draw increases much more rapidly with OC frequency, at load and idle, than just about any other chip on the market, so that suggests a big spike in the Vcore needed to remain stable.


I have been able to run Prime95 stable at around 1.315V (roughly -5%); with my cooling it's around 33° idle and mid-to-high 40s under full load. The initial BD blueprint was for all models to run at 4GHz with up to 4.5GHz Turbo Boost; with the 2 billion transistors intended in the FX 8XXX series, it should easily have been able to run stable overclocks beyond the 5GHz barrier. Ideally the rated benches would be at 4GHz; at those clocks the penalty from branch mispredicts is substantially less. Some would argue that is unfair, but it wouldn't be if that had been the intended stock speed. On an engineering level it would have been a feat on its own. I would hazard a guess that Piledriver is the full Zambezi, but way too late; then again, has GF perfected the 32nm process now?
September 21, 2012 8:04:06 PM

Lol, you were supposed to avoid this.
September 22, 2012 5:25:52 PM

^ Think the OP meant AMD vs. Intel flamewars. So far this thread has been informative and civilized, right?? :D 

Come to think of it, the OP seems to have disappeared.. :p 
September 25, 2012 4:35:28 AM

If I were you I wouldn't waste over half your budget on a CPU when the FX-8150 can do everything you want it to and more. Invest those savings into a GTX 690 graphics card and I can GUARANTEE you will play every game currently made, and every game made for at least another year, on max settings easily.

fx-8150
gtx 690
1000 watt psu
nice case
ssd
8 gigs ram
ASUS mobo
September 25, 2012 4:57:05 AM

xa376 said:
If I were you I wouldn't waste over half your budget on a CPU when the FX-8150 can do everything you want it to and more. Invest those savings into a GTX 690 graphics card and I can GUARANTEE you will play every game currently made, and every game made for at least another year, on max settings easily.

fx-8150
gtx 690
1000 watt psu
nice case
ssd
8 gigs ram
ASUS mobo


So wrong in so many ways. First of all, yes, the Bulldozer can run all games, but not very well. The Phenom II and pretty much all recent Intel CPUs will outperform the Bulldozer. Secondly, there is no need to get a GTX 690; it's a waste of 1000 dollars. A GTX 680 or 670 will do fine. Not to mention that at higher resolutions the Bulldozer will bottleneck higher-end GPUs. There is no need for a 1000-watt PSU either; if you really need a lot of power, a good-quality 750-800 watt PSU is fine. If the main purpose is gaming, there is no point in getting a crappy Bulldozer.
September 25, 2012 5:35:59 AM

xa376 said:
If I were you I wouldn't waste over half your budget on a CPU when the FX-8150 can do everything you want it to and more. Invest those savings into a GTX 690 graphics card and I can GUARANTEE you will play every game currently made, and every game made for at least another year, on max settings easily.

fx-8150
gtx 690
1000 watt psu
nice case
ssd
8 gigs ram
ASUS mobo

September 25, 2012 6:37:53 AM

More like

September 25, 2012 2:50:12 PM

fazers_on_stun said:
You mean fps or stuttering? AFAIK Scott Wasson's article is the only one investigating stuttering vs. CPU choice. While gameplay is good with 60+ fps average, smoothness is also a consideration, as lags and stutter can be distracting. Hopefully he'll have a later article with more CPUs and games tested..

From AT:

http://imageshack.us/a/img685/4306/atgamebench8150vs2500k.png


Exactly. I've suspected such a phenomenon for years, and it's about time someone bothered to look at frame latencies. It shows how IPC can have a significant impact on gameplay, even if FPS appears stable over a one-second timespan. That's where BD runs into problems.
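A minimal sketch of the point (the frame times here are made up for illustration): a run can average a healthy-looking FPS while the 99th-percentile frame time exposes the stutter that Tech Report's frame-latency method measures.

```python
# 55 smooth frames at 16.7 ms plus 5 spikes at 100 ms: roughly one second of
# rendering where the average hides what the worst frames feel like.
frame_times_ms = [16.7] * 55 + [100.0] * 5

avg_frame = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg_frame
worst_1pct = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]

print(round(avg_fps))  # → 42  (looks playable on an average-FPS chart)
print(worst_1pct)      # → 100.0 (each spike is a visible hitch)
```

An average of ~42 FPS would pass on a bar chart, yet every 100 ms frame is a stutter the eye notices, which is exactly why percentile frame times tell a different story than average FPS.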
September 25, 2012 3:06:02 PM

It reminds me of just how bad AnandTech's benches have become, bordering on lunacy.
October 7, 2012 4:18:32 PM

noob2222 said:
And what if Intel helped to fund development of said games?

for $2000, how much are you putting into graphics?

http://www.tweaktown.com/articles/4438/core_i7_3960x_wi...

would be nice if they included the I5 there.



Sorry for the slow response. I am looking at a video card that uses GDDR5 memory, with a cost between $200 and $400. However, I am told that it is not easy to compare graphics cards based on specs alone. I will look at the reviews for graphics cards when I get ready to order the parts.
October 7, 2012 4:33:43 PM

fazers_on_stun said:
^ Think the OP meant AMD vs. Intel flamewars. So far this thread has been informative and civilized, right?? :D 

Come to think of it, the OP seems to have disappeared.. :p 

I haven't disappeared. I have been busy continuing my research on other components along with other personal matters.
October 7, 2012 5:50:41 PM

^ Well Piledriver should be out in about 3 weeks, if you decide to go AMD. I would suggest the 8350 instead of the 8150, but I seriously doubt it'll be enough of an improvement to get up to i5-2500K or i7-3770K levels in gaming.

My specc'd rig in my sig cost $2400, but that includes a 27" monitor and a 2.1 AL speaker system and Soundblaster recon3D. Assuming you don't need those, actual price including a HAF 932 case and 1KW OCZ Platinum PS and LG Bluray burner and 128GB Corsair SSD and a 600GB Raptor HD and three 2-TB storage drives, plus the listed stuff, was about $1800.

I would have gotten a 680 video card instead of the HD 7970 but got tired of waiting on NV's 'paper launch'.
October 7, 2012 5:57:58 PM

fazers_on_stun said:
^ Well Piledriver should be out in about 3 weeks, if you decide to go AMD. I would suggest the 8350 instead of the 8150, but I seriously doubt it'll be enough of an improvement to get up to i5-2500K or i7-3770K levels in gaming.

My specc'd rig in my sig cost $2400, but that includes a 27" monitor and a 2.1 AL speaker system and Soundblaster recon3D. Assuming you don't need those, actual price including a HAF 932 case and 1KW OCZ Platinum PS and LG Bluray burner and 128GB Corsair SSD and a 600GB Raptor HD and three 2-TB storage drives, plus the listed stuff, was about $1800.

I would have gotten a 680 video card instead of the HD 7970 but got tired of waiting on NV's 'paper launch'.


Probably not. Even if Piledriver delivers the performance increase AMD is claiming, that would still only put it at the level of the first-generation Core i processors. That would still be pretty far behind Intel.
October 8, 2012 3:07:12 AM

Well, I am now leaning toward Intel. I am probably going to hold off until Haswell is released in March or April of next year, unless by some miracle AMD's FX-8350 outperforms the Intel i5-3570K.