One of three choices - upgrade - gaming

lorgot

Honorable
Nov 20, 2012
Hello to all

I posted before but wasn't quite clear, sorry about that. I'm upgrading my PC for gaming purposes. After reading some tips on what to post, here goes.

I'm upgrading because I'm having FPS issues. The games I've had problems with so far are FPSs like Battlefield 3, Medal of Honor Warfighter and Black Ops II. Strangely enough, games like StarCraft II and Dawn of War 2 run smoothly enough on high or very high graphics settings. BF3 didn't run smoothly even on low, and Black Ops II is set to medium-high but I only get around 30FPS average. My PC currently runs a Core 2 Duo E8400, a GeForce GTX 280 OC (I think it's an EVGA), 2GB of DDR2 800MHz, an Asus P5N-E SLI, a 250GB SATA II Samsung 7200RPM drive and a 750W PSU (can't remember the brand). So, considering the amount of RAM and the processors that games require today, I figured I had to upgrade both.

My GPU is quite strong, I think. I live in Brazil, so it's hard to translate budgets like $300 or $600 directly; we have lots of taxes and such, so everything is way more expensive than in the US or Europe. But let's say I have a $500 budget, and the best I could find within that budget at the most reliable and cheap store here were the three options listed below. I researched the CPU Benchmark website and the CPU Gaming Hierarchy article here on Tom's Hardware. The options below are motherboard and processor combos. The best memory I found was Corsair 4GB DDR3 1333MHz, of which I'm getting two sticks to run in dual channel, making it 8GB of RAM.

ASRock N68-S3 FX (AM3+) + FX-6100 Black Edition (Bulldozer)

Asus M4A77T/USB3 + Phenom II X4 965 Black Edition

Intel H61M-HVS + Intel Core i3-2120

Points to consider:
a) I don't know how to overclock, so the best option for me would be the best 'stock' processor.
b) I've heard people say that the i3 is better than the FX-6100. But doesn't the L3 cache factor into that? I saw that the i3 has 3MB and the FX has 8MB.
c) Considering that my strong point is the GTX 280, which one is best so it doesn't bottleneck the card?
d) I really can't exceed the budget or change the options, so it has to be one of those three.
e) I have a 120mm fan on the front and a smaller one on the back. Will I need more cooling? Currently, the PC doesn't have any heat problems.

p.s.: sorry for any bad grammar, English is not my native language
 

jjhuang42

Distinguished
Oct 6, 2011
Your GPU isn't strong enough for FPS games at higher resolutions/frame rates. I'd suggest you spend the money on the GPU and max out your RAM (4GB total for a Core 2 Duo, but check your motherboard manual first). Even if you upgrade your motherboard/CPU/RAM, you'll still be severely limited by your existing GTX 280, which is over four years old.
 

lorgot

Honorable
Nov 20, 2012


I don't want to call you a liar, but that's hard to believe. Comparing my card to a GTX 660 Ti, mine is almost as good. The 660 Ti has double the texture fill rate, but only because it has double the memory. My GTX 280 has 1GB of GDDR3 on a 512-bit bus. http://www.gpureview.com/show_cards.php?card1=567&card2=670

Anyway, even if I were to double my RAM and get a better GPU, it would blow past my budget. Like I said, I have the three options listed above. And I hardly think a better GPU wouldn't be bottlenecked while I'm staying with a Core 2 Duo E8400 and 2GB of DDR2 800MHz. Did I mention I'm on Win7 64-bit? Also, my resolution is 1440x900.
 

lorgot

Honorable
Nov 20, 2012
OK, so I got no answer. I ended up going with the following:
Mobo Asus P8B75-M LE
Memory Patriot Viper3 8GB (2x4GB) DDR3 1600MHz PV38G160C9K
Processor Core i3 3220 3.30GHz 3MB

Was it a terrible choice? What do you guys think?
 
He's right, your graphics card is the weak link.
It's four (nearly five) generations old; it will perform nowhere near current hardware. The rest of your system is fine.

As proof, here's a GTX 285 (if anything a bit better than your card) compared to an HD 6970. The 6970 wins by a fairly large margin, as expected.
http://www.anandtech.com/bench/Product/317?vs=292
Then compare the 6970 to the 660 Ti: they are equal in some respects, but the 660 Ti pulls ahead by a fair bit in most benchmarks.
http://www.anandtech.com/bench/Product/509?vs=647

Theoretical spec comparisons (like the one you showed) can only tell you so much. Actual benchmarks are what really tell you the performance.
 

lorgot

Honorable
Nov 20, 2012


Wow. But I must ask, then: if I get, say, a 660 Ti but stay with my current E8400 and a mere 2GB of DDR2 800MHz, won't those bottleneck the GPU?
 

lorgot

Honorable
Nov 20, 2012


Yeah. It's an old build. That's why I wanted to upgrade it. Can I assume that if I keep my GTX 280 but upgrade my Core 2 Duo E8400 to a Core i3 3220 and my 2GB of Corsair DDR2 800MHz to 8GB of Patriot DDR3 1600MHz, I will "unlock" the full potential of my GPU?

I know, the GTX 280 is old-school by now. But I believe it can pack a much better punch with a better processor and RAM to go with it. I don't plan on playing games at 1920x1200 or turning on all the anti-aliasing and such. I just want 1440x900 at medium-high settings with around 40fps. Which I think my GTX 280 should handle. Am I wrong? If I am, I'm scr*wd. I can't afford a whole new build. I can either grab a new GPU - say a 660 Ti - or new RAM, a new processor and a new mobo. Any ideas?
 

lorgot

Honorable
Nov 20, 2012
OK! So, I need one more bit of help.

I decided to upgrade the rest first. New build will be

Mobo Asus - P8B75-M LE
Patriot Viper3 8GB (2x4GB) DDR3 1600MHz
Core i3 3220 3.30GHz 3MB

It was the best I could afford. As for a new GPU, I'm torn between an HD 7850 2GB Core Edition and a GTX 660 2GB. Where I'm buying, the GTX 660 is pricier, so I would go with the 7850, but most benchmarks give the GTX 660 the upper hand. What do you guys think?
 

lorgot

Honorable
Nov 20, 2012
But it's not the 7870, it's the 7850. However, the price difference between the two is small, and in benchmarks the 660 takes the cup. So that makes the 660 an obvious choice...

... right? Or is it not that simple?

Also, about the new build I told you guys about: is everything cool, or did I make a huge mistake somewhere?
 

lorgot

Honorable
Nov 20, 2012


I'm from Brazil, so I'm buying from a website here. Unfortunately, if I buy from, say, Amazon or Newegg, with taxes my options would be way more limited. :(
 

lorgot

Honorable
Nov 20, 2012


Exactly! The thing that hurts the most is that if I could take my budget and buy directly from the USA, I could get an i5, a better mobo and so on. But that's the way it is. I'll see if I can grab the GTX 660; if not, I'll stick with the HD 7850. Thanks for the help, mate!

I must confess I was a bit worried about the Patriot RAM not being compatible with the mobo, but everything (frequency, slot type) checks out, so I don't think I'll have any trouble. What did worry me is that this RAM isn't on the supported memory list on the mobo's official website. Is that a problem? I couldn't find a single kit from that list on the website I'm buying from, so I had to go with the Patriot one.
 
RAM compatibility is pretty simple: if it's DDR3 and the mobo takes DDR3, it will work.

There is one little-known clause on Intel systems though: the RAM can't run above 1.5V. Going above 1.5V won't kill the CPU or make it outright refuse to work, but it puts more strain on the memory controller than it was meant to take, decreasing the lifespan of the CPU.
I just checked your RAM kit, you're fine.

The official supported RAM list doesn't really mean much and will only include kits that were around at the time of the board's release (and popular kits at that). If you can get a kit that's on the list, that's good, but it's not an issue if you can't.
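
If it helps, those two rules written out as a quick sketch (the helper and values here are just illustrative, plugging in the Viper 3 kit's standard 1.5V spec):

```python
# Rough sketch of the two rules above (hypothetical helper, not a real tool):
# the kit and the board must both be DDR3, and on Intel the kit should run at 1.5V or less.

def ram_kit_ok(kit_type, kit_voltage, board_type="DDR3", intel_platform=True):
    """Return True if the kit passes the two checks discussed above."""
    if kit_type != board_type:
        return False   # wrong memory generation, it won't even fit the slot
    if intel_platform and kit_voltage > 1.5:
        return False   # would run, but strains the CPU's memory controller over time
    return True

# The Patriot Viper 3 kit is a standard 1.5V DDR3 kit, so it checks out:
print(ram_kit_ok("DDR3", 1.5))   # -> True
```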

 

lorgot

Honorable
Nov 20, 2012


I thought so, but thanks for the confirmation! Actually, you took a huge weight off my shoulders; I was really worried about that. Well, I'll wait for the parts to get here and see how they fare. Thanks for all the help, Manofchalk! I'll post later how everything went.
 

lorgot

Honorable
Nov 20, 2012
Hello again!

I was just looking at some GPUs and saw something I didn't even know existed. The processor can only handle certain revisions of PCI Express? I was looking for a Radeon HD 7850 to go with my recently acquired upgrades - Core i3 3220 and P8B75-M LE - and read somewhere that the processor has to support PCI-E 3.0. So I went looking at the Core i3 specs and saw something about it only going up to PCI-E 2.0. Is that true? If it is, how can I get a better GPU if the processor won't... I don't know... run the thing?

Do you guys know of any GPUs with performance as good as an HD 7850 or GTX 660 that are PCI-E 2.0, or, I don't know, compatible with the Core i3 3220? The mobo says it has PCI-E 3.0, but the processor doesn't? I'm really confused right now.

Sorry, I'm a noob at this.
 
Right off the bat I will say that there is no real difference between PCI-e Gen 2 and Gen 3; even the strongest graphics cards on the market can run in a Gen 2 x8 slot just fine.

If you want PCI-e 3.0, you need an Ivy Bridge processor (which the 3220 is) in your H77, Z77 or B75 board. A Sandy Bridge chip can only utilize up to PCI-e 2.0, so the slots would be limited to Gen 2 bandwidth.

Also, don't worry about compatibility between the PCI-e slot generation and the graphics card. All the PCI-e generations are backwards compatible.
 

lorgot

Honorable
Nov 20, 2012


But PCI-E 2.0 is x16 too, isn't it? Also, you said it will run just fine, but will the performance be as good as it would be on PCI-E 3.0 x16?

Also, you said that Sandy Bridge can only utilize up to PCI-e 2.0, so the slots will be limited to Gen 2 bandwidth. But if Intel's site says the 3220 only goes up to PCI-e 2.0 (which would be the same as Sandy Bridge), how come it will be able to run PCI-e 3.0?

My point is: if I get a GTX 660 (PCI-E 3.0), according to you it will run just fine on a PCI-E 3.0 mobo, but with an i3 3220 that supposedly only goes up to 2.0. But what about performance? I saw that the GTX 560 Ti performs well too and it's PCI-E 2.0. Would it be better to get the latter instead of the GTX 660, because of the PCI-E thing? (I don't even know exactly what that is. I thought it was just about the slots being compatible or not.)

* Found the link - http://ark.intel.com/products/65693/Intel-Core-i3-3220-Processor-(3M-Cache-3_30-GHz) - see the revision thing? Says 2.0 :(
 
It's an Ivy Bridge chip, so it should enable Gen 3 bandwidth on compatible mobos.

OK, I guess a rundown and explanation of the terms will help here.
x16, x8, x4 and x1 all refer to how many lanes connect the PCI-e slot to the CPU. The more lanes you have, the higher the bandwidth.
Gen 2 and Gen 3 are different revisions: a Gen 3 lane has double the bandwidth of a Gen 2 lane. So where a Gen 2 lane carries roughly 500MB/s, a Gen 3 lane carries roughly 1GB/s (remember, this is bandwidth, not speed).
So a Gen 3 x16 slot has double the bandwidth of a Gen 2 x16 slot, despite having the same number of physical lanes.
Also, all the PCI-e generations are compatible with each other: a Gen 3 card will work in a Gen 2 slot and vice versa.
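
To put very rough numbers on that (treating a Gen 2 lane as about 500MB/s and a Gen 3 lane as about double, which is the right ballpark), a quick sketch:

```python
# Back-of-the-envelope figures: roughly 500MB/s per Gen 2 lane and about double
# that per Gen 3 lane (ballpark numbers, not exact spec values).

PER_LANE_MB_S = {2: 500, 3: 1000}

def slot_bandwidth(gen, lanes):
    """Approximate one-way bandwidth of a PCI-e slot, in MB/s."""
    return PER_LANE_MB_S[gen] * lanes

print(slot_bandwidth(2, 16))   # Gen 2 x16 -> ~8000 MB/s
print(slot_bandwidth(3, 16))   # Gen 3 x16 -> ~16000 MB/s (double, same lane count)
print(slot_bandwidth(2, 8))    # Gen 2 x8  -> ~4000 MB/s, still plenty for a single GPU
```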

The performance difference between Gen 2 and Gen 3 is almost imperceptibly small, and only shows up in highly synthetic benchmarks. You don't need to worry about it.

As I mentioned above, the bandwidth provided by a Gen 2 x8 slot is plenty for modern graphics cards. PCI-e Gen 3 was largely introduced as future proofing, to pave the way for PCI-e storage devices that have started to trickle into the market (the OCZ RevoDrive, for example, a TB of SSD that connects via PCI-e).

Also, you're better off getting the 660 over the 560 Ti anyway, much better performance.
http://www.anandtech.com/bench/Product/660?vs=547
 

lorgot

Honorable
Nov 20, 2012


Thanks for the explanation! But I think you got something wrong: PCI Express 2.0 isn't x8, it's x16 just like 3.0. So, basically, it's almost the same, right?

Well, if the performance difference between 2.0 and 3.0 (both being x16) is almost imperceptibly small, I guess I'll go with the GTX 660. I hope you're not trolling me; buying an incompatible card would be quite terrible.