FX-8350 vs i5-3570

ryosan222

Honorable
Apr 1, 2013
Hi

I know that for games and general tasks the i5 is better. But in the following case, which one would you buy?

-I have 2 monitors.
-On one monitor I play games in windowed mode at 720p (you know, 720p gives better performance than 1080p, and in windowed mode you don't lose quality... and I need to switch between programs constantly).
-I don't mind lowering game quality (a 7850 graphics card would be perfect, so...).
-Lots of Firefox/Chrome windows open.
-A few text editor windows (OpenOffice Writer).
-Photoshop for editing small images (very small, 15/20 KB).
-Dreamweaver.

Right now I have a Q9450 + 4870 512 MB and 4 GB of RAM (DDR2)... maybe I would just need an SSD and/or more RAM and I'd be fine. But if I buy a new computer... which is better (CPU only, not the other components): the weaker but more numerous cores of the FX-8350, or the fewer but more powerful cores of the i5-3570 (non-K)?
 

atomicWAR

Glorious
Ambassador
Interesting workload, and I can see why you asked... in this case I think I would go with AMD. The extra threads will help with all the multitasking, even though per-core performance is lower. As for lowering game detail... that's actually harder on your CPU than raising it (usually). The reason is that your GPU renders at a faster rate, flooding the CPU with frames. That's why, when CPU reviews cover gaming, they drop the resolution through the floor: it makes the CPU work harder and thus better shows you, the reader, how capable the CPU is.
 

Feldmarschall

Honorable
Mar 9, 2013
Hi,

A similar thread has already been answered.

The i5 will be better in almost every respect.

These benchmarks cover both stock and overclocked speeds.

Multitasking:

http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/5 - i5 wins

Gaming performance:

http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/6 - i5 wins

Editing and encoding:

http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/4 - i5 wins

Power consumption:

http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/7 - i5 wins under load, the AMD FX-8350 wins at idle.

Best regards :)
 

ryosan222

Honorable
Apr 1, 2013


Hey, but that is difficult to understand... I'll try to explain (sorry, I'm Spanish and my English... well...).

If at 1080p we get... 50 fps (for example), then at 720p we should get a lot more; less resolution, more fps, right? Then:

-We gain fps by lowering the resolution... but on the other hand we lose fps by increasing the CPU's work?

But the really good question is... how much performance do we gain by lowering the resolution from 1080p to 720p, and how much do we lose by throwing more work at the CPU?

 

atomicWAR

Glorious
Ambassador
Close... lowering the resolution to 720p causes the GPU to produce more frames, so the CPU now has more work to do. This won't make the CPU lose frames directly, but it limits the work the CPU can do on the other programs running in the background, because it is flooded with frames by the GPU. So if you want more CPU resources free, it's better to raise the resolution to 1080p: the GPU has to work harder and doesn't flood the CPU with frames.


I hope that was clearer. I am trying; language barriers are tough.
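Maybe numbers make it clearer. Here is a minimal sketch of the idea in Python; all the frame-rate numbers are hypothetical, just to show how a GPU cap and a CPU cap interact:

```python
# Rough model: the final frame rate is capped by the slower stage,
# and the game's CPU load grows with the frames the GPU pushes out.
# All numbers below are made up for illustration.

def effective_fps(cpu_fps_cap, gpu_fps_cap):
    # The slower of the two stages limits the frame rate.
    return min(cpu_fps_cap, gpu_fps_cap)

cpu_cap = 90                            # frames/sec the CPU could prepare
gpu_caps = {"1080p": 60, "720p": 135}   # 720p renders ~2.25x as fast

for res, gpu_cap in gpu_caps.items():
    fps = effective_fps(cpu_cap, gpu_cap)
    cpu_busy = fps / cpu_cap            # fraction of the CPU's game budget in use
    print(f"{res}: {fps} fps, game uses ~{cpu_busy:.0%} of the CPU budget")

# Output:
# 1080p: 60 fps, game uses ~67% of the CPU budget
# 720p: 90 fps, game uses ~100% of the CPU budget  <- little left for other apps
```

At 1080p the GPU is the limit and the CPU has headroom for Firefox, Photoshop, etc.; at 720p the CPU itself becomes the limit and the background programs feel it.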
 

I'm pretty sure I saw on here that in the memory test with 1866 MHz RAM, gaming performance was exactly the same as, or in certain games better than, the i7's. From what I've seen, you will not notice any difference in gaming performance unless it's a CPU-intensive game like Skyrim. My 8320 @ 4.2 GHz and my 550 Tis play BF3 (cheap monitor @ 1440x900) at 80+ fps easily on ultra; even Skyrim plays at 75 fps with vsync.

http://www.tomshardware.com/reviews/fx-8350-core-i7-3770k-gaming-bottleneck,3407-5.html

Either way, both are good for what you want. Plus, Intel has an on-die GPU that can be used alongside your graphics card via Lucid Virtu MVP for an fps boost.
 

Feldmarschall

Honorable
Mar 9, 2013


That is only true in gaming :)
 

ryosan222

Honorable
Apr 1, 2013
The language, I understand it perfectly... understand it, yes, but not write it so perfectly :).

Well, right now with Photoshop, 3 Writer documents, 3 Firefox windows, Fraps, and BioShock 3 running:

720p: 40/50 fps
1080p: 20/25 fps

With those programs open I don't see any difference/lag/long waits while using them.
 

Traciatim

Distinguished


I've been trying to figure this out myself by benchmarking different scenarios. If you are playing a game and your CPU can feed your video card only 30 FPS, but your video card can render 60 at 1080p, then reducing the resolution to 720p will not increase performance. In a situation where the CPU can feed the video card 120 FPS but the video card still manages only 60 FPS at 1080p, you could theoretically increase to 120 FPS at 720p (since 1920x1080 is about 2.25x the pixels of 1280x720... however, that doesn't always translate directly because of other factors).
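For reference, the raw pixel arithmetic behind that scaling estimate (a quick back-of-the-envelope check, nothing more):

```python
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_720p = 1280 * 720     #   921,600 pixels per frame

print(pixels_1080p / pixels_720p)   # 2.25
# So dropping to 720p gives at most ~2.25x the fps, and only when the
# game is purely GPU-bound; otherwise a CPU cap kicks in first.
```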

I do know for sure that on my setup (1920x1200 main monitor, 1600x1200 secondary), running both monitors off the GTX 670 while my video stream, browser, and other apps sit on the second monitor is slightly faster than plugging the secondary monitor into my onboard video. This will probably be true with your setup as well. I haven't tested a second discrete video card for the secondary monitor.

The real trouble is determining which component is slowing you down. If your video card is slowing down your games, then decreasing the resolution will increase frame rates. If your CPU is slowing you down, then decreasing the resolution will have little to no effect on frame rates.
 

Traciatim

Distinguished


Check your min/max/average numbers too. If your min stays about the same but the max/average drop at the higher resolution, then the slow periods are being limited by the CPU. If all three drop by a similar percentage, then the performance reduction is caused by your video card.
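If it helps, here's a throwaway way to apply that rule of thumb; the fps triples below are hypothetical, so plug in your own Fraps min/avg/max readings:

```python
# Classify the bottleneck from (min, avg, max) fps at two resolutions,
# following the rule of thumb above. The numbers below are made up.

def classify(fps_720p, fps_1080p, tol=0.10):
    min_drop, avg_drop, max_drop = (
        1 - hi / lo for lo, hi in zip(fps_720p, fps_1080p)
    )
    if min_drop < tol and (avg_drop > tol or max_drop > tol):
        return "min holds while avg/max fall: slow periods are CPU-limited"
    if abs(min_drop - avg_drop) < tol and abs(avg_drop - max_drop) < tol:
        return "all three drop together: the video card is the limit"
    return "mixed result: test more scenes"

print(classify(fps_720p=(38, 72, 110), fps_1080p=(36, 52, 70)))
# min holds while avg/max fall: slow periods are CPU-limited
```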


 

ryosan222

Honorable
Apr 1, 2013


the "40/50 and 20/25" are average fps. But the thing I note when playing games (BioSHock Infinite, not 3.... and The Witcher 2) is that average fps are really good and I can play perfectly, but sometimes, without reason and without a lot of enemies or other things like that, I see a big/momentary drop in fps from 50 and 60 (or even more) to something like.... 10 or 15. Maybe cpu.... maybe HDD.... maybe more ram (only uses 80% of that 4GB)....
 

8350rocks

Distinguished


That's likely your CPU/GPU combo not being able to run enough floating-point calculations to keep up. Adding more people into your instance in a game makes the number of FP calculations go up exponentially, because your PC now has to take their actions into account with FP calculations to render everything properly. Hence the dip in frame rate.

While a better GPU would help immensely here, you cannot neglect the CPU either, as you have to be able to feed a high-end card well enough to take advantage of it.
 

ryosan222

Honorable
Apr 1, 2013


But those fps drops happen in zones without "intensive" things. I mean it's not, for example, a "Crysis jungle" with lots of enemies, lots of NPCs, explosions... not that; it happens in just a single street.
 

8350rocks

Distinguished


NPCs are not the issue; it's the other players that are the issue... you need more horsepower to run that many calculations, and that makes your PC lag.
 

ryosan222

Honorable
Apr 1, 2013
And then why doesn't the game run at ridiculous fps all the time? It's around 50 fps the whole time, but at some moments the fps drops to almost 0. If I tried to run Crysis 3, I'm sure the fps would be no more than 15... that would be OK, but not drops at just some moments.