Been reading countless threads on the 8350 vs i7s and some i5s.
Most agree that for builds with mostly gaming in mind, the i5 and definitely the i7 spank the 8350. I agree with this.
But for rigs that are not mostly for gaming (like the one I will run), the 8350 is the better option as far as $/performance goes.
I will be primarily multitasking with my rig.
I believe multitasking and multithreaded processing go hand in hand.
Chrome with 10+ tabs opening and closing fairly frequently
Excel, Outlook, and Word all open and being edited
Watching a movie, or having it paused, on a 2nd screen
Playing big-name games about 20% of the time, with Chrome open
Comparing at matched clock speeds, I did a little math on CPU-only electricity use, treating TDP as a rough stand-in for actual draw. I think it's correct.
Assuming 3 hrs of use per day, 365 days a year:
8350 costs ~$175 and draws 125 W
4790K costs ~$340 and draws 88 W
The 8350 costs ~$16.43/year to run at $0.12 per kWh
The 4790K costs ~$11.56/year to run at $0.12 per kWh
So at the ~$4.86/year difference, it would take about 34 years for the Intel to pay for itself in electrical savings alone. We should have consumer quantum computing by then
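For anyone who wants to sanity-check the numbers, here's a minimal sketch of the calculation in Python. The prices, wattages, and TDP-as-draw simplification are the assumptions from above; real consumption varies with load.

```python
# Minimal sketch of the electricity math, assuming TDP stands in
# for actual draw (real consumption varies with load).

HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365
RATE_PER_KWH = 0.12  # dollars per kWh

def annual_cost(watts: float) -> float:
    """Yearly electricity cost for a constant draw of `watts`."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * RATE_PER_KWH

fx_8350 = {"price": 175, "tdp_watts": 125}
i7_4790k = {"price": 340, "tdp_watts": 88}

savings_per_year = annual_cost(fx_8350["tdp_watts"]) - annual_cost(i7_4790k["tdp_watts"])
payback_years = (i7_4790k["price"] - fx_8350["price"]) / savings_per_year

print(f"FX-8350:  ${annual_cost(fx_8350['tdp_watts']):.2f}/yr")   # ~$16.43
print(f"i7-4790K: ${annual_cost(i7_4790k['tdp_watts']):.2f}/yr")  # ~$11.56
print(f"Payback: {payback_years:.1f} years")                      # ~33.9
```

Even bumping usage to 8 hrs/day only cuts the payback to roughly 13 years, so heavier use doesn't really change the conclusion.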
The Intel would handle my gaming better (20% of my intended use).
The AMD would handle my multitasking better (80% of my intended use).
Paying the premium for an OC-capable Intel with HT at the same clock speed as the AMD does not seem worth it.
Did I miss something?
Is there another HT Intel with a clock speed close to the 8350's that I should look at?