An 18-core processor sounds excessive

Fixadent

Commendable
Sep 22, 2016
307
0
1,780
Honestly, why would anyone need an 18-core processor?

Certainly this is excessive for gaming. Next-gen games will be optimized for 6-8 cores at most, so a 6-8 core chip is probably the best way to go.
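To illustrate the point, here is a rough Python sketch (purely hypothetical, not taken from any actual game engine) of why extra cores sit idle once software is tuned for a fixed thread count:

    import os
    from concurrent.futures import ThreadPoolExecutor

    TUNED_FOR = 8  # hypothetical: engine designed around 6-8 worker threads

    # Size the worker pool to whichever is smaller: the cores present or the
    # count the engine was tuned for, so an 18-core chip still gets only 8 workers.
    workers = min(os.cpu_count() or 1, TUNED_FOR)

    def frame_task(chunk):
        # Stand-in for per-frame work (physics, AI, audio mixing, etc.).
        return sum(x * x for x in chunk)

    chunks = [range(i * 10_000, (i + 1) * 10_000) for i in range(64)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(frame_task, chunks))
    print(f"ran {len(chunks)} tasks on {workers} worker threads")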

Furthermore, I've read stories that Intel's Core i9 18-core CPU has severe stability and overheating problems, and is fiendishly tricky to overclock. And what is the default clock speed on that 18-core chip? I'd imagine it has to be below 3.0 GHz.
 
Solution
You would NEVER buy an 18-core CPU for gaming unless you are clueless.

(You also have to be VERY CAREFUL with the CPU choice, as the i7-7800X 6C/12T is a lot slower for gaming than the i7-7700K 4C/8T, sometimes by more than 30%.)

And yes, Intel's new platform has temperature issues. With the 10-core already consuming so much power and hitting such high temperatures even with an excellent cooler, it's hard to imagine an 18-core on that platform performing well.

No, the people who buy that many cores have tasks that justify the cost of such a computer, usually RENDER work. Time is money.
 

Fixadent

Commendable
Sep 22, 2016
307
0
1,780


Nvidia CUDA computation dwarfs any CPU in render time; in fact, it's a night-and-day difference between the two.

I use the 3D modeling program Blender, and it took my Nvidia GPU 6 minutes to render an image, while my CPU took 45 minutes.
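If anyone wants to reproduce that comparison, here is a minimal Blender Python sketch (assuming a 2.8+ build with the Cycles CUDA backend; older builds expose the same settings under bpy.context.user_preferences) that flips Cycles between CPU and GPU rendering so you can time both:

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'

    # Enable the CUDA compute backend and every detected CUDA device.
    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.compute_device_type = 'CUDA'
    prefs.get_devices()
    for device in prefs.devices:
        device.use = True

    # 'GPU' for the CUDA run, 'CPU' for the comparison run.
    scene.cycles.device = 'GPU'

    bpy.ops.render.render(write_still=True)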
 

USAFRet

Titan
Moderator


Honestly, why would anyone need 1TB of hard drive space?
Honestly, why would anyone need a 32-bit processor?
Honestly, why would anyone need more than a 2-core CPU?

Time, features, power...marches on.
Software eventually catches up to the hardware.
 

atomicWAR

Glorious
Ambassador
In their current state? Most folks wouldn't use or need one. These are CPUs designed for the server space, cut down so they cannibalize a little less of that segment, and aimed at the high-end workstation/prosumer/uber-enthusiast crowd. Of those users, the high-end workstation crowd may actually use a chip like this to its full extent. Not to say some prosumers/uber enthusiasts won't, but they will be in the minority by comparison.

Gamers won't even begin to utilize that much chip for another 10 years or more. I think 8 cores will be common for gaming in the near future, as will 6 cores; arguably they are even now. That will be the new mainstream market, depending on chip vendor (as far as core counts are concerned). Users like myself who have been in the HEDT space for a long time will be the ones moving to the 10-16 core space (depending on vendor).

Me, I'm thinking either a 10-core Intel chip or a 12-14 core AMD chip as far as upgrades go, coming from a 6C/12T i7-3930K. For me it's about PCIe lanes from the CPU (not counting the PCH) and total core count. I need a minimum of 40 lanes for my usage. Intel only offers 44 lanes at 10+ cores, and AMD offers 64 lanes at 10+ cores; below that I have to settle for 28 or 32 lanes, or fewer, depending on vendor. On top of that, as a high-end gamer/prosumer (mostly video encode/decode, with a sprinkle of 3D rendering for game production/modding) I usually go with at least 2 more cores than the average high-end mainstream gamer would. In my experience there are a fair number of users like myself on Tom's, though we are still a minority.

Point being, Intel's 18-core chip is simply a desperate attempt to show they have more of something than AMD. Ryzen T-boned Intel, and Threadripper kicked them in the naughty bits to boot. Now they are left scrambling to make chips they had already designed for the server space work on the consumer front, against a new product that for the most part has them beat hands down, save gaming (which is starting to change with patches/microcode/BIOS updates). Just like when AMD launched AMD64 chips against Intel's Pentium 4, though that was a little more severe a beating, IMHO.

Given how long chip design and development takes, it will be another 4 years before another drastic change in performance hits, something like Intel's answer to AMD64, the Core Duo. This is a great time to be a consumer in the PC space.
 

atomicWAR

Glorious
Ambassador


Anyone looking far enough ahead or using a large enough data set will benefit for sure. I was making the argument more for the next 4-6 years and the average HEDT user.
 


"640K ought to be enough for anyone."

(Although it looks like the actual quote may have been, "When we set the upper limit of PC-DOS at 640K, we thought nobody would ever need that much memory.")
 


That's one application. How well will your CUDA GPU handle a million database requests per minute? Poorly, that's how.