I'm looking for advice on how to clearly explain to new users how a program uses multiple cores. I think I'm misusing terms, which confuses the people I'm explaining things to, and I usually can't give enough detail, which adds to the confusion.
I'll start with an example:
Say I have an 8-core CPU. If I run a program that I'd describe as only "using" 2 cores, older versions of Windows would show about 25% usage on each core. So technically it's "using" all the cores, not just two... but it's really only using 25% of the total CPU, which seems to be where a lot of the confusion arises. I need to word that better.
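To make that concrete, here's a minimal C++ sketch of the kind of program I mean (the thread count and the endless busy loop are just for the demo). Run it on an 8-core machine and Task Manager settles around 25% total, even though the scheduler may bounce the two threads around so every core shows some activity:

```cpp
#include <thread>
#include <vector>

int main() {
    const int busyThreads = 2; // the "2 cores" the program actually uses
    std::vector<std::thread> workers;
    for (int i = 0; i < busyThreads; ++i) {
        workers.emplace_back([] {
            volatile unsigned long long counter = 0;
            while (true) ++counter; // pure CPU work, never sleeps
        });
    }
    for (auto& t : workers) t.join(); // never returns; Ctrl+C to stop
    return 0;
}
```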
Windows 10 tends to keep the main thread on one core instead of randomly spreading its instructions across all of them, which gives a nice performance boost and makes it easier to tell how much of the CPU the program is really using.
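If it helps to show what "keeping a thread on one core" even means, a program can force it explicitly with an affinity mask; this is just an illustration of the concept, since the scheduler normally manages placement on its own:

```cpp
#include <windows.h>

int main() {
    // Restrict the current thread to core 0 only (mask bit 0 set).
    SetThreadAffinityMask(GetCurrentThread(), 0x1);

    volatile unsigned long long counter = 0;
    while (true) ++counter; // Task Manager now shows one core near 100%
    return 0;
}
```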
So there's a balance to strike. Programs dominated by a big main thread perform better on fewer, faster cores, while programs that shift more of their work onto other threads can make better use of more cores.
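That trade-off is what Amdahl's law formalizes: if a fraction s of the work has to stay on the main thread, the best possible speedup on n cores is 1 / (s + (1 - s) / n). A tiny sketch, assuming half the work is serial:

```cpp
#include <cstdio>

int main() {
    const double serialFraction = 0.5; // assume half the work is single-threaded
    const int coreCounts[] = {1, 2, 4, 8, 12};
    for (int cores : coreCounts) {
        double speedup = 1.0 / (serialFraction + (1.0 - serialFraction) / cores);
        std::printf("%2d cores -> %.2fx speedup\n", cores, speedup);
    }
    // With s = 0.5, even infinite cores cap out at 2x: past a point,
    // fewer faster cores beat more slower ones for this kind of program.
    return 0;
}
```

With those numbers, 2 cores give 1.33x, 8 cores give 1.78x, and 12 cores only 1.85x, which is the whole "more cores isn't always better" point in one table.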
So how do I clarify this explanation? It still seems way too long for someone to take the time to read, and I still feel like I'm confusing terms.
Extra Thoughts:
I'm not looking to debate 4 cores vs. 8 cores or AMD vs. Intel; I know they both have their strengths. I do realize there are a lot of other things that affect processing speed, like how the architecture handles information or the size of the CPU's cache. I'm looking for a simplified explanation, in basic terms, that helps people decide how many CPU cores are best for them and dispels the marketing myth that more cores are always better. It's sad seeing people buy 8-core CPUs just to browse the internet or do office work, or buy 12-core server CPUs only to find they have very poor gaming performance. I'm aware programmers are learning better ways to use more cores, and that multi-core CPUs will grow more useful with time, though I think that progress is plenty slow.