Why does Windows slow down when I try to start programs simultaneously?

karlsplatzo

Prominent
Nov 25, 2017
4
0
510
I write my own programs for academic projects, mostly involving numerical analysis.
I have one that I'd like to run over several CPUs simultaneously to get through a few runs.
I've set up a script that runs the program (say 30 times). I duplicate the script (say 8 times, one per CPU) and then run it. Each script runs in a different directory.
If I set off the scripts with a little delay between each one then as long as the program is still running by the time I set off the last one, all works well.
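(For illustration only, not from the thread: the launch scheme described above, several instances started with a small delay between each, could be sketched as a Python launcher. The function names and the delay parameter are my own, standing in for the batch scripts.)

```python
import subprocess
import sys
import time

def launch_staggered(command, count, delay_seconds):
    """Start `count` copies of `command`, pausing between launches so that
    the process startups do not all hit the OS at the same instant."""
    procs = []
    for _ in range(count):
        procs.append(subprocess.Popen(command))
        time.sleep(delay_seconds)  # stagger the next startup
    return procs

def wait_all(procs):
    """Block until every launched process has exited; return exit codes."""
    return [p.wait() for p in procs]

if __name__ == "__main__":
    # e.g. 8 instances, half a second apart
    workers = launch_staggered([sys.executable, "-c", "print('hello')"], 8, 0.5)
    wait_all(workers)
```

The delay only helps, as noted, while each run lasts longer than the total stagger window; short-running jobs drift back into starting simultaneously.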
The problem I have is that when multiple instances of the program start simultaneously, Windows seems to slow to a crawl on starting programs. Once the program gets going, though, it runs at normal speed. The computer remains slow to start other things, to the point of requiring a reboot. The shorter the time my code runs for, the worse this is, because more instances of the code start at the same time.
To rule out my code causing a memory leak, I reduced it to a "Hello World" type code and this slowdown still happens.
I ruled out the compiler causing the problem by trying both gfortran (64-bit) and Lazarus (64-bit). The effect is the same.
I wondered if using the same name for each running instance was the problem, but giving the executable a different name in each script doesn't help.
The problem arises on two separate computers, one running Windows 7 Home, the other Windows 10 Educational.
Task Manager shows that CPU and memory usage are low while the computer is slow.
Can anyone help?

(I've posted a similar message in the Windows 10 forum, so I hope this is OK - I've googled aplenty but keep finding general "why does Windows slow down" threads. This is specific.)
 
Solution
The only other thing I can think of is that something in the libraries is hammering a bottleneck somewhere in Windows' APIs that prevents multiple instances from starting up normally.

InvalidError

Titan
Moderator
That does sound like a storage I/O bottleneck. If your program is loading a significant data set or a bunch of libraries from the HDD and the system doesn't have enough cache to keep all of that in RAM, then starting multiple instances causes them to thrash each other's cached data and performance goes down the toilet.
 

karlsplatzo

Thanks for the replies.

Both the Windows 7 and Windows 10 machines have Windows on SSDs, although the executables and code run from an HDD.

However, when testing, I used minimalist "Hello World" executables, and these require no data to be loaded. I guess the loading of libraries could be an issue. Is there any way to check that?

I've thought about and tried various delays in starting the code, but I can still end up with enough instances starting in sync anyway.
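(Again as an illustration only, not something posted in the thread: one way around instances drifting back into sync is to serialize the startup step itself rather than rely on fixed delays. A minimal Python sketch, with hypothetical names, where a pool of workers pulls jobs from a queue and a shared lock ensures only one process is ever being launched at a time:)

```python
import queue
import subprocess
import threading

def run_jobs(commands, workers=8):
    """Run every command with at most `workers` concurrent processes.
    A shared lock serializes process creation, so no two instances are
    ever starting at exactly the same moment."""
    jobs = queue.Queue()
    for cmd in commands:
        jobs.put(cmd)
    start_lock = threading.Lock()
    results = []
    results_lock = threading.Lock()

    def worker():
        while True:
            try:
                cmd = jobs.get_nowait()
            except queue.Empty:
                return
            with start_lock:           # only one launch at a time
                proc = subprocess.Popen(cmd)
            code = proc.wait()         # run to completion outside the lock
            with results_lock:
                results.append(code)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Unlike a fixed stagger, this stays safe regardless of how short each run is, because the next launch waits for the previous launch (though not the previous run) to finish.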
 

karlsplatzo

I've had another look at this this morning. There doesn't seem to be any appreciable disk activity when I run the multiple scripts simultaneously, but they always grind to a halt.

The Fortran code was compiled with the "-static" option, so the libraries are in the executable. I've used the default Lazarus compilation options (without the debugging option set), so I think the libraries are in the executable there too.