I am searching for a way to judge the difference in performance between servers. I have a fairly old IBM server, and I would like to test it against my newer Dell and HP servers. The catch is that I cannot use a program that bogs down the CPU or makes the server virtually useless. I wish I could just use a benchmark tool; that would be easier than what I am asking. This may be impossible to do with software, but I am not sure.
We run a decent-size database on the old IBM, and it is a critical machine. If we decide to move the utilities and database off of it, they would have to go to a better server.
If there are hardware gurus here, I may have to pull up the server hardware stats and see if they can judge.
Can't you just install a copy of the database on one of the newer servers, run a few heavy lookups, and see how long they take on each? If the difference isn't fairly obvious, then what does it matter?
You can only properly compare performance by running the same program - the one you are interested in - on the computers being compared. Different programs will give different results. Databases tend to be fairly disk-intensive, so you might well get better performance from an older server with SCSI drives than from a modern one with IDE drives. But in the end, if you can't easily see a performance difference, it doesn't matter.
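The advice above - time the same workload on each box and compare - can be sketched as a small harness. This is a minimal illustration, not a real benchmark suite: `sample_lookup` is a stand-in you would replace with your actual database query (e.g. a `cursor.execute(...)` call against a copy of the database), and the repeat count is an arbitrary choice.

```python
import statistics
import time

def benchmark(run_query, repeats=5):
    """Run the same workload several times and return the median
    wall-clock time, which is less sensitive to one-off cache or
    network hiccups than a single run."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        run_query()
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

def sample_lookup():
    # Placeholder workload; substitute your real heavy lookup here.
    sum(i * i for i in range(100_000))

print(f"median runtime: {benchmark(sample_lookup):.4f}s")
```

Run the identical script on each server and compare the medians; if the numbers aren't clearly different, the hardware gap probably doesn't matter for your workload.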
We can do that, and it would work. I may have to try it; I honestly hadn't thought of that. I am just trying to see if the $800 servers would serve us as well as the $5-10k models.
Thank you for that suggestion.