Server PCs are used to run internet servers, like websites or online gaming servers (like Minecraft). They usually have Xeon processors, lots of RAM, and lots of storage.
My web server hosts multiple websites and runs on a Pentium with 4GB of RAM and 500GB of storage, and very little of that is actually used.
They don't need to have Xeons and tons of RAM. I ran a Minecraft server for a short period of time on a Pentium D 930, 4GB of RAM, and a 5400 RPM 500GB drive.
Aside from Minecraft servers, many web servers have very weak CPUs because they simply don't need much CPU power. All the machine really does is serve files and run some PHP code. Minecraft, though, is different.
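To put that in perspective, here is a minimal sketch of a static file server in Python (the `public` directory and port 8080 are just placeholders picked for this example); serving files like this takes almost no CPU:

```python
# Minimal sketch: serve static files from ./public on port 8080.
# Handing files to clients like this needs very little CPU power.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = partial(SimpleHTTPRequestHandler, directory="public")
HTTPServer(("0.0.0.0", 8080), handler).serve_forever()
```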
Rather than answering this question in the context of hardware, it would be better to look at it in the context of what a server is meant to provide. Servers exist to provide remote services to clients, serving their requests. The client can be anything from a mobile device or desktop PC to a robot, or even just a sensor generating data.
Servers are not meant to be actively used with a keyboard, monitor, and mouse; they are meant to serve requests remotely over some kind of communication channel.
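Stripped down to that idea, a server is just a loop that waits for remote requests and answers them. A rough sketch (the port and the reply are made up for illustration):

```python
# Rough sketch of the request/serve loop: listen on a socket,
# accept remote clients, read their request, send back a response.
# No keyboard, monitor, or mouse involved.
import socket

with socket.create_server(("0.0.0.0", 9000)) as srv:
    while True:
        conn, addr = srv.accept()               # a remote client connects
        with conn:
            request = conn.recv(1024)           # read the client's request
            conn.sendall(b"served: " + request) # reply over the channel
```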
Now, in the industry, server-class hardware typically means hardware focused on minimizing the turn-around time for requests from clients. Server applications vary, however, and the hardware needed for each type of server varies with them.
For data centers that need triple-nines reliability, you need fault-tolerant hardware, ZFS-mirrored data stores with remote backups, and so on. For a web server at your house, you can probably get by with an old processor and some hard drives.
There are also render farm servers, which basically handle remote render jobs for 3D animators.
So yeah, a server isn't necessarily a server because of its hardware. It's just that server-class applications typically need special hardware to meet the application/MTBF/MTTR requirements.