Are all old computers good for their time?

Disaphi

Commendable
Feb 15, 2016
Back then you could run any game with pretty much any PC, but now you need a gaming PC to run Crysis, whereas back then you could run Crysis with any PC. What gives?
 

USAFRet

Titan
Moderator


No. Even going back 20+ years to Quake I...not all PCs could run that effectively.
 
A little "Frogger" anyone?

 

spdragoo

Splendid
Ambassador


Nope, there have always been minimum/recommended requirements for games.

Even back in the day with my original version of X-Wing (multiple 3.5" floppies), you had to 1) have 4MB of RAM installed on the PC (a lot of them back then only came with 1MB, since DOS was originally set up to directly access only the first 640KB), & 2) run a special boot disk floppy so the game could access that RAM.

Things are at least way, way easier now. Back then, you had to worry about hardware conflicts (i.e. those irritating IRQ errors), because you literally could have some other piece of hardware blocking access to the sound card, or you had to use different settings for every single game. Heck, I remember the improvement in how my family's first PC ran once we installed a video card so that it would have "true-color" graphics (pretty sure it was one of those 1MB VRAM cards that provided either a SuperVGA or XGA output).
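For anyone who never had the pleasure, one of those game boot disks usually boiled down to a CONFIG.SYS and AUTOEXEC.BAT along these lines (a rough reconstruction, not any particular game's disk; the exact drivers, paths, and port/IRQ/DMA values depended on the game and your hardware):

CONFIG.SYS:
    REM Load the XMS driver so programs can use the memory above 1MB
    DEVICE=C:\DOS\HIMEM.SYS
    REM Provide expanded/upper memory (some games wanted NOEMS instead, or no EMM386 at all)
    DEVICE=C:\DOS\EMM386.EXE RAM
    REM Load DOS high to free up as much of the 640KB conventional memory as possible
    DOS=HIGH,UMB
    FILES=30
    BUFFERS=20

AUTOEXEC.BAT:
    REM Tell games where the Sound Blaster is (port 220, IRQ 5, DMA 1)
    SET BLASTER=A220 I5 D1 T3
    REM Load the mouse driver
    C:\MOUSE\MOUSE.COM

Get one of those lines wrong and the game either complained about not enough free conventional memory or simply played no sound.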

And I swear, I've seen the occasional benchmarking article over the past couple of years where they go back & test current systems to see how well they run Crysis 3 -- like this article here (https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/page6.html). Note in particular that, until the GTX 1080 came along, there wasn't a GPU that really pulled ahead of the top-line cards the game was originally tested with (https://www.techspot.com/review/642-crysis-3-performance/page5.html).

In short...there's always been variation in how systems handle a game, & it all boils down to whether the hardware can handle it or not.
 

USAFRet

Titan
Moderator


Yup.
A lot of games had their own boot floppy with specific settings. You literally had to reboot to play it.

And even then, you often had to edit that to work with your particular hardware.
 
The situation was actually worse in the past than it is today. Computers were increasing in speed at a much faster rate. In the 1990s computers doubled in speed about every 1.5-2 years. Game developers typically had newer and higher-end computers, so new games always ended up requiring very powerful computers. A good trick was to not buy a new game immediately upon release. Wait about 6 months to a year, then buy a new computer (one that was faster than what the game devs used), and use that to play the game. Often this was the only way to play a game with all the settings on high.

Around the mid-2000s, the CPU frequency march came to a screeching halt with the ill-fated NetBurst architecture of the Pentium 4. Back then most buyers equated clock speed with performance (as a shortcut to running benchmarks), so Intel's marketing department had demanded that the engineers ramp up clock speed as quickly as possible. They ran head-first into the brick wall of physics (power draw and heat rise much faster than linearly with clock speed), and the Pentium 4 became notorious for running extremely hot and consuming huge amounts of power (well over 100 watts on the higher-end models). Intel had to abandon it. That's what allowed AMD to briefly take the CPU lead.
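(The underlying relation, roughly: dynamic power is P ≈ C × V² × f, where C is the chip's switched capacitance, V the core voltage, and f the clock frequency. Hitting higher clocks generally means raising the voltage too, so power and heat climb far faster than the clock does. That's the wall NetBurst ran into.)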

Intel took a couple of years to design a new desktop CPU based on their Pentium M notebook chips. Those had to prioritize power efficiency over clock speed to preserve battery life, so they had escaped the marketing department's mandate of clock speed uber alles. They became the basis for the Core CPUs we have today.

But clock speed has pretty much become a non-factor in increasing CPU speed. It hasn't increased much in the last decade. The fastest CPU in 2008 was 3.2 GHz. The fastest in 2017 is 4.3 GHz. A 34% increase. By comparison, the fastest CPU in 1988 was a 40 MHz 80386. The fastest CPU in 1997 was a 233 MHz Pentium. A 483% increase.
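(Checking the math: 4.3 / 3.2 ≈ 1.34, so about a 34% gain over those nine years, while 233 / 40 ≈ 5.8, or roughly a 483% gain over the same span a decade earlier.)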

The one thing that has gotten worse for the modern gamer is that high-end GPUs have gone up substantially in price. IIRC GPUs used to top out at just over $200, and $300 was unheard of. Nowadays the high end starts at $500 and can go upwards of $1000.
 
Solution