One of my machines (the one I'm currently using to type this) is pretty old. I had to replace its monitor recently, and naturally, 90% of the monitors on the market were widescreen 16:9.
Well, now I have the new monitor, an Acer 21.5". It has a maximum widescreen resolution of 1920 x 1080.
But the fact that this machine is so old has led to a problem.
This machine only supports 4:3-style ratios, with 1280 x 1024 (technically 5:4) being the largest resolution I can get out of it. I have updated drivers and looked into BIOS updates that might add widescreen support, but that all sounds rather risky.
I'm not fond of stretched, distorted images, and I would like a normal aspect ratio on my monitor. It doesn't even have to be the full 1920 x 1080; I'd settle for 1440 x 900.
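As a side note, you can check what aspect ratio a resolution actually is with a quick calculation. Here's a throwaway Python sketch (for illustration only; the resolutions are the ones mentioned above):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest aspect ratio, e.g. 1920 x 1080 -> 16:9."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

# Resolutions mentioned in this question
for w, h in [(1920, 1080), (1280, 1024), (1440, 900)]:
    print(f"{w} x {h} -> {aspect_ratio(w, h)}")
# 1920 x 1080 -> 16:9
# 1280 x 1024 -> 5:4
# 1440 x 900  -> 8:5  (i.e. 16:10)
```

Note that 1280 x 1024 reduces to 5:4 rather than true 4:3, and 1440 x 900 is 16:10 rather than 16:9, so it isn't quite the same shape as the panel either.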
So my question is this:
I've inserted PCI cards into some of my other machines before, but I've never dealt with a PCI video card. To avoid risky things like changing the BIOS, if I bought a PCI video card that supports widescreen ratios and ran my monitor through it instead of straight to my motherboard's connector, would that work? Or would the BIOS maximum of 1280 x 1024 at 4:3 still override it?
I don't want to buy a new video card if it's not going to even work.
I know it might seem like a silly question to some because it's so simple, but an answer would be much appreciated!
I don't game on this computer; it's mostly for media (watching videos) and Digital Audio Workstation use.
First, you don't want a PCI video card; you want a PCI Express x16 card or an AGP card, depending on which kind of slot your motherboard has. If you ONLY have regular PCI slots, forget it; they don't have the data transfer bandwidth to support a modern video card.
Second, if you put in an add-on video card, you would have to go into the BIOS anyway to turn off the motherboard's onboard video. That is standard procedure for installing a standalone video card when you already have onboard graphics, and there isn't really a way around it. So no, the maximum resolution of the onboard video would not come into play, because it would be shut off entirely; but yes, you have to go into the BIOS to shut it off. That is not the same thing as a BIOS update; it's one of the simplest things you can do in the BIOS and not really "risky."
It would help if we knew the specs of your computer (model number if it's a pre-built machine; make and model of the motherboard, CPU and power supply if it wasn't pre-built).
If you did buy a video card, the onboard video settings in the BIOS wouldn't matter anymore, since the new card would be handling all video output.
You might first want to check the list of resolutions supported by the video card you want to buy. Sometimes the supported resolutions depend on the driver, sometimes on the hardware.
You could insert a plain PCI video card into that machine (good luck finding one), but the most you'd likely manage with it is playing standard-definition video. High-definition video requires a lot of bandwidth, and it could saturate the PCI bus.
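To put rough numbers on that: conventional PCI (32-bit, 33 MHz) peaks at about 133 MB/s, and real sustained throughput is lower. A back-of-the-envelope sketch in Python, assuming the worst case where the CPU decodes the video and pushes fully uncompressed 24-bit frames across the bus:

```python
# Rough worst-case bandwidth estimate for pushing uncompressed video
# frames over conventional PCI (32-bit, 33 MHz ~= 133 MB/s theoretical peak).
PCI_PEAK_MB_S = 133  # theoretical peak; sustained real-world throughput is lower

def video_bandwidth_mb_s(width, height, bits_per_pixel=24, fps=30):
    """MB/s needed to move uncompressed frames of the given size and rate."""
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * fps / 1_000_000

for name, (w, h) in {"SD (720 x 480)": (720, 480),
                     "HD (1920 x 1080)": (1920, 1080)}.items():
    need = video_bandwidth_mb_s(w, h)
    print(f"{name}: ~{need:.0f} MB/s needed (PCI peak ~{PCI_PEAK_MB_S} MB/s)")
# SD (720 x 480): ~31 MB/s needed (PCI peak ~133 MB/s)
# HD (1920 x 1080): ~187 MB/s needed (PCI peak ~133 MB/s)
```

So SD fits comfortably, while uncompressed 1080p alone would exceed the bus's theoretical maximum, which is why HD playback on a plain PCI card tends to fall apart.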