Converting Digital to Analog

Basically, let's say I have a 1280x720p monitor that only supports VGA.

If I were to take an HDMI signal at 1920x1080p and convert it to VGA, would it actually be upscaled on the monitor? While I'm fairly certain the resolution on the monitor would remain 1280x720p, a lot of converters claim to be able to upscale a resolution. So if I am correct, my graphics card would probably handle the upscaling?

The reason this all matters is that the default Nvidia upscaling (handled by the graphics card) puts more strain on the card than driving a monitor that doesn't need the upscaled resolution. The card in the system in question is a 1050 Ti, which, while not the best of cards, is more than capable of running 1080p, but wouldn't run at its best if it also had to upscale from 720p to 1080p.
 

Rogue Leader (Moderator)

If you set your resolution to 1080p and used such an adapter, the monitor would display 720p, but your GPU would "see" the load of 1080p.

Or it just wouldn't display at all; it depends on the monitor, the adapter, and what they can handle.
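
To put a rough number on that "load of 1080p": the GPU renders every pixel of the output resolution regardless of what the monitor ends up showing, and 1080p has 2.25 times as many pixels per frame as 720p. A minimal Python sketch of that arithmetic:

```python
# Why rendering at 1080p is heavier than 720p: the GPU shades every pixel
# of the render resolution, no matter what the monitor ultimately displays.
pixels_720p = 1280 * 720     #   921,600 pixels per frame
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame

print(f"1080p pushes {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")
# -> 1080p pushes 2.25x the pixels of 720p
```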
 
You should set Windows to a desktop resolution of 1280x720 to use it with that monitor.

VGA, because it's analog, doesn't work in pixels. It works via timings. Every x microseconds it modulates the signal to draw out an entire row of pixels, then proceeds to draw the next row. Every y milliseconds it deems that an entire frame is complete and starts over. These are the horizontal and vertical scan rates you'll see specified on the monitor. Those specs are the maximum scan rates (which correspond to the highest resolution); the monitor can switch to lower scan rates to support lower resolutions.
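
As a rough illustration of those timings, here's a back-of-the-envelope Python sketch that turns a resolution and refresh rate into approximate scan rates and a pixel clock. The blanking overheads are assumed round numbers, not values from any real standard (actual modes come from CVT/GTF timing tables), so treat the output as ballpark figures only.

```python
# Rough back-of-the-envelope VGA timing estimate.
# The 25% horizontal and 5% vertical blanking overheads are assumptions,
# not values from any real standard (actual modes use CVT/GTF timing tables).

def estimate_timings(width, height, refresh_hz,
                     h_blank_frac=0.25, v_blank_frac=0.05):
    total_pixels_per_line = width * (1 + h_blank_frac)   # active + blanking
    total_lines_per_frame = height * (1 + v_blank_frac)  # active + blanking

    v_scan_hz = refresh_hz                         # frames drawn per second
    h_scan_hz = v_scan_hz * total_lines_per_frame  # rows drawn per second
    pixel_clock_hz = h_scan_hz * total_pixels_per_line

    return h_scan_hz, v_scan_hz, pixel_clock_hz

for mode in [(1280, 720, 60), (1920, 1080, 60)]:
    h, v, px = estimate_timings(*mode)
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz: "
          f"~{h / 1e3:.1f} kHz horizontal, {v} Hz vertical, "
          f"~{px / 1e6:.0f} MHz pixel clock")
```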

So to answer your "what if" question, IF the monitor's VGA input supported scan rates capable of 1920x1080, and you fed it a 1920x1080 signal, the monitor would handle the scaling down to 1280x720. It would probably be very ugly and blurry, since any monitor old enough to be VGA-only probably doesn't have the newer, more powerful scalers built in.

But the more likely case is that the monitor is only designed to accept scan rates which result in 1280x720 resolution, since there was no reason to make it capable of higher scan rates. In this case, if you feed it a 1920x1080 VGA signal, the monitor will show a blank screen, display a "signal out of range" error, or destroy itself (that last one was a concern with old CRTs; it's very rare with LCDs).
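
Conceptually, the "signal out of range" case is just the monitor comparing the incoming scan rates against the limits of its electronics. A hypothetical sketch of that check (the 46 kHz / 75 Hz limits below are made-up numbers for a 720p-class VGA panel, not real specs):

```python
# Hypothetical sketch of the check a monitor effectively performs: if the
# incoming scan rates exceed what its electronics support, it shows
# "signal out of range" (or nothing at all). The limits are assumed values.

MAX_H_SCAN_KHZ = 46.0   # assumed horizontal scan limit for a 720p panel
MAX_V_SCAN_HZ = 75.0    # assumed vertical (refresh) limit

def monitor_accepts(h_scan_khz, v_scan_hz):
    return h_scan_khz <= MAX_H_SCAN_KHZ and v_scan_hz <= MAX_V_SCAN_HZ

print(monitor_accepts(45.0, 60.0))   # ~1280x720@60  -> True
print(monitor_accepts(67.5, 60.0))   # ~1920x1080@60 -> False ("out of range")
```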

Upscaling is for when your Windows desktop is 1280x720 and you want to display it on a 1920x1080 monitor. The scenario you've outlined is the opposite.
 
Solution


Okay, let's say the adapter automatically scales the 1920x1080p signal to fit correctly on a 1280x720p display. Now, if the analog signal is 'taught' how to display 1920x1080p by the adapter, would the monitor be able to display the image correctly, as if the resolution were actually higher? In other words, would the upscaled image actually be 1920x1080p? (As if the GPU had scaled it, perhaps?)
 

Dugimodo

No, it'll look bad, because there are not enough pixels to display 1920x1080, so some detail will be lost.
What you are describing would be downscaling, not upscaling.

I'm not even sure such a device exists, but you wouldn't want to use it if it did. Instead, you would just set the output to the correct resolution, as solandri said. HDMI can do 720p just fine; it's not locked to 1080p.
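
If you want to see the detail loss Dugimodo describes, a quick sketch with the Pillow imaging library makes it visible: a 1920x1080 test pattern of single-pixel stripes can't survive being squeezed onto a 1280x720 grid. The test pattern and the LANCZOS filter choice are just illustrative assumptions about what a scaler might do.

```python
# A small sketch (Pillow) of why downscaling 1920x1080 to 1280x720 loses
# detail: single-pixel features can't map cleanly onto a grid with only
# two-thirds as many pixels in each direction.
from PIL import Image

# 1920x1080 test pattern: alternating 1-pixel black/white columns.
src = Image.new("L", (1920, 1080))
src.putdata([255 if x % 2 else 0 for _ in range(1080) for x in range(1920)])

# Downscale to the monitor's native 1280x720, as a scaler would.
dst = src.resize((1280, 720), Image.LANCZOS)

# The source has exactly two tones; the downscaled copy smears them into
# many intermediate greys because the stripes no longer fit the pixel grid.
print("distinct tones before:", len(src.getcolors()))  # 2
print("distinct tones after: ", len(dst.getcolors()))  # many more than 2
```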
 

Rogue Leader (Moderator)

No.

A 720p screen is a 720p screen. It can display a 1080p image, at full size and shape, but it's downrezed (for lack of a better word) to 720p.

In this case it may not even work at 1080p; you may (and probably do) need to set the resolution to one that your monitor will display.