Can I convert a DVI cable to VGA?

August 17, 2009 4:43:52 AM

My PC only has a VGA input and I want to plug it into my TV which only has a DVI input (No HDMI either). What are my options for this? I'm afraid getting a new graphics card is my only option.

Thanks in advance guys :hello: 

August 17, 2009 5:17:34 AM

Yeah, I just found out I can use an adapter. Would using a DVI to VGA adapter result in a loss of graphics quality?
August 17, 2009 5:44:46 AM

Yes, but you have to see it for yourself; some people don't notice it at all (I am spoiled by good graphics).
August 17, 2009 6:00:18 AM

g0tvtec said:
My PC only has a VGA input and I want to plug it into my TV which only has a DVI input (No HDMI either).


Correct me if I'm wrong: you're going to use a DVI to VGA adapter on what? You said your PC has a VGA input? You must mean output, right?

Supposedly,

PC -> VGA output
TV -> DVI input

DVI to VGA means you would convert the TV input to the PC output?

Adapters only work with a digital (DVI) source to an analog (VGA) destination. Going the other way, you must have some sort of signal digitizer.
August 17, 2009 6:06:19 AM

No. Adapters only work analogue (DVI-A / DVI-I) to analogue VGA, and analogue VGA to analogue DVI-A. They don't work digital DVI-D to analogue VGA or vice versa, and they don't use the digital part of DVI-I.

For analogue VGA to digital DVI-D (the usual monitor input, except for a few DVI-A monitors) you need a converter box, not just an adapter.
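The rules above boil down to one principle: a passive adapter only re-routes pins, so both ends must carry an analogue signal; crossing the analogue/digital boundary needs an active converter box. As a minimal sketch (not from this thread; the names are illustrative, not a real API):

```python
# Passive-adapter rules from the post above: DVI-A is analogue-only,
# DVI-I carries both analogue and digital pins, DVI-D is digital-only.
# A passive adapter works only when both ends share the analogue pins.
PASSIVE_OK = {
    ("VGA", "DVI-A"), ("DVI-A", "VGA"),
    ("VGA", "DVI-I"), ("DVI-I", "VGA"),  # uses only DVI-I's analogue pins
}

def link_requirement(source: str, destination: str) -> str:
    """Return what hardware a source -> destination link needs."""
    if source == destination:
        return "plain cable"
    if (source, destination) in PASSIVE_OK:
        return "passive adapter"
    return "active converter box"  # e.g. analogue VGA -> digital DVI-D

print(link_requirement("VGA", "DVI-D"))  # the OP's case: active converter box
```

This is why the thread keeps steering toward a cheap card with a DVI output: the PC's VGA-only output to the TV's DVI-D input is exactly the case no passive adapter can handle.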
August 17, 2009 6:09:48 AM

It'd be cheaper to pick up a budget video card than to get a converter box.
August 17, 2009 6:13:45 AM

masterjaw said:
Correct me if I'm wrong: you're going to use a DVI to VGA adapter on what? You said your PC has a VGA input? You must mean output, right?

Supposedly,

PC -> VGA output
TV -> DVI input

DVI to VGA means you would convert the TV input to the PC output?

Adapters only work with a digital (DVI) source to an analog (VGA) destination. Going the other way, you must have some sort of signal digitizer.

Yes, I mean the PC has a VGA output only. My TV only takes RCA and DVI.
August 17, 2009 6:35:39 AM

TGGA is correct (as usual). You can't take an analog signal and turn it into a digital one using only an adapter - you need some sort of conversion circuit.

Agree with above. A cheap newer graphics card would probably be a better solution than finding a converter box.
August 17, 2009 6:40:13 AM

xc0mmiex said:
i'd be cheaper to pick a up a budget video card then getting a converter box

Actually, the adapter is much cheaper than a video card. It's selling for $7 shipped on eBay. However, if the adapter won't work, like the previous poster said, I will have to get a new video card.
August 17, 2009 6:43:12 AM

Oh sorry, didn't realize you were comparing a video card to a converter box. What is this converter box anyway? Can somebody give me a link to one?
August 17, 2009 6:44:18 AM

g0tvtec said:
Yes, I mean the PC has a VGA output only. My TV only takes RCA and DVI.


I would then prefer buying a cheaper GPU with DVI output than use the converter. At least I would have other benefits (gaming, etc) from it other than just converting signals to digital.

See the link that I gave you above for the converter.
August 17, 2009 6:46:14 AM

For a link to the converter see..... First reply to your thread. :hello: 

But really, just get a cheap card: about 10% of the cost and an easier solution, likely avoiding a few headaches too.
August 17, 2009 7:34:08 AM

And no worries about video loss due to analog. Just get the card.
August 17, 2009 4:52:59 PM

You should get something like an ATI 4xxx. You will need a PCIe slot (not AGP).

The newer cards have the advantage of hardware decoders to offload the task from the CPU. For example, a Blu-ray rip of Hellboy 2 @ 1080p (MKV, AVC) dropped from 65% CPU load on an AMD X2-4800+ to 3% (yep, THREE percent!).

The easiest way to use these hardware decoders is to install the K-Lite Codec Pack (Standard or Full). Use the included Media Player Classic Home Cinema (MPC-HC); you need to enable DXVA during setup or afterwards in the Options.

Again, look for a newer card that has H.264/AVC, VC-1 and MPEG2 hardware decoding (aka "UVD").

How powerful?

This depends on the following:
1) Budget
2) Power Supply
3) Purpose (gaming?)
4) CPU

For example, an X2-4800+ CPU can make use of up to about an HD4850, at which point most games are limited by the CPU.

All modern cards are pretty much identical for Windows multi-tasking. It only really matters for gaming (and connectivity such as HDMI).

I recommend an ATI HD4350 256MB, or higher in the HD4xxx series.

August 17, 2009 4:58:01 PM

DVI-I:

I thought I'd add that the DVI-I output actually carries TWO outputs. It has the pins for DVI (digital) and it has the pins for VGA (analog). The VGA "adapter" simply hooks up to the analog pins only. The DVI cord does the reverse: it hooks up to only the DVI pins and not the VGA ones.

The VGA signal at the output of modern video cards is created by tapping off the DVI signal and sending it through a D/A (digital-to-analogue) chip, then out the pins. At the monitor, the VGA signal is then converted BACK to digital via an A/D chip.

Just FYI. It makes little viewable difference in reality, though it's best to stay DVI all the way if possible.
August 17, 2009 6:33:59 PM

Sounds good. I'm looking at the best cards for $50 to $100. I don't plan to game at all, really. I will be streaming a lot of movies, however.
August 17, 2009 7:08:29 PM

I am thinking of swapping out the Radeon 4650 in the new PC I bought from HP and putting it in my old PC, which I will be hooking up to the TV for streaming movies, since it has a DVI output. I can then buy a 4850 and use it in my new HP. The problem is that I don't know if it will void my warranty, considering I just purchased it. I will be using my new PC for gaming, and I have a 1920x1200 24" LCD.

Hmm.. Decisions, decisions..
August 17, 2009 10:17:11 PM

photonboy said:
The DVI cord does the reverse, it hooks up to only the DVI pins and not the VGA ones.


Depends on the cord; some pass the analogue pins as well (which is required for monitors like my IBM P260 here at work, which uses DVI-A), and will do VGA to DVI-A or DVI-I to DVI-A.

The main thing is that it doesn't convert; whether the analogue or digital information gets passed depends on both the adapter and the cable.
August 18, 2009 4:10:29 AM

I decided that I will just keep the video card in my new PC and just upgrade the one on my old PC so that I can use it with my TV. What is the best video card for $75 or under for watching streaming movies?
August 24, 2009 5:04:51 PM

See my comment above where I mention:
-ATI HD4350 256MB fanless
-K-Lite FULL codec pack (MPC-HC player)