I've been scouring the interwebs trying to find a solution to my problem, but so far I've failed. So I come to you guys, hoping one of you might have a solution.
I got hooked on Game of Thrones this past weekend and hooked my computer up to my HDTV via VGA to watch some more episodes. First I tried my DVI-to-VGA adapter on my GPU, with the VGA cable running from the adapter to the TV, and it didn't work. Since that failed, I figured I couldn't connect the TV to my graphics card, so I unplugged the DVI adapter and ran the VGA cable to my motherboard's video output instead, and that worked perfectly. So in my head I was like, 'Sweet, I can keep my GPU connected to my monitor and my TV connected to the motherboard, and switch back and forth as needed without disconnecting/reconnecting the monitor or TV.' (For reference, I'm using a DVI cable between my GPU and my monitor.)
Sure enough, when I wanted to switch back to using my monitor, it wouldn't work. I turned the TV back to the VGA input and there it was, working as the primary display. So I disconnected the VGA cable between my motherboard and the TV and re-seated the DVI cable on my GPU, but the monitor wouldn't pick it up. Then I took the VGA cable and connected my monitor to the GPU through the DVI-to-VGA adapter, and still no luck. The only way I can use my monitor now is by connecting the VGA cable from my motherboard to the monitor, and that defaults to a low screen resolution.
I kept looking for a setting in Windows to select the GPU as the primary output, but I couldn't find anything. I also checked in the BIOS and had no luck finding how to switch the video output back to my GPU rather than my motherboard.
Sorry for such a long post, but I wanted to explain everything so you guys could point me in the right direction. I'll post my computer's specs/parts if needed. My next step is to order an HDMI cable and run that from my GPU to my monitor; I'll probably do that at work today.
I'm not sure what Lucid Virtu graphics is, so I don't know if I have it enabled. I'm running an Intel i5 3570K CPU on an Asus PZ77V-PRO mobo with a HIS IceQ X Radeon 7850 GPU. I've also gone back and updated my mobo and GPU drivers.
It's a capability your system supports that lets the integrated video be used when you don't need the horsepower of your stand-alone GPU, and, when that horsepower is needed, lets the dedicated GPU's output pass through the integrated video.
The reason I ask is that you imply you can get video out of the integrated graphics but not out of the dedicated card.
If you go into your BIOS, disable the on-board/integrated video, and then connect your display to the video card, you should see the display properly.
Thank you very much, man, but this only half-fixed my problem. I can now use my TV/monitor with the VGA cable connected to my mobo's on-board video, but it still bypasses my dedicated GPU when I use the DVI cable. I haven't yet tried the VGA cable with the DVI-to-VGA adapter on the GPU; I'll try it after I post this.
Also, this was my first build; I put it together about three months ago. I'm not an expert, but I am pretty tech-savvy.
Like I said, I'm fairly new to PC building, so I'm not sure what Virtu is; is it common on Asus motherboards? And I'll definitely stick to just my monitor if I can ever get the display output to revert back to my graphics card. Do you know how to get my computer to do that? My GPU's fan is still spinning, but I haven't re-seated the graphics card yet. I wanted to try a new DVI cable or an HDMI cable first; I've already ordered an HDMI cable through work, so hopefully I'll receive it before the weekend. Any help is appreciated.
With your display connected to the on-board video, boot the system and press DEL to enter your BIOS. Locate your video settings; there, either disable the on-board video adapter or make the PCIe video card the primary video device (different vendors offer different options). Save the settings and power the system down.
Now reconnect the display to the stand-alone video card and power the system on. The HD 7850 should now be the primary video device, and it should output to your display.
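For what it's worth, on Asus Z77-era boards this setting usually lives under the Advanced tab rather than the PCIe speed options. Exact menu names vary by BIOS version, so treat this as a rough sketch of where to look, not a guaranteed path:

```
Advanced
└─ System Agent Configuration
   └─ Graphics Configuration
      ├─ Primary Display     [Auto / iGPU / PCIE]
      └─ iGPU Multi-Monitor  [Enabled / Disabled]
```

Setting Primary Display to PCIE (called PEG on some boards) should make the slot-mounted card the boot display; disabling iGPU Multi-Monitor keeps the integrated graphics from claiming outputs.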
Alright, I just got home from work and tried to make my PCIe video card the primary, but I'm having trouble finding where to do that in the BIOS. For reference, I'm running Asus BIOS 1015, and in Advanced Mode it says American Megatrends V2.10.1208, if that matters.
There was a PCI-Exx selection, but it can't be disabled; the only options are Auto, X2, and X4, which are performance related. I also found the Intel OPROM mode under the Advanced tab (not sure what it does; I figured it stood for on-board memory) and disabled it. That's what I did last night, and it gave me 1920x1080 resolution over the VGA cable from my on-board video.
I tried connecting the HDMI cable, and also a new DVI-to-HDMI cable I was able to get from work, to my dedicated GPU, and it still doesn't work. Is there an easy way to change my primary video output? Thanks again; I hope you can help me fix my problem.