I'm having some problems with this HDMI setup. My current rig isn't extravagant, but it's all I have at the moment.
I've been playing Skyrim, and although the graphics look great, I want it to look even better.
My PC Specifications:
Power Supply: Thermaltake 650 W
Motherboard: BIOStar AM3 socket
Processor: AMD Athlon II X2 Regor
Memory: 6 GB DDR3-1600 Crucial Ballistix Sport
Video Card: EVGA 9800 GTX (dual 6-pin)
Drives: 120 GB WD SATA 6 Gb/s SSD / 500 GB WD SATA 6 Gb/s 7200 RPM HDD
Blu-ray Drive: Sony BD-R/RW
Operating System: Windows 7 Ultimate 64-bit
I have a pretty good overclock profile, too: the CPU is stable at 3.8 GHz, the memory at 1845 MHz, and the video card is stable at 800 MHz core, 1350 MHz memory, and 2000 MHz shader, which is pretty damn good for a 9800 GTX. I also replaced the old thermal grease with Arctic Silver 5 on top of the GPU, the GPU memory chips, and the GPU capacitors.
Now I can run Skyrim on Ultra settings at 1366x768 with FXAA, all view distances maxed out, antialiasing at 4x, and anisotropic filtering at 4x.
My display is just an RCA LCD flat-screen HDTV that advertises 720p; it only has a VGA input and an HDMI input. My video card has two DVI outputs and an S-Video output, so I have been using a DVI-to-VGA adapter to feed the TV's VGA input.
As I understand it, single-link DVI and HDMI carry the same digital video signal, but I feel that I have been dumbing the video down by running it through a DVI-to-VGA adapter and the TV's analog input. Since there's an HDMI input on my TV, I would like to use that instead of VGA; there's no DVI input on the TV, and I don't plan on getting a different one.
So I went and swiped a gold-plated DVI-to-HDMI adapter by GE and a high-quality gold-plated Monster HDMI cable. I hooked everything up and voilà: I get 720p video when the PC boots and shows the BIOS screen, but once it boots into Windows the video disappears. I hooked the VGA back up and there's video again, and with both connections attached at once I get video on the TV's VGA and HDMI inputs simultaneously.
I've messed around with the settings in the Windows screen resolution dialog and in the NVIDIA Control Panel, and every time I change the HDMI resolution from 1024x768 to pretty much anything else, I lose video.
I don't know if I'm missing or overlooking something. I really want to use HDMI as the primary output because a pure digital signal gives better color and contrast than VGA, and just better video all together. Audio isn't a concern because I output it through the motherboard's optical out to my surround-sound system.
I would like to resolve this and get my video onto HDMI in single-display performance mode, ideally at 1920x1080, so I can play Skyrim with excellent video.
I figured it out. Apparently this display was manufactured in 2007, and the highest resolution it accepts over HDMI is 1366x768. That resolution didn't show up in the list, so I had to create it as a custom resolution in the NVIDIA Control Panel: 1366x768 @ 60 Hz, 32-bit color.

The difference is quite noticeable: the picture is so much clearer and crisper compared to VGA. Skyrim looks way better now. I was looking at a comparison of the PS3/360 versions of Skyrim against the PC version maxed out, and let me tell you, the consoles are so lame; people using that hardware are really missing out on all of the nice details and enhancements of the PC version. PC is always so much better.
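As a side note, any custom mode like this has to fit under the 165 MHz pixel-clock limit of single-link DVI/HDMI. Here's a rough sanity check in Python; the blanking figures are my own assumptions, loosely based on CVT reduced-blanking conventions, not values read from this TV's EDID.

```python
# Rough sanity check that a custom display mode fits within the
# single-link DVI / HDMI 1.x TMDS limit of a 165 MHz pixel clock.
# The default blanking values are assumptions (CVT-RB-style), not
# timings taken from any particular monitor's EDID.

def pixel_clock_hz(h_active, v_active, refresh_hz, h_blank=160, v_blank=30):
    """Estimate the pixel clock: total pixels per frame times refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz

SINGLE_LINK_LIMIT_HZ = 165_000_000

clock = pixel_clock_hz(1366, 768, 60)
print(f"1366x768 @ 60 Hz needs roughly {clock / 1e6:.1f} MHz pixel clock "
      f"({'OK' if clock < SINGLE_LINK_LIMIT_HZ else 'too fast'} for single link)")
```

With these assumed blanking figures the mode lands around 73 MHz, comfortably under the single-link limit, which is consistent with it working once the custom resolution was defined.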