ATI DVI TO HDMI ADAPTER
Hello, I have a quick question, here it goes. I got an ATI 3870 card and it has a DVI to HDMI adapter included with it. Can I use the ATI DVI TO HDMI ADAPTER that was included to connect to my NVIDIA 9800 GTX???? I just got a new monitor with the HDMI slot and thought hhhmmm. Thanks for your help. Cronydog
Yes and No:
First of all, this HDMI adapter has caused a lot of confusion because there is no complete audio solution on the video card. The audio is essentially just routed through extra pins on the card's DVI port and out the audio portion of the ATI HDMI dongle.
This IS a proprietary HDMI dongle. Will it work? You will almost certainly get video, since the DVI-I video output is standard, but you will NOT get any audio.
Two other important points:
1) You need to toggle between your regular audio card and the "ATI Hi Def" device for HDMI audio to work. The hassle really isn't worth it, and I would much rather have seen an adapter for an audio card's digital output.
2) Most DVI outputs are DVI-I, meaning they carry the wires for both digital and analog video. Some newer HDTVs now have a PC-HDMI input, but most have only a regular HDMI input. Some don't even have a PC-VGA input.
I use the PC-VGA input for my Sony Bravia. I can't tell the difference between analog and digital really but the PC input gives me all the proper resolutions like a PC monitor would.
If you use the regular HDMI input, usually with a DVI to HDMI (no sound) adapter you MUST conform to the standard of 480i, 480p, 720p, 1080i/p.
1080p: You MUST choose 1920x1080, and only this resolution, if your video card supports it. You can see the issues for gamers right away.
720p: You MUST choose 1280x720. However, most smaller HDTVs use 768-line panels (native panel resolutions are generally either 1920x1080 or 1366x768). Since you still have to output 1280x720 for 720p, all of your smaller text will be blurry. The only real workaround is to set your HDTV to use only 1280x720 physical pixels (1:1 pixel mapping), which results in black bars all around the screen.
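The blur comes down to simple arithmetic: stretching 1280x720 onto a 1366x768 panel means non-integer scale factors, so the TV has to interpolate pixels. Here's a tiny Python sketch of that math (the function name is mine, just for illustration):

```python
# Why a 720p signal looks blurry on a 1366x768 panel:
# each source pixel must be stretched by a non-integer factor,
# so the TV blends neighboring pixels when it scales.

def scale_factors(src, dst):
    """Return (horizontal, vertical) stretch factors from src to dst."""
    return (dst[0] / src[0], dst[1] / src[1])

panel = (1366, 768)        # common "720p" HDTV native panel
signal_720p = (1280, 720)  # the 720p signal the TV accepts

h, v = scale_factors(signal_720p, panel)
print(f"720p on 1366x768: {h:.4f}x horizontal, {v:.4f}x vertical")
# Neither factor is a whole number, so small text gets smeared.
# 1:1 pixel mapping (black bars) avoids the scaling entirely.

h2, v2 = scale_factors((1920, 1080), (1920, 1080))
print(f"1080p on 1920x1080: {h2:.1f}x, {v2:.1f}x (pixel-perfect)")
```

On a true 1920x1080 panel both factors are exactly 1.0, which is why 1080p-native sets don't have this problem.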
Since you mention this, I'm wondering if you are using a HybridPower capable motherboard? Anyway, at this stage of the game the only PC I recommend is a solution with a Gigabyte or Asus HybridPower motherboard and 2xSLI 9800GTX. The software will improve and the complete power down of the two 9800 GTX cards will need less manual intervention as time goes on.
NVidia plans to make ALL motherboards like this, with an onboard graphics solution. I'd like to see NVidia offer newer cards with NO outputs on them except hot air, as well as stripping out BluRay decoding and anything else not needed for hardcore gaming, since this will be handled by the motherboard and is redundant (for those with the right motherboards).
It would be nice to see full audio output via HDMI as well as a separate digital link for receivers.
Here's my long answer, but you sounded like you could use a little PC help.
I said your DVI-I output was standard. What I meant was that it carries standard digital and analog video signals. ATI obviously added extra wires for the audio to match up with the adapter, and those wires are NOT present on most other video cards. Your 9800 GTX has only one grouping of wires for the digital (DVI) output and another grouping for the analog (VGA) signal when an appropriate adapter is used.
The digital video is created on the video card, and just before output it is split in two, with one path going into a Digital-to-Analog (D/A) converter. The DVI-I output contains both groupings of wires, but only the digital or the analog signal is selected at any one time.
The audio for the ATI's HD3870 is now a third grouping of wires. Only audio in a suitable digital format for HDMI will work. Again, this solution not only sucks but caused way, way too much confusion.
Vista broke support for EAX. Creative has a workaround so anyone interested should do some reading. There is a list of games, but I believe non-listed games will work too. It looks as if the tool has dropped from $10 to free (at least for the Audigy).
I periodically install Vista to check whether it's worth upgrading, since I'm a gamer. As of now I'm still using XP Pro, and I see little reason to bother with Vista despite its improvements.
I need none of Vista's extras. The much-touted DX10 currently provides little actual visual improvement. In fact, all current DX10 games are so demanding that nobody can play them at full quality at 60 FPS, so the same game will run much BETTER USING XP and DX9. (This will change.)
Word has it that game developers are eventually going to drop Direct X in favor of using their own solutions. This also means a Microsoft OS may not be needed at all!
With Sony letting game developers target keyboard/mouse and console graphics improving, the PS4 is going to seriously wound PC gaming. People are getting fed up with all the PC gaming issues. The PS4 will be just awesome: inexpensive, with BluRay/DVD of course, not to mention other options, probably including a half-decent Linux solution. The PS4 really is a PowerPC computer at heart.
The PS3 is already here, of course, but the PS4's graphics will be better, and many of the people not satisfied with the PS3's graphics will like the PS4's. Yes, the PC is always capable of providing more graphical power, but at what price? Game developers love coding for a single fixed device, and who wants to mess around with settings and haunt support forums over all their stuttering and other PC issues?
I guess I'm saying that I'm a PC gamer like you, but this is my last PC if there are some good keyboard/mouse games for the PS4. I'm still playing NWN a lot, so I've realized that the absolute best graphics aren't the most important part of a game. The PS4 will be "good enough" provided the games are there. See you on the other side.
Sorry to babble...
HD3870 512MB, 2GB, X24800+, XP Pro