After installing ATI driver, no signal after XP boot screen

wookietv

Distinguished
Jun 6, 2008
Hi,

I've searched every forum I could find for my problem, so I finally decided to post and see if anyone can come up with an answer.

Firstly, here's a list of the components in the PC I just put together as a media center:

Graphics Card: ASUS EAH3450/HTP/512M Radeon HD3450 512MB
MB: Shuttle SG33G5 Socket 775 Barebone
RAM: OCZ Platinum Revision-2 2048MB PC6400 DDR2 800MHz Dual Channel Memory (2 x 1024MB)
CPU: Intel Core 2 Duo E4600 Allendale 2.4GHz

Its primary function is just to hold all of my DVDs on the hard drive to be used as "Movies On Demand". The graphics card has VGA and DVI outputs. I have a DVI-to-HDMI adapter sitting on the graphics card and an HDMI cable running from that to the HDMI input on my 32" LCD TV.

So, here's my problem.
I get a signal through to the TV as the PC starts up through the POST screens... I get a signal as XP Home SP2 boots up (while the blue bars scroll through)... but as soon as that screen goes away, right where it would go to the Welcome screen, I lose all signal to my TV. It started happening as soon as I installed the ATI drivers. With the stock XP graphics driver I am able to get a signal; it looks terrible, but at least I can see it's working. I then go into Safe Mode and System Restore back to the stock driver, and I get a signal again.
I've tried the drivers that came with the video card as well as the latest off ATI's site, and both produce the same result.
Oh, and connecting VGA to VGA does the same thing... it works until I install the ATI driver, then the signal cuts out at the Welcome screen.
Does anyone here have any experience with such a problem?
I appreciate any assistance that can be offered.
Thanks
 

Wolfshadw

Titan
Moderator
Plug your system into a monitor and install the drivers for the graphics card. Change the resolution setting to match what your TV can handle. Note that not all 720p TVs can initially handle a resolution of 1366x768; some have to be set lower (1280x720 or something close) at first.
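
If you happen to have Python on the machine, here's a rough sketch of a convenience check (nothing more): it calls the standard Win32 EnumDisplaySettings API through ctypes to list every mode the driver exposes for the primary display, so you can confirm a mode like 1280x720 is actually on offer before you switch to it.

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DEVMODE(ctypes.Structure):
    # Flattened DEVMODEW layout; the display variant of the union is
    # spelled out so the dmPels* fields land at the right byte offsets.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPosition_x", wintypes.LONG),
        ("dmPosition_y", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

ENUM_CURRENT_SETTINGS = -1

mode = DEVMODE()
mode.dmSize = ctypes.sizeof(DEVMODE)

# Print the mode currently in use on the primary display.
user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(mode))
print("current: %dx%d @ %d Hz" % (mode.dmPelsWidth, mode.dmPelsHeight,
                                  mode.dmDisplayFrequency))

# Walk every mode the driver exposes and print each unique one.
i = 0
seen = set()
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    entry = (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency)
    if entry not in seen:
        seen.add(entry)
        print("%dx%d @ %d Hz" % entry)
    i += 1
```

If 1280x720 shows up in that list, it's safe to pick it in the driver's control panel.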

-Wolf sends
 
If your primary display is an HDTV, getting the new card to recognize it over HDMI is troublesome without first installing the card's driver and using it to "force detect" the HDTV. That conventional LCD or CRT monitor you connected earlier will let you see what you're doing while configuring the driver. Connect your HDTV, using the dongle and HDMI cable, to the other DVI port on the card, and make sure the HDTV is on and set to its HDMI input.
Boot up, and you should reach the desktop on the PC monitor; install the graphics driver and reboot. Then, if you're using an ATI card, right-click the desktop, open the Catalyst Control Center, and, in Advanced view, hit Detect Displays (in the Displays Manager menu) to locate your HDTV. Right-click the icon for the HDTV once it appears, and choose the Clone option; this will duplicate the desktop on both displays. (If you have an nVidia card and the HDTV is not detected in the equivalent driver utility—the nVidia Control Panel—click "My display is not shown in the list" and select Rigorous Display Detection on the subsequent screen, as well as "Force television detection on startup.") Once the HDTV is detected via HDMI, detach the monitor from its DVI port and set the HDTV to its native resolution.
Next, set up your audio output. The HD 3450 card carried our PC's audio signal to our HDTV over the HDMI cable, and we rerouted that audio to our simple two-speaker stereo system via the audio output jacks on our HDTV. If you have an elaborate home theater surround system with an A/V receiver, however, you might prefer to use the audio outputs on your PC proper, depending on your TV's ability to pass through the signal. Either way, in Vista, right-click the volume icon in the taskbar, choose Playback Devices, highlight the audio output you want the PC to use (HDMI, S/PDIF, or analog), and hit Set Default.
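
If you want to sanity-check the detection step without trusting your eyes, a similar Python sketch (again assuming Python is installed; it calls the Win32 EnumDisplayDevices API through ctypes) will list the display devices the driver has registered and flag which ones are attached to the desktop, so you can confirm the HDTV made it in before you unplug the monitor.

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# StateFlags bit meaning "this device is part of the desktop".
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

dev = DISPLAY_DEVICE()
dev.cb = ctypes.sizeof(DISPLAY_DEVICE)

# Walk the adapters/outputs Windows knows about and show their state.
i = 0
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    print("%s | %s | attached to desktop: %s"
          % (dev.DeviceName, dev.DeviceString, attached))
    i += 1
```

After cloning, both the monitor's output and the HDTV's output should show up as attached to the desktop.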

 

wookietv

Distinguished
Jun 6, 2008
OK, so basically the root problem is using an HDMI connection to a TV as the sole monitor?
Just to clarify, the video card has only 1 DVI port and 1 VGA port. From what you wrote, it sounds like you were thinking there are 2 DVIs.
http://images17.newegg.com/is/image/newegg/14-121-250-S02?$S640W$
That's the card's connections.

So, as an amendment to your suggestion: I should connect the dongle to the DVI port and connect that to the TV via HDMI, and connect the conventional LCD to the VGA port. Start Windows with the stock drivers with the VGA port as the primary, install the ATI driver, and reboot. Then clone the display to the TV, and once it's detected over HDMI, remove the conventional LCD from the VGA?
For the audio, I use the PC outputs to my 5.1... that works fine...
Thanks
 
Guest


Hi,
I pretty much followed everything you mentioned, but I have the problem of losing the signal after the Windows XP splash screen no matter what I do (after installing the ATI drivers). Any help would be much appreciated.
 

opr4492

Distinguished
Dec 9, 2009
FOR ALL ----> I have a Pentium 4 with a Radeon 3650 AGP card. As soon as XP started to load, I got "no signal". It turned out to be the "Spread Spectrum" option in the BIOS. TURN IT OFF!!! That fixed the issue.
This option is also no good if you are trying to overclock the CPU.