Trying to connect a Radeon 5870, can't get 1680 x 1050

Guest
Please can you help me. I'm not very techie.

I've just purchased a Dell 7100 with a Radeon 5870.

I'm trying to connect this to a Samsung 2053 monitor.

Its native resolution is 1680 x 1050; however, the nearest option I can get is 1440 x 900.

The monitor has VGA and DVI-D connections, while the PC has HDMI and DVI-I connectors.

I'm connecting the two using a VGA cable with a DVI-I adapter. Is this where I'm going wrong? Should I be using an HDMI-to-DVI cable?

I've managed to select the 1680 x 1050 resolution using the "show all valid modes" button; however, that only displays at 30 Hz instead of 60 Hz.

Please please help.


Thanks
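
As an aside on the 30 Hz problem: here is a minimal sketch (Python on Windows, using only the Win32 EnumDisplaySettingsW call via ctypes; this is just an illustration, not an official AMD or Samsung tool) that lists every mode the graphics driver is offering, so you can see whether 1680 x 1050 at 60 Hz appears at all:

```python
# Minimal sketch: list the display modes Windows reports for the
# primary display, to check whether 1680x1050 @ 60 Hz is offered.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW layout; enough for display-mode queries,
    # since EnumDisplaySettingsW honours the dmSize we pass in.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmOrientation", ctypes.c_short),
        ("dmPaperSize", ctypes.c_short),
        ("dmPaperLength", ctypes.c_short),
        ("dmPaperWidth", ctypes.c_short),
        ("dmScale", ctypes.c_short),
        ("dmCopies", ctypes.c_short),
        ("dmDefaultSource", ctypes.c_short),
        ("dmPrintQuality", ctypes.c_short),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
i = 0
# iModeNum counts up from 0; the call returns 0 when modes run out.
while ctypes.windll.user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz")
    i += 1
```

If 1680 x 1050 only ever shows up at 30 Hz in that list, the driver believes the current link can't carry 60 Hz, which usually points at the cable or adapter rather than the panel itself.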



 
Guest
All I had was a VGA cable and a DVI-D cable.

The DVI-D cable fits in the monitor but not in the PC.

The new PC came with a VGA-to-DVI-I adapter, so I used that.

Is that where I'm going wrong?
 
Guest
That all sounds complicated!

Why can't there be standard connectors like before?

I'm really lost with all this.
 
All you have to do is make this:
[image: dual-link DVI-D male connector]

look like this:
[image: single-link DVI-D male connector]
 
Just get a DVI cable (monitors usually come with them if they support DVI). It's the way to go anyhow!!!

Besides, at 1440x900 that ATI 5870 is a massive waste; you really need a higher resolution to get your money's worth. Even at 1680x1050 you'll be playing everything with all the settings maxed out.

VGA is quickly going the way of parallel printer cables. :)
 
Guest
Thanks so much for all your advice.

I have a DVI cable (as in the top pic) which fits in the monitor, but the PC socket has different holes (the blade bit has two holes either side of it). I think it's DVI-I.

All these different DVI variants are confusing me.
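
Since the variants are confusing, the physical compatibility rules are simple enough to write down. The little Python sketch below is just a memory aid (the names are invented for illustration): a DVI-D plug has no analogue pins, so it seats happily in a DVI-I socket, but a DVI-I plug won't fit a DVI-D socket.

```python
# Memory aid: which DVI male plug physically fits which female socket.
# DVI-I sockets carry the digital pins plus 4 analogue holes round the
# flat blade; DVI-D sockets have no analogue holes at all.
PLUG_FITS_SOCKET = {
    # plug type -> sockets it will physically seat in
    "DVI-D": {"DVI-D", "DVI-I"},  # no analogue pins, fits either
    "DVI-I": {"DVI-I"},           # analogue pins need the extra holes
    "DVI-A": {"DVI-I"},           # analogue-only, also needs those holes
}

def fits(plug: str, socket: str) -> bool:
    """True if a `plug` male connector seats in a `socket` female port."""
    return socket in PLUG_FITS_SOCKET.get(plug, set())

# The situation in this thread: a DVI-D monitor cable into the card's
# DVI-I socket -- it fits, even though the holes look different.
assert fits("DVI-D", "DVI-I")
```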
 
Guest
And the adapters that came with the PC have two pins either side of the blade.
 
Guest
Yes that is exactly what I have.

The DVI cable doesn't seem to be the same as the sockets on the PC, as they have two holes either side of the blade.

Sorry for the delay in replying; I'm in the UK.

I really appreciate your help.

Thanks.
 
Guest
I've tried updating the drivers; Windows says they are up to date.

Do you think it could be a problem with the graphics card?

The reason I ask is that when I load AMD CCC, I get very few options (even in advanced mode), nothing like what I have read about on the internet. I'm wondering whether this is because the display isn't at the correct resolution, or whether it's a problem with the card.

This looks like the adapter I am using with the VGA cable: http://www.belkin.com/IWCatProductPage.process?Product_Id=108267

My old crappy GeForce 5200FX displayed 1680 x 1050 OK on this monitor, but this 5870 won't.

I saved up for ages to buy myself this new computer, and I'm so frustrated.

What's the best bet... buy a DVI-D-to-HDMI cable and see if that improves things?

Thanks again.
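
On the driver question: one way to see what driver the card is really running (rather than trusting Windows Update's "up to date" message) is to ask WMI. A minimal sketch, assuming a Windows box with the standard wmic tool on the path:

```python
# Sketch: ask WMI what graphics adapter and driver version Windows
# actually sees. If it reports a generic Microsoft driver rather than
# an ATI Catalyst release, CCC will show very few options.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion,DriverDate"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

If that shows a stock Microsoft driver, installing the full Catalyst package from AMD's site will usually bring back the missing CCC options.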
 
Guest
From reading on the internet, I don't see why VGA with the DVI-I adapter doesn't work at 1680 x 1050. I feel like crying with frustration.

 
Guest


So I should just be able to plug the DVI cable that came with the monitor into the DVI socket on the PC (even if they look different, as illustrated in the image), and it should work?
 
So have you actually tried to plug the DVI cable in, or did you just look at it and decide that it looked different and thus wouldn't fit or work?
 
Guest
I tried it; the monitor didn't display anything, so I pulled the cable out, looked at it, and decided the connectors were different and so it would not work.

Then I tried the VGA lead with the DVI adapter and got the monitor to work (albeit at the wrong resolution), so I thought that must be the right route to take.

I will try the DVI cable again when I get home.
 
Guest
Maybe I'll pick up a new DVI cable on the way home, just in case.

Do I need a dual-link or a single-link one?

Thanks for your patience.
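
For what it's worth on the single-versus-dual-link question: single-link DVI tops out at a 165 MHz pixel clock, and 1680 x 1050 at 60 Hz needs well under that, so a single-link cable is enough for this monitor. A rough sketch of the arithmetic (the 25% blanking overhead is an approximation, not an exact CVT timing calculation):

```python
# Rough check: does a given mode fit in single-link DVI's 165 MHz
# pixel clock? Blanking overhead is approximated at 25%; real CVT
# timings differ a little, but the conclusion is the same here.
SINGLE_LINK_MAX_MHZ = 165.0
BLANKING_OVERHEAD = 1.25  # approximation for CVT-style blanking

def needs_dual_link(width: int, height: int, refresh_hz: float) -> bool:
    pixel_clock_mhz = width * height * refresh_hz * BLANKING_OVERHEAD / 1e6
    print(f"{width}x{height} @ {refresh_hz} Hz ~ {pixel_clock_mhz:.0f} MHz")
    return pixel_clock_mhz > SINGLE_LINK_MAX_MHZ

needs_dual_link(1680, 1050, 60)   # ~132 MHz -> single link is fine
needs_dual_link(2560, 1600, 60)   # ~307 MHz -> this is dual-link land
```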

 
Guest
To be honest I'm not sure.

I'm now clear that I can forget the VGA-with-DVI-adapter setup I've been trying. The DVI cable that I have should work between the monitor and the PC.

I will plug in the DVI cable when I get home and fiddle around with the monitor from there.

I know the monitor has a "source" button on it, but I couldn't get it to work last time.

Just wondering if the DVI input on the monitor is faulty; is that likely?


 
It wouldn't be unknown, but it is unlikely.
 
Guest
Well here's the update.

Got home and plugged the DVI cable in; the monitor would still not display anything. The power LED just flashes.

I brought home the Acer screen I use at work, and it worked!

Guess this means that the cable and graphics card are OK.

The monitor has a 3-year warranty, so I think I will see if I can get it replaced.

Do you think the DVI socket is faulty? Surely a monitor of that spec should otherwise work OK with my setup?
 
As I said before, it's not unknown, just unlikely. I install a lot of monitors and screens, and whilst I have had kit turn up with dead ports, it usually turns out to be a cable/connector issue or a device setting. Once those options have been checked, if the screen still doesn't work, it gets RMA'd.
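
That swap test is the whole diagnostic in a nutshell; written out as a toy sketch (the function and parameter names are invented for illustration):

```python
# Sketch of the swap-test reasoning from this thread: if the same
# PC + cable drive a *different* monitor correctly, the PC, card and
# cable are cleared, and suspicion falls on the original monitor.
def diagnose(works_with_other_monitor: bool,
             original_works_on_correct_source: bool) -> str:
    if works_with_other_monitor:
        if original_works_on_correct_source:
            return "Original monitor was just on the wrong input."
        return "PC, card and cable OK; RMA the original monitor."
    return "Suspect the cable, adapter or card before blaming the monitor."

# The outcome in this thread: the Acer worked, the Samsung still just
# flashed its power LED even on the right input -> RMA the Samsung.
print(diagnose(works_with_other_monitor=True,
               original_works_on_correct_source=False))
```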
 
Guest
The monitor came with a three-year warranty. Samsung are replacing it on Tuesday.

Fingers crossed that this fixes it.

Thanks again for your advice.