BenQ XL2420T DVI 144Hz problem

ChaosHydrA

Distinguished
Oct 13, 2013
I just purchased this monitor a day ago and only just now managed to get the damn thing to work on my PC. It worked fine with the PS3 over an HDMI connection, but there was no sound.

The PC seems to work fine, sound included, when I connect it with the D-SUB cable that came with it and attach an extension on the end that goes up to the graphics card.

However, I can't get the monitor to work with the included dual-link DVI cable at all. The monitor constantly displays "No Signal Detected" when I try this setup. Is there any way I can make it all work, or did I just waste a lot of money on a 144Hz monitor? I don't understand how a monitor can be so complicated to get working with a PC that's barely 1½ years old.

My PC system:

Windows 7
8GB RAM
Inno3D GTX 580 1.5GB DDR3 VRAM
 

ChaosHydrA

Distinguished
Oct 13, 2013
Yes, my old monitor was a Samsung 930BF from 2006, and it worked fine with that setup. For the last half year, though, I think it was dying: after a boot-up the desktop would randomly pop up anywhere from 1 minute to 1 hour later, and the power button was flickering in an erratic pattern, nothing like the slow one it had when it was new, so I presume it was dying. For a whole month it never came on at all, so I bought this new one. That monitor was connected the same way as this new BenQ: using a D-SUB cable with an attachment you screw onto the other end, which goes into one of the 2 D-SUB ports I have on my computer.

My DVI cable's computer end doesn't go to the graphics card, though; it goes into the motherboard, and that's the connection I'm getting the "No Signal Detected" message from.
 
So you're converting DVI to VGA (or D-SUB, as you call it) and connecting your monitor via a VGA cable?
It's not exactly clear how you are connecting to your monitor.

But looking at the connections on the monitor, that DVI port cannot accept an analogue signal; it's just plain missing the pins for the signal to travel across. So if you're plugging into the DVI port and VGA is involved in any part of the process, you won't get a signal. I suggest you get a dual-link DVI or DisplayPort cable (preferably the latter) and hook up the monitor directly; don't use adapters.
 

ChaosHydrA

Distinguished
Oct 13, 2013
Hi, sorry for the late answer, but I think I solved it with the DVI cable that came with the monitor. One of the reasons I didn't properly try it on the back of the graphics card, I think, is that the pin layout on the cable is different from the port.

The DVI port on the back of the graphics card has one wide slot with 4 small holes surrounding it, besides the 3-4 rows of many small holes next to it, while the DVI cable that came with the monitor only has the one wide blade with the 3-4 rows of pins next to it (i.e. it doesn't have the 4 smaller pins surrounding the wide slot). But I tried it anyway and it seems to work fine so far. Now I'm worried, though: is this correct, or will it cause damage to the PC/monitor in the long run?

The 2 graphics card ports on the back of my PC have pinholes that look like this example: http://www.mytrendyphone.dk/shop/henge-dock-mini-81333p.html

Notice the 4 small holes surrounding the big one: the DVI cable that came with the monitor doesn't have pins that can be inserted into those; it just has the big one and the rows of pins next to it.

 
Of course it's not going to cause damage. It's just that the graphics card has a DVI-I port and your monitor uses DVI-D. The four extra contacts around the wide blade on DVI-I carry the analogue (VGA) signal, which a digital DVI-D cable simply doesn't use. They're completely compatible; you're perfectly fine.

Also, bear in mind that if you have a graphics card, you're never going to get any video out of your motherboard, period.
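
One more thing worth checking now that the DVI-D connection works: Windows often defaults to 60Hz even on a 144Hz panel, so confirm the refresh rate under Screen Resolution → Advanced Settings → Monitor tab (or in the NVIDIA Control Panel). If you'd rather verify it programmatically, here's a minimal Python sketch using the real Win32 EnumDisplaySettingsW call via ctypes; the truncated DEVMODEW definition below is my own shortcut that maps only the fields read here, so treat it as an assumption rather than the full Windows struct.

import ctypes
from ctypes import wintypes

# Truncated DEVMODEW: fields mapped only up to dmDisplayFrequency.
# (Assumption: the 8 printer shorts pad out the 16-byte printer/display
# union, which keeps the offsets identical to the real Windows layout.)
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName",       wintypes.WCHAR * 32),
        ("dmSpecVersion",      wintypes.WORD),
        ("dmDriverVersion",    wintypes.WORD),
        ("dmSize",             wintypes.WORD),
        ("dmDriverExtra",      wintypes.WORD),
        ("dmFields",           wintypes.DWORD),
        ("dmOrientation",      ctypes.c_short),  # start of union padding
        ("dmPaperSize",        ctypes.c_short),
        ("dmPaperLength",      ctypes.c_short),
        ("dmPaperWidth",       ctypes.c_short),
        ("dmScale",            ctypes.c_short),
        ("dmCopies",           ctypes.c_short),
        ("dmDefaultSource",    ctypes.c_short),
        ("dmPrintQuality",     ctypes.c_short),  # end of union padding
        ("dmColor",            ctypes.c_short),
        ("dmDuplex",           ctypes.c_short),
        ("dmYResolution",      ctypes.c_short),
        ("dmTTOption",         ctypes.c_short),
        ("dmCollate",          ctypes.c_short),
        ("dmFormName",         wintypes.WCHAR * 32),
        ("dmLogPixels",        wintypes.WORD),
        ("dmBitsPerPel",       wintypes.DWORD),
        ("dmPelsWidth",        wintypes.DWORD),
        ("dmPelsHeight",       wintypes.DWORD),
        ("dmDisplayFlags",     wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

devmode = DEVMODEW()
devmode.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(devmode)):
    # On a working 144Hz hookup this should print e.g. 1920x1080 @ 144 Hz
    print(f"{devmode.dmPelsWidth}x{devmode.dmPelsHeight} "
          f"@ {devmode.dmDisplayFrequency} Hz")

If it reports 60 Hz over the new cable, the panel and cable are fine and it's just the Windows/driver setting that needs to be bumped up to 144.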