
Display choice problem!!!!

September 18, 2006 11:04:01 PM

After 5 days, I returned the Samsung 244T to the store. NFSMW looked and played amazingly, and the monitor seems perfect for every other kind of use, except FPS games. I mainly tested with FEAR, but most of the problems I describe apply to Halo
and Far Cry as well:

Moving forward/back was visually rock solid...

Turning left/right without moving (no direction buttons pressed) was terrible: choppy, blurry, tearing, distorted. V-sync helped,
but wasn't a complete fix.

Turning left/right while moving forward/back was much better than just turning left/right, but still not perfect.

I've experimented with various resolutions and settings using Nvidia's new control panel. The best playable combination, although text was huge and didn't look as good:

Lowest resolution available, 480x360
All other settings on full
V-sync on

Changing the refresh rate didn't seem to do much.

It was possible to play at native resolution, and it did 'look' better, but it was impossible not to finish bottom of the scoreboard. It was still very hard to see/focus when
panning left/right quickly, and not as stable as at lower resolutions. Using MouseWare and the in-game mouse settings it was possible to make the mouse more sensitive, to the point where it was too sensitive, so I'm not sure how a faster mouse would solve this problem. Can someone explain this?

Somehow in firefights the other player always seemed to have the edge, and I only made a kill if I caught them by surprise.

When I put my old 15" Relisys LCD beside the 244T, I could see while panning left/right that the little Relisys was solid;
indeed, panning left/right wasn't any different from any other kind of movement, unlike on the 244T.

So when I went back to the 15" after 5 days, all of a sudden my score shot back up. I was on a killing spree and in the top
3 in most rounds. I could also make kills while flying through the air, get 3 kills with 1 clip, and close combat was no problem. So much more control!

Now my little US$50 15" Relisys appears to be better than a US$1400 24" Samsung 244T for playing multiplayer FPS games.

I'm back to square one; I'm even thinking about a CRT now!

Does this problem sound like the well-known input lag issue?

What is the maximum size of an LCD display for gaming before problems start, and are these kinds of problems specific to LCD/TFT panels?

Which display should I get?


September 19, 2006 3:22:25 AM

You probably remember me from your old thread asking about this monitor. I'm sorry to say I experience none of the problems you're getting. I typically get about 100 fps in games, but that hardly has anything to do with the monitor. The only thing I had to get used to was how my mouse moved on a widescreen; I'm used to large resolutions that aren't wide. Playing in anything but the native resolution, though, I'm sure would cause some problems. The only complaint I have ever had about this monitor was where the USB ports are. Maybe you're right that they used a different panel, because I never see any blurring or anomalies. The max size of a gaming LCD for me would be about 30", since with anything bigger I'd have to turn my head from side to side too much, lol. Well, unless I moved it back a bit.
September 19, 2006 4:47:24 AM

Maybe your big panel just has a lot more pixels, and the graphics processor can't keep up.
September 19, 2006 11:25:06 AM

Yeah! I remember you. Looking at your spec, your system is quite a bit more powerful than mine, and thinking about what Mondoman said, maybe my system just can't keep up at the crucial times (close combat etc.), maybe a combination of my CPU and GPU...

The funny thing is that NFSMW seemed exactly the same in terms of stability/playability, just bigger with better colours and loads more detail, and of the games I mentioned this was probably the most fps-intensive...

Have you tried FEAR 1.07 MP, which is now free to download? It would be interesting to see how it performs on your system. If you like CS you will love FEAR.

Just to clarify: when you say native resolution, do you mean the setting in the game or on the display?

I tried all the in-game resolutions with various settings: the lower the resolution, the better it played but the worse it looked; the higher the resolution, the worse it played but the better it looked.
In the end I kept it at the same resolution as the Relisys, 1024x768; at any resolution much higher than this a black bar would appear at the bottom of the screen.

Anyway, here are some test results. I only wish I had tested the 244T at the same resolution as the Relisys to see the difference...

F.E.A.R. 1.07 benchmark, 24" Samsung 244T
1600x1200, settings: Full Detail
Min 24 fps, Max 101 fps, Avg 45 fps, GPU temp 50°C

F.E.A.R. 1.07 benchmark, 15" Relisys
1024x768, settings: Full Detail
Min 40 fps, Max 230 fps, Avg 90 fps, GPU temp 52°C
September 19, 2006 9:59:33 PM

Not a fan of CS, but I would try FEAR for you to see how it works. By native resolution I mean the native resolution of the monitor, which is 1920x1200; that's what everything, even your desktop, should be run at. I will look for a FEAR link and check it out to see how it runs. Any suggestions on settings I should try?


Edit: Something to keep in mind when buying an LCD is that the native resolution is often the only one that works well. Also keep your system in mind when picking an LCD: the Samsung we have (or you had) runs at 1920x1200 native, that's what it will look best at, and a lot of games (or the monitor itself) will require running at it.
September 19, 2006 10:44:20 PM

Normally I would say it is a system performance problem. When you upgrade from a 15" LCD at, say, 1024x768 to the Samsung 24" at 1920x1200, you get roughly 3 times the pixels on the screen (and roughly 3 times the work for the CPU and GPU). Your system has plenty of performance though, with maybe a slight limit in the CPU, so it seems rather strange that you should run into severe problems.
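As a quick sanity check of that pixel arithmetic (a sketch using the two resolutions discussed in this thread):

```python
# Pixel counts for the two displays compared in this thread.
relisys_pixels = 1024 * 768    # 15" Relisys at 1024x768
samsung_pixels = 1920 * 1200   # 24" Samsung 244T at its native resolution

ratio = samsung_pixels / relisys_pixels
print(relisys_pixels, samsung_pixels)  # 786432 2304000
print(round(ratio, 2))                 # 2.93 -> roughly 3x the pixels per frame
```

So the GPU (and to a lesser extent the CPU) has to fill almost three times as many pixels per frame at the Samsung's native resolution.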

A question: how do you connect the monitor to the graphics card?
With DVI or VGA (the 15-pin connector)?
September 19, 2006 11:12:50 PM

EnFoRceR22:
Yeah, I had my desktop at 1920x1200 and also tried the game at this resolution, but I'm sure there was a black band, if I remember right. It did look best at this resolution, though, until panning left/right, as described.

It would be good if we could meet on the same server and test it out. My MSN is tukbriz@hotmail.com, and I'm on Xfire as well. I suggest you try everything on full at native resolution...

bga:
I think you're right.
VGA connector; alas, I didn't try the DVI.
September 19, 2006 11:18:33 PM

Quote:
I think you're right. VGA connector; alas, I didn't try the DVI.

That could be the cause of the problem. At 1920x1200 there is a lot of data to digitize at very tight analog tolerances. If you still had the Samsung monitor I would try DVI, just to rule it out as the source of the problem.
September 20, 2006 12:05:06 AM

Guess I'm the dummy here; I assumed you were using the DVI, not the VGA. DVI has a LOT more bandwidth available to it. I'll be adding you to my buddy list. Got a link to download it? All the sites I find are trying to give it to me at like 1.4k.
September 21, 2006 3:36:12 PM

Maybe the DVI would have helped. I don't know why I didn't try it; I kind of assumed VGA would be better, and I didn't realise DVI had higher bandwidth. It did play best at 480x360, even though it didn't look as good as higher resolutions, but even at this low resolution it was still really difficult not to be the worst player on the server. Does this mean DVI could still be the solution?
Damn, I wish I'd tried it now, to rule it out as you say.

Also:
I plugged both monitors into my graphics card and had them sitting side by side. While strafing left/right, the Relisys was the same rock solid, and the Samsung was the same blurry/hard to focus.
Does this in any way discount the idea that my system might not be up to it, if the Relisys can still be perfect even with the Samsung plugged in as well?
September 22, 2006 1:29:29 AM

Well, running two monitors on a single card is a good way to stress test it :D. But no, not unless you can set the two monitors to the exact same resolutions, from your 15"'s all the way up to 1920x1200. I'm not sure if you're used to widescreen or not; like I said in your last thread, I had to get used to the way my mouse worked in widescreen because of the lower resolution and how wide, rather than square, the screen is. That doesn't exactly sound like your problem here, but it is something to think about when playing games; to me it seemed like the mouse was lagging when in fact it wasn't. The only thing I can say is that every picture becomes blurrier the faster it moves, just because of the motion. When I move an Explorer window around it becomes blurry to me on both CRT and LCD. You almost sound like me: you're so into perfection that you want to find any little thing that's wrong, no matter how small; if it's not perfect, it's not worth having.
September 22, 2006 8:55:01 AM

Quote:
Maybe the DVI would have helped. I don't know why I didn't try it; I kind of assumed VGA would be better, and I didn't realise DVI had higher bandwidth.

DVI and VGA are completely different animals :) 
VGA is analog, so the graphics card transforms the data in the frame buffer into an analog signal. It can do this at different resolutions and different refresh rates, and a CRT monitor uses this information to directly control the electron beam across the screen. An LCD monitor has no electron beam and needs to know which pixels to turn off and on, so it needs a digital signal. As a legacy measure most LCDs have an analog VGA connection, but it feeds a circuit which tries to analyse the analog signal and digitize it. At high resolutions and refresh rates it is a VERY high frequency signal, and it is difficult to determine what the value of every pixel was in the original frame buffer.
This gives a blurry screen.
DVI instead is a digital protocol, transferring the frame buffer directly to the monitor, so the monitor knows exactly what each pixel should display. Refresh rates are not so important, as the pixels on an LCD take a while to change anyway; 60Hz is fine on an LCD, while on a CRT 60Hz would give flicker. Also, a CRT can display many resolutions, as the electron beam is free to move over the entire screen, while the resolution of an LCD is fixed: the number of pixels on the panel is determined when it is made. You can ask the LCD to run at a lower "resolution", but that makes it gather several pixels together to form a larger block on the screen, which is not an elegant solution.

So: LCDs should always run at native resolution at a 60Hz refresh rate. The higher the resolution, the worse the image over analog VGA. Always use DVI when possible.

CRTs can run at any resolution (within limits) and get a better picture with higher refresh rates; 85-100 Hz is ideal. No CRT has DVI, as it would need to replicate the DAC on the graphics card, and at the prices CRTs are selling for, that is not an option.

For technical details see: http://en.wikipedia.org/wiki/DVI
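To put rough numbers on why the analog signal is so demanding at high resolutions, here is a back-of-the-envelope pixel-clock estimate (a sketch that ignores horizontal/vertical blanking intervals, which add a further 10-25% in real video timings):

```python
def approx_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel rate in MHz, ignoring blanking intervals."""
    return width * height * refresh_hz / 1_000_000

# Over VGA, the monitor's ADC must recover each pixel value from the
# analog waveform at this rate; over DVI the values arrive digitally.
print(round(approx_pixel_clock_mhz(1920, 1200, 60), 1))  # 138.2 MHz at native
print(round(approx_pixel_clock_mhz(1024, 768, 60), 1))   # 47.2 MHz on the 15"
```

Nearly three times the analog sample rate leaves far less tolerance for cable quality and sampling-phase errors, which is consistent with the blur being much worse on the 244T over VGA.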

Quote:
I plugged both monitors into my graphics card and had them sitting side by side. While strafing left/right, the Relisys was the same rock solid, and the Samsung was the same blurry/hard to focus.
Does this in any way discount the idea that my system might not be up to it, if the Relisys can still be perfect even with the Samsung plugged in as well?

Did you run the two monitors with the same image (cloned) or as an extension of the desktop? Without more knowledge of your display setup (resolution, extended desktop yes/no, refresh rate, DVI/VGA connection) it is difficult to know what goes wrong.
Did you install the included "Magic" display software (it has the correct monitor profiles)?
September 23, 2006 6:53:02 AM

Hey tuk, accept me on your MSN so we can talk about me trying out that game; I've been waiting a while. Also, I have a Ventrilo server we can use for communications during these tests.
September 23, 2006 1:48:02 PM

It sounds to me just that:

1. You were trying to run the TFT at a non-native resolution. They are crap at that, and that's one of the main benefits of CRTs.

2. You couldn't run properly at native because your system couldn't cope; 1600x1200 is a great resolution but kills your gfx card :(

3. You were using a VGA cable, meaning the digital signal inside the PC was converted to analogue, sent along the cable, and then converted back to digital. A DVI cable would have passed it straight through digitally.

4. Your comparative test compared your old monitor at native resolution to your new one at non-native; I'm not surprised the old one looked better. Also, you had a 4:3 resolution being stretched to a 16:9 screen, another reason for it to look horrible.

The computer just sends data to the monitor; the monitor itself has no influence on the FPS/performance of the computer, except in that the new one had more pixels to draw. Any 1920x1200 display will give you the same problems.

I'd recommend getting the same monitor, but also buying a DVI cable, and a 2nd gfx card if you can afford it. If staying with one card, it would be preferable to stay at 1920x1200 and drop other settings rather than change the resolution. 1920x1200 is the kind of resolution SLi exists for.
September 23, 2006 1:50:27 PM

Quote:
No CRT has got DVI, as it would need to replicate the DAC on the graphics card, and at the prices CRT's are selling, that is not an option.


Some high-end CRTs *DO* have a DVI input, actually.

High-end CRTs are still cheaper than high-end TFTs, and you can pick up gorgeous 22" models that will do 2048x1536@85Hz for £400/$800 or so.
September 23, 2006 1:56:17 PM

Quote:
Some High end CRTs *DO* have a DVI input actually.


Nice, I didn't know that. Could you post some links to those 22" monitors with DVI?

About the problems tuk is having, I don't think it is his graphics card. He has an nVidia 7900GTX, which should be more than fast enough.
September 23, 2006 4:32:33 PM

Quote:
Some High end CRTs *DO* have a DVI input actually.


Nice, I didn't know that. Could you post some links to those 22" monitors with DVI?

About the problems tuk is having, I don't think it is his graphics card. He has an nVidia 7900GTX, which should be more than fast enough.

The NEC MultiSync FP1375X is one such monitor; the manual, which shows the inputs, can be found on their site, Here

He is trying to run at max settings and says that he had to drop below the native resolution (1920x1200) due to performance.

My 7900GT has a core clock faster than his GTX, although the memory is a little slower.

I can tell you that with maxed settings there are more than a few games it struggles with at 1600x1200, so at 1920x1200 I would expect his to struggle if he wants maxed settings. 1920x1200 and above are the resolutions nVidia is marketing *quad* SLi at.
September 23, 2006 5:58:47 PM

Quote:
The NEC Multisync FP1375X is one such monitor

Thanks for the link! Nice monitor, but it seems they are no longer making it. I can find a few sellers that have them in stock, but many sites report them as "retired", and there is no mention of the monitor on NEC's product pages.
It is a pity that it is getting increasingly hard to find good CRTs. If you can live with the size and heft, they have a really nice image. And not to forget: they can look good at many resolutions.

Quote:
I can tell you that with maxed settings there are more than a few games it struggles with at 1600x1200, so at 1920x1200 I would expect his to struggle if he is wanting maxed settings. 1920x1200 and above are the resolutions nVidia is marketing *quad* SLi at.


I would recommend lowering the quality settings before lowering the resolution. But yes, some games will have trouble at 1920x1200 even with his 7900GTX 512MB. I don't know how demanding the games he mentions are. Has anybody looked whether there are any benchmarks for these games here at THG?
September 23, 2006 11:32:48 PM

Quote:
It is a pity that it is getting increasingly hard to find good CRTs. If you can live with the size and heft, they have a really nice image. And not to forget: they can look good at many resolutions.


I agree. I still love CRTs myself, and I hate the new trend of widescreen everything.


Quote:
I would recommend lowering the quality settings before lowering the resolution. But yes, some games will have trouble at 1920x1200 even with his 7900GTX 512MB. I don't know how demanding the games he mentions are. Has anybody looked whether there are any benchmarks for these games here at THG?


Yeah, so would I; I said that in my first post here, but he seemed quite keen on max settings in his original post.

Of course, part of the problem is that FEAR only supports 4:3 resolutions, another reason to hold off on widescreen imho: lack of support.

(Yes, it is possible to manually set the resolution in one of the game files, but you get a horribly stretched image.)

EDIT: Found a benchmark: Here

Check the last chart on the page.

High quality with soft shadows, no AA, and 8x anisotropic filtering gives an *average* of 36 FPS with a single 7900GTX at 1920x1200.

That means the minimum is going to be quite a bit lower than 36. 7900GTX in SLi gives 66 FPS average, which should be fine.

Quote:
the minimum frame rate was 19fps with a single card and 28fps with SLI


19fps minimum is too low for a first-person shooter imho
September 24, 2006 4:31:17 AM

He doesn't need dual video, for Christ's sake. I have the same monitor he returned, and I play games such as HL2 and BF2 at max resolution with max FSAA and anisotropic filtering, along with max settings, at 1920x1200. He needed DVI: he did try using the native resolution, but his video connection lacked the bandwidth needed to push the pixels, as it were. He then tried bringing the resolution down to see if it was his video card causing it. 1600x1200 might be nice, but 1920x1200 is way better. Wanting him to get SLI and pay an extra $500 or so for 10 more frames is just crazy and unneeded imo, especially quad SLI. Neither of his monitors was 1600x1200; not sure where you're getting that resolution from. However, the Samsung can run 1600x1200 pretty much natively; it just has black bars on each side. It's also a 16:10 aspect ratio monitor, not 16:9. And for the one looking for the CRT: they have been retired, and you will likely not find many good 20"+ CRTs anymore. Personally, I can't wait to see CRTs go; I've been waiting for LCD to take over for years now.
September 24, 2006 10:28:08 AM

After all these posts I think everybody should agree with the following conclusion to "tuk":

1) Probably there was nothing wrong with the Samsung LCD.

2) The problems experienced were quite likely caused by using VGA instead of the correct digital DVI connection.

3) The monitor should always be run at native resolution (1920*1200).

4) His system cannot run FEAR and some other games at maximum quality settings (see the test link in darkstar's post above). But by lowering the quality settings a little, he should be fine with his present system.

5) If tuk wants maximum settings in all games with an LCD this size, he will have to upgrade his graphics subsystem, either with SLI in the form of another 7900GTX or, better still, by getting an all-new 7950GX2 (and selling his 7900GTX). Quad SLI should in no way be necessary.
In the tests in darkstar's link they are using an Athlon 4800+ X2, which has a 20% faster clock than tuk's Opteron 170, and the CPU does influence the result; just look at the Pentium 4 results in the same link. But the most dramatic difference comes from going multi-GPU. In my first post I thought that the Opteron 170 was single core, which would have meant much less CPU power, but after checking I can see that it is indeed dual core.

But again: by lowering the quality settings a little from the maximum, his current hardware should be fine.

6) If he gets the same monitor again (which in my personal experience is a good panel - I have the same one at one of my clients), he should be able to configure it correctly by installing the Samsung "Magic" display software, connecting with DVI, and setting the resolution to native. He should also look at the number of processes running in the background; some of these can eat away precious CPU cycles, and he has none to spare.
If tuk wants the best display performance in dark shadows (where LCDs are weak) and wishes to be able to use lower resolutions to improve performance, he should find a CRT (DVI is not necessary with CRTs). Any flat-screen 21" or 22" (even used, as long as he can see it before buying) will be fine. Yes, they are bulky and not very pretty, but they can get the job done. The most important thing is getting a big monitor; with anything under 19" you are just suffering needlessly in front of your PC.
September 24, 2006 1:34:38 PM

Quote:
He doesn't need dual video, for Christ's sake. I have the same monitor he returned, and I play games such as HL2 and BF2 at max resolution with max FSAA and anisotropic filtering, along with max settings, at 1920x1200.



If SLi is not needed for the best experience at 1920x1200, including 4x or 8x FSAA and 16x anisotropic filtering, when *is* it needed? I can tell you that at 8xSS FSAA and 16x aniso my 725/1400-clocked 7900GT is not enough a lot of the time, and that's on a 20.1" 1600x1200 LCD, so I doubt a 650/1600-clocked 7900GTX will do all that much better, especially with the extra pixels of 1920x1200. Try playing Oblivion at 1920x1200 on a single card.

About the only resolution higher than that is 2048x1536, supported by a lot of decent CRTs.

Quote:
Personally, I can't wait to see CRTs go; I've been waiting for LCD to take over for years now.


There are still many applications where either a CRT is best for the job, or an LCD that can do the job just as well will be around 10x the price. Just because you want an LCD, you shouldn't force it on everyone.

bga's summary is a good one imho :) 
September 25, 2006 3:17:52 AM

I think I said on many occasions that SLI is never needed; just because you think it should be used, don't force it on other people. When they finally make SLI worth using, like I mentioned before, by coding engines to make use of it, then I might get a setup, but it sure won't be an nVidia setup, you can be sure about that. I wasn't forcing LCDs on anyone; I'm just happy LCD finally killed the CRT market. My 24" LCD cost the same as my 21" CRT did. Not sure where you're getting this ridiculous 10x-the-price stuff, though; maybe like 5 years ago, yeah, but when a 24" LCD costs $800 and a 21" CRT costs $800, I don't see where the 10x cost comes in. There are about 3 or 4 resolutions higher than 1920x1200, btw; I'm just too lazy to look them up.
September 26, 2006 9:10:01 AM

SLi is not worth using at 1280x1024. It's kind of useful at 1600x1200. At 1920x1200 and above is where it stands out.

At this resolution and above it DOES lead to a significant performance gain, as the benchmark I linked earlier shows.

I was not forcing SLi on anyone; I was suggesting that it was the only way he was going to run at that resolution at max settings etc.

I'm getting the 10x price difference from the fact that the only LCDs considered up to the colour standards and image clarity of CRTs for professional applications are ones like This, at £3,300, or ~$6000, and This, at £2950, or ~$5500.

The equivalent high-end CRT would be around £400, or $800. Maybe not quite 1/10th the price, but pretty close to it.

There are a couple of resolutions above 1920x1200, but very, very few, and in LCDs they are very rarely used. In my mind, if someone is willing to spend so much on a monitor that they are considering the SyncMaster 244T, at a cost of £800/$1500 or so, then they should be willing to spend that much on their gfx cards too.
September 26, 2006 1:46:48 PM

I'm really not sure where you get that currency conversion, since I paid $800 for this monitor and $300 for my video card, give or take a few dollars. Then again, I'm not talking about monitors that cost thousands :-/ My statement wasn't really to say you were pushing SLi; it was more of a rebuttal to the pushing-LCDs-on-people comment you made earlier, not an accusation. Higher-grade LCDs, yes, do cost upwards of $2000, and yes, there are a couple, not many, above that, which is what I said, thank you for confirming it. The models of CRTs I have seen that do the specific resolution you mention cost more than my LCD did when I got it. I'm not arguing that SLI will never have its place; I just fail to see its place now.

After looking up benchmarks and looking at yours, it seems those two games, Oblivion and FEAR, have been optimised for SLI, and I would assume anything that comes out on those engines has equal capabilities. Now, if those are the games he wants to play 90% of the time, SLI options would be good for him. However, when it comes to just about any other game on the market, paying $500 for a second video card for a mere 10 to 15 frames more is hardly worth it. I did notice that even with SLI, Oblivion at max settings was barely playable; it's going to be one of those games that needs the next-gen video cards to come out for real quality play, which in my opinion would eliminate the need for SLI. However, by that time I assume more engines will have incorporated the optimisations necessary to utilise SLI.

Keep in mind, darkstar, I don't mean disrespect to you or your opinions; I just believe he should know the whole picture behind his question, and I welcome other people's opinions, as they make me see faults or different issues I may have overlooked, like this thread has shown. If Oblivion and FEAR are his games and that's all he wants to play, I would agree SLi would probably be better for him, and from the looks of my research I would have to recommend SLI for him. Otherwise I would not.

To respond to the professional-grade monitors for colour representation: as far as I'm aware he isn't a graphic designer, and none of that matters unless he wants that type of reproduction. Yes, I am willing to pay stupid amounts of money for components as long as it makes sense to buy them (as in very noticeable improvements in quality and performance).
October 8, 2006 2:19:43 PM

Sorry for the delay; I've had flu, and I also wanted to do more testing before posting again...

I decided to give the 244T another chance, using suggestions from here and other forums, and here is what I found:

Using the DVI cable instead of the standard monitor cable showed a marked reduction in the 'strafing problem' while playing, and also virtually no tearing. It just looks so much better and more stable.

Tweaking the game settings to run at the 244T's native resolution of 1920x1200 showed the best results; however, I noticed that the 480x360 setting increased my kill rate and somehow seemed more visually stable, very much like my observations at the start of this thread...

FEAR Combat 1.07
1920x1200 Native
All performance\video settings on Max
V-Sync on

But it still wasn't perfect, so I decided to try plugging the MX310 mouse directly into the 244T's USB port, which involved:
Running the supplied 'USB -> network'-style cable from the 244T's 'Network UP' socket to a USB socket on the motherboard.
Plugging the MX310 into the 244T's 'USB DOWN' socket.

The cable has a USB connector at one end and a little square network-style connector at the other, the kind that plugs into sockets with the three-node network symbol.

This, I'm happy to say, made a killer difference: the strafing problem has completely disappeared. I've tested this by swapping the mouse connection between the 244T's USB socket and the one on the motherboard.
When plugged into the monitor it's as rock steady as the 15", and the differences between looking left/right and up/down don't exist. Amazing...

My score has gone back to normal....Ownage!

Then when I plug the MX310 back into the motherboard, the jerky/scratchy strafing starts again...

OK, here's where it gets weird. I've noticed that when using the mouse in the 244T's USB port, just after spawning the strafing problem is there for the first 3-5 seconds, then it's as if something kicks in and corrects it, and everything is perfection!
Also, sometimes if I swap the mouse to the motherboard USB it works with no strafing problem, but only sometimes; it's as if the history of having been plugged into the 244T has helped it somehow. I don't understand this, because it never worked before I used this network-style cable.

Now, this seems to me like the mythical 'mouse lag'?

I don't understand how using this cable could reduce lag; from what I understand, the mouse signal has further to travel using this cable.

I would rather plug my mouse into the motherboard USB; would buying a new mouse be an alternative solution?

I would like to hear what people think.
October 8, 2006 5:16:42 PM

Hi "Tuk",
good to see that you were not discouraged from getting a better display after the first round of problems :D 

As you saw, DVI is pretty important on a high res LCD display.

Your mouse problems are fairly incomprehensible to me, though.
When you connect the display to the MB's USB port, you are connecting through the USB hub included in the display (it is not really a network port).
Although Samsung doesn't say what type of hub (bus-powered/self-powered, 2.0/1.1) is in the 244T, I would not think that could cause any problems one way or the other.
In the end, the USB data packets enter the system the same way, because the mouse sends data to the display's USB hub, which is connected to the MB. So it is the same USB hardware/software combination that receives the mouse data in the end.
By the way, is the monitor connected to the same USB port on the MB as the mouse was?

You shouldn't be concerned that plugging the mouse into a USB hub will add extra lag. The only thing a USB 2.0 hub inserts into the connection is a TT (Transaction Translator), which operates in real time.
Now, and here I am speculating wildly, maybe your MB has a problem with low-speed (the lowest tier of the USB protocol speed levels) peripherals such as your mouse, and the TT in the Samsung monitor's USB hub helps, as it presents the traffic upstream over a USB 2.0 high-speed link.
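For what it's worth, hub latency really is tiny compared with mouse report intervals. A USB low/full-speed interrupt endpoint is polled once per `bInterval` milliseconds; here is a quick sketch of that arithmetic (the 8 ms figure is a typical default for low-speed mice of that era, an assumption on my part, not something measured in this thread):

```python
def report_rate_hz(b_interval_ms):
    """Mouse reports per second for a given USB interrupt polling interval."""
    return 1000 / b_interval_ms

print(report_rate_hz(8))  # 125.0 -> one report every 8 ms, the common default
print(report_rate_hz(2))  # 500.0 -> what 'overclocked' USB ports aim for
```

Against an 8 ms reporting interval, the microseconds a hub's Transaction Translator adds are negligible.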

Anyway, if everything works now, you shouldn't be concerned that the mouse connects via the monitor.
October 8, 2006 9:27:01 PM

Glad to hear you finally got the monitor working. I'm not sure why plugging the mouse into the monitor works at all, since the data still has to go to the computer and back; if anything, I would expect it to add lag. Sounds like perhaps you have a faulty USB port or something? All the stuff you had to do to get that thing working would have made me say "f this", lol. Good thing mine just worked out of the box.