WHY???? Need suggestions or answers about Graphics

skyline7528

Distinguished
Sep 26, 2006
22
0
18,510
I have a Radeon X1900XTX graphics card with a Samsung 204B 20.1" monitor. Whenever I play FEAR, the monitor looks wavy and the game sometimes almost freezes, which I think is the graphics card. Does anyone have any suggestions for what I could do to fix this?

I'm only running at 1024 x 768 resolution, and my ATI drivers are up to date. I have tons of cooling, so it is not heat.

Any help would be much appreciated.

200GB Seagate SATA
AMD Athlon 64 3800+ @ 2.4GHz
Radeon X1900XTX
Accelero X2 cooler
2GB Corsair RAM
 

SciFiMan

Distinguished
Apr 19, 2006
385
0
18,790
Does it do the same thing when you run at the monitor's native resolution? Does it do this in other games too? Do you have the latest patch for FEAR?
 

Ford_Prefect

Distinguished
Jun 12, 2003
49
1
18,535
I agree - sounds like you're using monitor scaling, in which case the image is being stretched to fit the monitor's native resolution of 1600x1200. The result is what you see.

If you go into the ATI driver there should be an option to change the scaling mode (I think it's that one - just try the one that isn't in use at the moment). This will give you a properly scaled image surrounded by black borders.
 

skyline7528

Distinguished
Sep 26, 2006
22
0
18,510
I will try running the latest patch for FEAR. You guys are saying to set my monitor to run on scaling? Right now it is at 1600 x 1200, so I think that is the problem. Explain to me again what I should do.

Thanx
 

archie123

Distinguished
Sep 17, 2006
29
0
18,530
First of all, you have an X1900XTX and you're running FEAR at ONLY 1024x768?????????? Sort it out man ;) crank up the resolution.....

Download and install RivaTuner, and carefully read the instructions. You can set it up to monitor temps, clock speeds, etc. in-game; it's a must for troubleshooting, and will help diagnose your in-game stuttering. As for the wavy lines, if it's not heat related then it could be due to your resolution being too low, as has been mentioned.

My Riva Tuner in action (in white at the top): core speed, memory speed and temp. You can set it to monitor much more if you like.........

Image00001.jpg
 

archie123

Distinguished
Sep 17, 2006
29
0
18,530
OK, this will take a while. Are you sitting comfortably?

Open Riva Tuner, and under the MAIN tab click the CUSTOMISE arrow in the TARGET ADAPTER box.

In the resulting window click the HARDWARE MONITOR button (looks like a filmstrip with a magnifying glass).

In the window that opens click the SETUP tab.

In the next window you will have a selection of monitors. Put a tick next to CORE CLOCK\GEOMETRIC DOMAIN, MEMORY CLOCK, CORE TEMPERATURE and CORE VID, then click APPLY.

In the same window click the PLUGINS tab and make sure all the plugins are selected (it doesn't matter if you need them or not, it's easier just to select them all, just in case). Click OK, then OK again to get back to your newly set up HARDWARE MONITOR. This will now show, in graph form, all your settings such as temp, vid card speed, etc.

Now for the on-screen display bit: RIGHT CLICK each graph and select SETUP. You will be presented with a new window; near the bottom there is an ON SCREEN DISPLAY SETTINGS box. Put a TICK in that box and click OK. Do this for each setting you have in your graph (core VID, core clock, etc.).

Nearly done. You'll notice your voltage probably isn't being detected properly, so right click your CORE VID graph and select SETTINGS, then at the bottom left click the MORE tab. In the resulting window click AUTOSELECT, then click OK.

When you're back at your GRAPH TABLE (monitor), RIGHT CLICK any setting and click SETUP. When the resulting window appears, wait a second or two until the RUN SERVER box appears in the ON SCREEN DISPLAY SETTINGS bit you checked earlier, then click the RUN SERVER tab and click OK.

Leave Riva Tuner and your graph running in the background. In your taskbar you should see another Riva Tuner symbol; that's your monitor.

Start up a game and hey presto, all is there to be seen. You can also display framerates etc. this way, but it all gets too cluttered, so stick with the basics for now.

So when you're running a game and it starts to screw up, you'll be able to see exactly what your voltage, core/memory speeds and temps are doing, so you can rule out those problems. For example, if your voltage drops when it starts screwing up, you know it's your PSU.

You can also change the colour and position of your on-screen display, but I'll leave that to you to figure out ;)
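The diagnostic idea described above (if core voltage sags at the exact moment the game stutters, suspect the PSU) can be sketched roughly like this. The sample readings and the 0.05 V threshold are made up for illustration, not real measurements:

```python
# Sketch of the PSU check: flag a problem if voltage dips whenever a
# stutter is observed. All numbers here are hypothetical examples.
samples = [
    # (seconds, core_vid_volts, stutter_observed)
    (0, 1.45, False),
    (1, 1.45, False),
    (2, 1.38, True),   # voltage dip coincides with a stutter
    (3, 1.45, False),
]

NOMINAL = 1.45     # assumed nominal core voltage
TOLERANCE = 0.05   # volts of sag treated as a real dip (arbitrary threshold)

# PSU is suspect if any stutter coincides with a significant voltage drop.
suspect_psu = any(
    stutter and (NOMINAL - volts) > TOLERANCE
    for _, volts, stutter in samples
)
print("PSU suspected:", suspect_psu)
```

With the sample data above, the dip at second 2 lines up with a stutter, so the check flags the PSU; stable voltage during stutters would point elsewhere (drivers, heat, etc.).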

hope this helps

Darren
 
G

Guest

Guest
In the new nVidia drivers, you have an option to remove any sort of scaling, so if you play at 1280x1024, it's in the middle of your screen...

There is probably an option in the ATI drivers to do the same; I would try it out! As mentioned above, I'm pretty sure that's where the problem is.
 

archie123

Distinguished
Sep 17, 2006
29
0
18,530
I agree with you on the wavy lines thing mate, probably a monitor related problem, but that's outside my knowledge; I've always been CRT.

But the in-game stuttering is another issue I CAN sort ;)

Incidentally, Skyline, you are running the latest version of FEAR, yes? V1.07
 

skyline7528

Distinguished
Sep 26, 2006
22
0
18,510
Yes, I am running the latest version of FEAR. I turned scaling off and it did nothing, but I turned the resolution up to 1600 x 1200 and it works much better. I think this is because my monitor performs best at its native resolution.
 
G

Guest

Guest
That's exactly what scaling is!

Your monitor has 1600x1200 = 1,920,000 pixels. Each pixel shows a particular color. If you run it at 1280x1024, the video card outputs 1,310,720 pixels onto a 1,920,000 pixel screen.

This means roughly 1.46 monitor pixels have to show each rendered pixel. Since a pixel can't be used in halves, the monitor relies on interpolation to try to make it look fine, resulting in image quality loss and/or some jittering. This is what is referred to as image scaling.

Your screen could show 800x600 perfectly fine, because exactly 4 pixels would be used to show each rendered pixel...

Hope that clarifies some of it and helps you understand better what's going on!
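The arithmetic above is easy to check yourself. A quick sketch (Python, purely illustrative) for a 1600x1200 native panel:

```python
# Why some resolutions scale cleanly on a 1600x1200 LCD and others don't.
native = 1600 * 1200  # 1,920,000 physical pixels on the panel

for w, h in [(1280, 1024), (1024, 768), (800, 600)]:
    rendered = w * h
    ratio = native / rendered  # panel pixels per rendered pixel
    # Integer ratios in BOTH axes (e.g. 800x600 -> exactly 2x2 blocks)
    # scale cleanly; fractional ratios force interpolation and look soft.
    clean = (1600 % w == 0) and (1200 % h == 0)
    print(f"{w}x{h}: {ratio:.2f} panel px per rendered px, clean: {clean}")
```

This reproduces the numbers in the explanation: 1280x1024 works out to about 1.46 panel pixels per rendered pixel (blurry interpolation), while 800x600 divides the panel exactly 2x2, i.e. 4 pixels per rendered pixel.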
 

Doughbuy

Distinguished
Jul 25, 2006
2,079
0
19,780
CRTs can run different resolutions without too much of a problem, but LCDs start looking really bad when they're not at their native resolution. I don't know why anyone wouldn't keep the native setting, since it looks the best anyway.
 

archie123

Distinguished
Sep 17, 2006
29
0
18,530
CRTs can run different resolutions without too much of a problem, but LCDs start looking really bad when they're not at their native resolution. I don't know why anyone wouldn't keep the native setting, since it looks the best anyway.

Because there is a huge gap FPS-wise between 1024x768 and 1600x1200.

Nice explanation Labbby :D you learn summat new every day :D I'm still not convinced about LCDs and gaming though ;)
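That FPS gap is easy to sanity-check with the same pixel arithmetic. A rough sketch (Python), assuming frame rate scales inversely with pixel count, which is a simplification and not a benchmark:

```python
# Back-of-envelope: how much more work 1600x1200 is than 1024x768,
# if performance is roughly fill-rate bound (an assumption, not a measurement).
low = 1024 * 768      # 786,432 pixels per frame
high = 1600 * 1200    # 1,920,000 pixels per frame
print(f"1600x1200 draws {high / low:.2f}x the pixels of 1024x768")
```

Roughly 2.4x the pixels per frame, which is why the jump in resolution costs so many frames per second.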