Can my system not handle games?

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
My system specs are in my sig. I am very disappointed that I have to play Red Faction, Unreal Tournament, and the RTCW demo with VSync enabled. The tearing is noticeably annoying with it disabled. I don't understand why I have to do that. Isn't my rig more than sufficient to play these games without tearing? Also, Red Faction is fairly choppy with all the features on, with lots of hesitation when entering an area with a few enemies.
But Dronez (OpenGL) got hundreds of FPS on average using the benchmark test that's included with the game. The minimum FPS was 84. This is with all the graphics features maxed out, 32 bpp and everything. And if you know anything about Dronez, it's a pretty graphics-intensive game. There is no tearing in it and it looks awesome. Too bad I don't like the gameplay, heh. I run everything at 1024x768. What's the deal here?

Btw, this goes on no matter what drivers I have. I've tried Detonator 23.11, 27.42, and 28.32.
AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 
It's possible you've gotten a bad graphics card. Could you send it back and have it swapped out for another one? Do you have another card you could swap in to see if the problem is the card? Have you run a DirectX diagnostic on the graphics?
 

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
I haven't run a diagnostic or anything. This is the second PNY GF4 Ti4400 I've had. The first one was replaced for more serious problems, but it behaved the same way. Somebody over at Anandtech said that my monitor is not good enough to handle the FPS from the video card, and that's why it'll only work well with VSync. I have a generic 17" CRT monitor, so he could be right. I dunno...

AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 

williamc

Distinguished
Mar 8, 2002
837
0
18,980
It is actually very possible that your monitor is the problem. I highly recommend a 17" or 19" Sony Trinitron monitor. They come at very reasonable prices, have great quality, and have good refresh rate and resolution support. See if a local store will let you bring your PC in and test it on one, with the condition that you'll buy it if it solves your problem. :-)

If you do end up exchanging the gfx card again, try to get whoever you bought it from to send you a Visiontek instead of a PNY next time...they use higher quality parts and have better QA.

The itsy bitsy spider climbed up the empires state building, along came goblin, wiped the spider out
 

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
Yeah, I think I've decided to get another monitor. Last night I tried playing at 640x480, changed my desktop to that too, and there was still bad tearing without VSync on. At 640x480 I even turned the refresh rate up to 100 Hz, and it made no difference. So does this mean the monitor is definitely the problem or not?


AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
I actually thought that tearing was normal UNLESS you turn v-sync on, especially for high-end computers. People running benchmarks always turn v-sync off, because what v-sync does is wait for the monitor to refresh before displaying another image. It slows the framerate down enough for the monitor to display the animation correctly and smoothly. If the card doesn't wait for the monitor's refresh, the image will tear. This is true with almost any monitor: many people can run Quake 3 at 200+ frames per second, but I don't know of any monitor that can refresh at 200Hz.

Even turning off v-sync when you have bad framerates hasn't really been proven to improve performance by much. I would only be concerned if turning on v-sync makes the framerates unplayable. Otherwise, the tearing is good news: it means your computer is fast enough to put out more frames per second than your monitor can handle, which is true for just about any monitor on the market. Just keep v-sync on. Even an older 15" monitor should give you a decent image as long as v-sync is turned on.

Edit: to test whether what I'm suggesting is true, try running your most demanding game at a very high resolution (as high as your monitor will allow) with all details on. If the tearing disappears, then your monitor is perfectly fine and v-sync is what you need. The reason the tearing disappears is that the monitor can now refresh as fast as the screen updates are coming: your computer should be churning out fewer frames per second, so the monitor should be able to keep up. V-sync is, after all, for when monitors cannot keep up.

Think of it this way. Every second that passes while you are watching your screen is like a bunch of cakes coming down a conveyor belt, and the number of cakes that come through is the number of frames per second your computer can generate. The monitor's maximum refresh rate is represented by the number of cakes that a single taste-tester, standing by the conveyor gobbling one cake after another, can eat per second. So if the maximum refresh rate for your monitor at 640x480 is 100Hz, that means this guy can eat 100 cakes per second. Let's say 110 cakes come through per second. Well, the tester can't eat that fast! He can still only do 100, remember? So he eats 100 cakes, but 10 are unaccounted for, causing the streaking effect you describe. That's 10 frames that don't get displayed correctly, and the picture ends up looking like it's stuttering because some of the frames are missing!

What V-sync does is check how many cakes per second the taste-tester can eat at each particular resolution. At 800x600 it may be 85Hz, so V-sync tells the graphics card to output no more than 85 frames per second, and the picture appears smooth. And of course, since the human eye cannot distinguish individual frames at 85fps, it does not matter if the computer itself can churn out 500fps or even 5000fps, because the picture will look equally smooth at any of these framerates. The TCO standard for refresh rates that the human eye cannot detect is 75Hz, or 75 frames per second. But in practice, most people cannot tell the difference between 60fps and 75fps; below 60fps, most people start to notice individual frames. This is still being debated, but just think of 75 as the "safety" zone that the TCO certifies. Basically, I can tell you for sure that anything 75fps and above is all going to look the same. I would say the same for 60fps and above, but people would probably disagree with that.
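To put rough numbers on the cake analogy, here's a toy C snippet. It's not real driver code, and the 110/100 figures are just made-up example numbers:

#include <stdio.h>

int main(void) {
    int fps = 110;  /* cakes per second: frames the card churns out (example) */
    int hz  = 100;  /* cakes the taste-tester (the monitor) can eat per second */

    if (fps > hz) {
        printf("vsync off: %d frames/s, %d of them land mid-refresh (tearing)\n",
               fps, fps - hz);
        printf("vsync on:  card is held to %d frames/s, no tearing\n", hz);
    } else {
        printf("monitor can keep up with %d frames/s\n", fps);
        /* (though, as pointed out later in the thread, unsynchronized
           buffer swaps can still tear even at low framerates) */
    }
    return 0;
}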

Going back to the main subject, the reason reviewers always turn off v-sync is that they are measuring the framerates a graphics card is capable of. The only reason for doing this is to give the potential buyer a gauge, an estimate of how long the graphics card will last before it becomes obsolete. We don't care about 300fps in Quake 3, no, but we do care if it means that a year and a half from now, Doom 3 will run at 60fps on the same system, whereas a system that only scored 100fps in Quake 3 will only run Doom 3 at 20fps. I say all of this because it's become a popular idea in people's minds that they can actually see 300fps, and that it's a sign of how good your system is. The first part is false, but the second is true. You can't see 300fps any differently than you can see 75fps, but it does mean your system is good, because it should have a longer life expectancy for running the newest games in the future. FPS really isn't that important as long as you can get playable framerates. After all, a game you play on your Xbox, PlayStation 2, or GameCube is only running at about 30fps max, because that's roughly the fastest a non-HDTV display (a traditional shadow mask TV) can show.
 

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
So are you saying getting a new monitor will not fix the tearing and it's a waste of money?

AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
Yes, I'm saying turn on V-sync whenever you can for every game you play except when you are getting very bad framerates. The reason for this is that people have reported that turning V-sync off helps their framerates a little bit, maybe by about 5%. Otherwise you should be HAPPY that you are getting tearing cause that means that your system is fast! =P
 

Flyboy

Distinguished
Dec 31, 2007
737
0
18,980
Hey Rebe7254, did you finally fix your problems? Is this the same videocard from before?

FYI, my brand new computer at work has the same video card, a PNY. It died the second day I had it. I saw strange pixelated-looking horizontal streaks across the display. It also locked my system up. I told the PC tech guy that I've heard PNY is BAD news.
 

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
My boot up problems are fixed now. I've had my replacement PNY Ti4400 since this past Wednesday and I've had no problems booting up thus far. However, that stripe down the middle is still there, as it was with the bad Ti4400 and the temporary GF2 MX I had.

Here's my problem. Even with VSync on, I'm getting quite a bit of tearing in Unreal Tournament. It's tolerable, but still annoying, and it shouldn't be happening with VSync enabled. I can barely move without the picture tearing on me. Red Faction (Direct3D), however, has no tearing, but is much more choppy. I just can't win. At first people were telling me it's my monitor, and now I've got people telling me it's not. I'm confused.

AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 

MadMechwarrior

Distinguished
Dec 9, 2001
35
0
18,530
Isn't tearing more like the taste tester didn't finish the first cake when the second one comes along, so he has to ditch the first one? The monitor is refreshing, and in the middle of its process of refreshing the RGB and [-peep-] on the screen, the graphics card updates its frame buffer. So the monitor starts refreshing Frame 1, and by the time it's done going through its cycle, Frame 3 is in the frame buffer...
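In other words, the half-finished cake is what actually shows up on screen. Here's a minimal C sketch of that idea; the scanline count and swap point are made-up illustration numbers:

#include <stdio.h>

/* The monitor draws scanlines top to bottom on each refresh. If the
   card swaps the frame buffer partway through, the top of the screen
   shows the old frame and the bottom shows the new one; the boundary
   between them is the visible tear. */
int main(void) {
    int total_lines = 600;  /* scanlines per refresh at 800x600 */
    int swap_line   = 250;  /* hypothetical mid-refresh buffer swap */

    for (int y = 0; y < total_lines; y += 100)
        printf("scanline %3d: showing frame %d\n",
               y, y < swap_line ? 1 : 2);
    return 0;
}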
 

Flyboy

Distinguished
Dec 31, 2007
737
0
18,980
"I just can't win. At first people were telling me it's my monitor, and now I've got people telling me it's not. I'm confused."
Well, the only way (or at least the easiest) to really find out is to swap in another monitor. Surely one of the guys at your dad's work could let you borrow one for a night. That's what I would do. Or ask one of your friends. I absolutely understand why you're frustrated. I would be too. But you need to keep doing what you're doing (troubleshooting) and try to eliminate possibilities. If I recall, you have successfully eliminated the power supply (300W), the video card (3 cards, all with problems), and the video drivers. So what about the monitor and the RAM (I doubt it, but you never know)? And did you check that the voltage supplied to the AGP slot is correct (somebody else recommended this to you)?

Perhaps you should check for motherboard driver updates too, including chipset drivers and/or a BIOS update.

HANG IN THERE! :smile:
 

Clarentavious

Distinguished
May 24, 2002
332
0
18,780
Ok, some people have their info really messed up here.

First of all, having V-sync ON will LIMIT your framerate to your monitor's vertical refresh rate. V-sync stands for vertical synchronization; it means waiting for the monitor to retrace the image.

The tearing, or flickering (it looks like "cuts" or horizontal lines, if that is what you are referring to), has absolutely nothing to do with your monitor. It is game dependent (Half-Life might have this problem while Baldur's Gate might not). If your game has it, the only way to avoid it is to turn V-sync on.

If you have V-sync on, you can increase your framerate by, of course, increasing your refresh rate.

Also, you may want to check your DirectX and OpenGL settings. If you have your mipmapping level set to best quality, almost any game will run slowly (because Direct3D is so poorly designed), even with a GeForce4 Ti.

OpenGL is much better; you can attribute your higher framerates to that.

Also, a framerate of 120 is not going to help your performance or quality at all if, on average, only 40 actual frames are being rendered per second.
 

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
I installed the latest 4-in-1 drivers from VIA. That seemed to help quite a bit. Unreal Tournament is not tearing at all now, and Red Faction seems to work a little better. Thanks, Clarentavious, for the Direct3D suggestion; maybe that will help with Red Faction. However, the RTCW demo I have tears terribly, even with OpenGL VSync on and "sync every frame" enabled in the game. I think that could be due to the game itself. It could be that some patches to the full game address this issue. No?

Anyway, thanks everyone again for the advice. Even though my monitor may not be the problem, I'm still gonna buy another one. I've got it narrowed down between a Sony Trinitron G220 (17") and a Viewsonic PF775 (17"). There's only a $10 price difference between the two, with the Viewsonic being cheaper. Do any of you have suggestions for other brands with similar specs?

AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 

Clarentavious

Distinguished
May 24, 2002
332
0
18,780
On the monitor issue, quality (aside from the brand name) is mostly going to come down to the DP, or dot pitch.

I have a .25 dp monitor, a 17" CRT display by AOC. I can get 1600x1200 in 32-bit true color. The max refresh rate I can get out of it (at 800x600 16-bit high color, anyway) is 144Hz.

I guess the monitors you are referring to are CRT? I'm not very familiar with LCD digital flat panels (other than that they look mighty nice :) ... and are very expensive; Tom has an article on some).

The lower the DP, the better the image quality (in other words, a .26 will look better than a .28).
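For a rough back-of-the-envelope comparison (this glosses over shadow mask vs. aperture grille measurement differences, and the ~325mm viewable width is just a typical estimate for a 17" CRT):

#include <stdio.h>

int main(void) {
    double visible_width_mm = 325.0;    /* ~12.8" viewable width (estimate) */
    double pitches[] = { 0.25, 0.28 };  /* dot pitches being compared, in mm */

    for (int i = 0; i < 2; i++)
        printf("%.2f mm dot pitch: roughly %.0f dots across the tube\n",
               pitches[i], visible_width_mm / pitches[i]);
    /* prints roughly 1300 vs 1161: the finer pitch has more phosphor
       triads available to resolve high-resolution modes cleanly */
    return 0;
}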
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
When we talked about seeing "tearing" without vertical sync on, I don't think any of us meant horizontal streaks. That may be a PNY problem. As for my analogy, that may well be the case: the guy just can't eat fast enough. I'm not sure whether the frame buffer gets half drawn and then drawn over again. I must admit my description was long, and I'll try to trim it down next time. I don't intend to sound like an expert; I just wanted to give my own explanation using cakes. :)
 

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
Yes, they are indeed CRTs. There's just no way I'm gonna shell out that much money for an LCD. Most places I've read say that CRTs are better for moving video anyway. The DP on the Sony is .24 mm, while the DP on the Viewsonic is .25 mm. My current monitor is .28 mm. Is it worth $10 more for the Sony?



AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 

UoMDeacon

Distinguished
May 14, 2002
126
0
18,680
In my opinion, yes. It's only $10, and as much as I hate saying it, I would choose the Sony over the Viewsonic because of the brand name. Is the Viewsonic definitely a flat screen? Which brand of phosphors are they using? Generally speaking, I would go with the Sony for reliability, color vibrancy, and such. But then again, I did pass up a 19" Trinitron for my current Samsung 17" LCD :) How much did they want for the Sony? You can usually get your hands on a nice LCD for $600-800 these days.
 

Rebe7254

Distinguished
May 14, 2002
121
0
18,680
I don't know anything about phosphors. The Viewsonic is definitely a flat screen. I've found the Sony on Pricewatch for $275 including shipping.

AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
 

UoMDeacon

Distinguished
May 14, 2002
126
0
18,680
My recommendation, walk into some computer store, find both of those monitors, and take a look at them. Play around a bit with the settings while you're in the store and pick the one that you think looks the best. It all comes down to what you prefer. Personally I would probably go with the Sony, but preferences may differ.
 

williamc

Distinguished
Mar 8, 2002
837
0
18,980
I'd recommend the Sony.

Damn, someone had a lot of stuff to say on this thread. The whole thing about still having tearing with VSync ON, though, is what was making me think it was the monitor.

Be sure ALL your drivers and your BIOS are fully up to date, man. That should always be the first thing you check.

The itsy bitsy spider climbed up the empires state building, along came goblin, wiped the spider out
 

Clarentavious

Distinguished
May 24, 2002
332
0
18,780
$10 isn't very much to sacrifice (considering what an expensive purchase it is in the first place) :) I don't have very much experience with monitor brand names, so I wouldn't know. As the guy above mentioned, maybe sampling them would be a good idea.

But if the Sony is indeed superior, it sounds like a good idea.
 

williamc

Distinguished
Mar 8, 2002
837
0
18,980
The only monitors I use anymore are Dell 17" and 19" flat-screen Trinitron monitors with .25 or lower dot pitch. They're basically the same as a Sony with the same stats. Very happy with them. Once you've used a flat-screen monitor, you'll never go back.


(Note: for the uninitiated, flat screen and flat panel are not the same thing; flat-screen CRTs are relatively cheap and common now.)

The itsy bitsy spider climbed up the empires state building, along came goblin, wiped the spider out