I actually thought tearing was normal unless you turned v-sync on, especially on high-end computers. People running benchmarks always turn v-sync off, because what v-sync does is make the card wait for the monitor's refresh before it displays another image. That holds the framerate down to a rate the monitor can display correctly and smoothly; if the card doesn't wait for the monitor's refresh, the image tears. This is true with almost any monitor, because plenty of people can run Quake 3 at 200+ frames per second, but I don't know of any monitor that can refresh at 200Hz.
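If it helps, here's what that wait looks like in a render loop. This is just a conceptual sketch in C with made-up stubs (render_frame, wait_for_vblank, and swap_buffers stand in for whatever your graphics API actually calls them), not anyone's real engine code. The only thing v-sync changes is the wait before the buffer swap:

    /* Minimal sketch of a double-buffered render loop. The stub
     * functions are hypothetical stand-ins for a real graphics API. */
    #include <stdio.h>

    static void render_frame(void)    { /* draw into the back buffer (stub) */ }
    static void wait_for_vblank(void) { /* block until the refresh finishes (stub) */ }
    static void swap_buffers(void)    { /* make the back buffer visible (stub) */ }

    int main(void)
    {
        int vsync_on = 1;

        for (int frame = 0; frame < 5; frame++) {
            render_frame();
            if (vsync_on)
                wait_for_vblank(); /* swap only between refreshes: no tearing */
            swap_buffers();        /* swapping mid-refresh splices two frames
                                      onto the screen at once: that's the tear */
            printf("frame %d displayed\n", frame);
        }
        return 0;
    }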
Turning off v-sync when you already have bad framerates hasn't really been shown to improve performance much, either. I would only be concerned if turning v-sync on makes the framerates unplayable. Otherwise, the tearing is a good sign: it means your computer is fast enough to put out more frames per second than your monitor can handle, which is true of just about any monitor on the market. Just keep v-sync on. Even an older 15" monitor should give you a decent image as long as v-sync is turned on.
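To see why v-sync can make an already-low framerate unplayable, here's a rough model in C. It assumes plain double buffering, where every finished frame has to wait for the next refresh, so the displayed rate snaps down to refresh/2, refresh/3, and so on (triple buffering avoids this, but assume we don't have it):

    /* Rough model of v-synced framerates under plain double buffering. */
    #include <stdio.h>
    #include <math.h>

    static double vsynced_fps(double refresh_hz, double raw_fps)
    {
        if (raw_fps >= refresh_hz)
            return refresh_hz;                   /* capped at the refresh rate */
        double refreshes_per_frame = ceil(refresh_hz / raw_fps);
        return refresh_hz / refreshes_per_frame; /* snapped to an even divisor */
    }

    int main(void)
    {
        double refresh = 60.0;
        double raw[] = { 200.0, 60.0, 55.0, 35.0, 25.0 };

        for (int i = 0; i < 5; i++)
            printf("raw %5.0f fps -> displayed %5.1f fps at %gHz\n",
                   raw[i], vsynced_fps(refresh, raw[i]), refresh);
        return 0;
    }

Note what happens at 55 raw fps on a 60Hz monitor: the displayed rate drops all the way to 30, which is exactly the "v-sync makes it unplayable" case I'm talking about.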
Edit: to test whether what I'm suggesting is true, try running your most demanding game at a very high resolution (as high as your monitor will let you) with all details on. If the tearing disappears, then yes, your monitor is perfectly fine, and v-sync is what you need. The tearing disappears because your computer is now churning out fewer frames per second, so the monitor can refresh as fast as the screen updates are coming and keep up. V-sync is, after all, for when monitors cannot keep up.
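Here's a back-of-the-envelope version of that experiment, assuming the game is purely fill-rate limited (real games aren't exactly, but the trend holds). The 110fps-at-640x480 figure is made up for illustration:

    /* Estimate how fps falls off with resolution for a fill-rate-bound game. */
    #include <stdio.h>

    int main(void)
    {
        double fill_rate = 640.0 * 480.0 * 110.0;  /* pixels/sec the card can push */

        int res[][2] = { {640, 480}, {800, 600}, {1024, 768}, {1600, 1200} };
        for (int i = 0; i < 4; i++) {
            double fps = fill_rate / (res[i][0] * res[i][1]);
            printf("%4dx%-4d -> ~%5.1f fps\n", res[i][0], res[i][1], fps);
        }
        return 0;
    }

At 1600x1200 this hypothetical card only manages about 17fps, well under any monitor's refresh rate, so the tearing should vanish.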
Think of it this way. Every second that passes while you are watching your screen is like a bunch of cakes coming down a conveyor belt, and the number of cakes that come through represents the number of fps your computer can generate. The monitor's maximum refresh rate is the number of cakes that a single taste-tester, standing by the conveyor gobbling one cake after another, can eat per second. So if your monitor's maximum refresh rate at 640x480 is 100Hz, that means this guy can eat 100 cakes per second. Now say 110 cakes come through per second. Well, the tester can't eat that fast! He can still only do 100, remember? So he eats 100 cakes, and the other 10 get shoved at him mid-bite: those are frames that don't get displayed correctly, spliced into frames already on screen, and they cause the streaking effect you describe. What v-sync does is check how many cakes per second the taste-tester can eat at each particular resolution. At 800x600 it may be 85Hz, so v-sync tells the graphics card to output no more than 85 frames per second, and the picture appears smooth.

And since the human eye cannot distinguish individual frames at 85fps, it doesn't matter whether the computer itself can churn out 500fps or even 5000fps; the picture will look equally smooth at any of these framerates. The TCO standard for a refresh rate the human eye cannot detect as flicker is 75Hz, or 75 frames per second. In practice, most people cannot tell the difference between 60fps and 75fps, and below 60fps most people start to notice individual frames. This is still being debated, but think of 75 as the "safety" zone that TCO certifies. Basically, I can tell you for sure that anything 75fps and above is all going to look the same. I would say the same for 60fps and above, but people would probably disagree with that.
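The cake math, spelled out (these are the numbers from the analogy above, nothing measured):

    /* The cake analogy in code: frames produced per second vs. frames the
     * monitor can actually show. Anything over the refresh rate either
     * tears (v-sync off) or is never rendered at all (v-sync on). */
    #include <stdio.h>

    int main(void)
    {
        int refresh_hz = 100;              /* cakes the taste-tester can eat */
        int fps[] = { 60, 100, 110, 500 }; /* cakes coming down the belt     */

        for (int i = 0; i < 4; i++) {
            int shown  = fps[i] < refresh_hz ? fps[i] : refresh_hz;
            int excess = fps[i] - shown;
            printf("%3d fps: %3d shown cleanly, %3d torn or dropped per second\n",
                   fps[i], shown, excess);
        }
        return 0;
    }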
Going back to the main subject, the reason reviewers always turn off v-sync is that they are measuring the framerates a graphics card is capable of. The only reason for doing this is to give the potential buyer a gauge, a compass, a watch: an estimate of how long this graphics card will keep working before it becomes obsolete. We don't care about 300fps in Quake 3, no, but we do care if it means that a year and a half from now Doom 3 will run at 60fps on the same system, whereas a system that only scored 100fps in Quake 3 will only run Doom 3 at 20fps. I say all of this because it's become a popular idea that people can actually see 300fps, and that it's a sign of how good your system is. Well, the first part is false, but the second is true. You can't see 300fps any differently than you can see 75fps, but it does mean your system is good, because it should have a longer life expectancy for running the newest games. FPS really isn't that important as long as you can get playable framerates. After all, a game you play on your Xbox, PlayStation 2, or GameCube is only running at about 30fps max, because that's the fastest a non-HDTV display (a traditional shadow-mask TV) can show a full frame.
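If you want the reviewer's logic in code form, here it is. The big assumption is that the performance ratio between two systems today roughly carries over to future games; the numbers are the hypothetical ones from above, not real benchmark results:

    /* Project a future game's fps from today's benchmark ratio,
     * assuming relative performance between systems stays constant. */
    #include <stdio.h>

    int main(void)
    {
        double q3_fast = 300.0, q3_slow = 100.0;  /* today's Quake 3 scores    */
        double d3_fast = 60.0;                    /* guessed future Doom 3 fps */

        /* apply the same 3:1 ratio to the slower system */
        double d3_slow = d3_fast * (q3_slow / q3_fast);
        printf("fast system: %g fps, slow system: %g fps in the future game\n",
               d3_fast, d3_slow);
        return 0;
    }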