gf2 gts/pro 3dmark2001se + gf4 ti4200 128ddr

kai_kai

Distinguished
Mar 29, 2002
40
0
18,530
Hello,

My system runs a 2.2 GHz CPU on an Asus P4S533 mobo with 256 MB PC2700 and a GeForce2 GTS/Pro 32 MB DDR, and I got a score of 4358:
---> http://service.madonion.com/compare?2k1=3688907
I'm wondering whether this result is really low and whether it could be much better with some additional settings or tweaking?

I also got a GeForce4 Ti4200 128 MB DDR recently. I've heard it's clocked a bit slower than the 64 MB version - is this true? Its default is 250/446. How far can I push this card? Up to default Ti4600 speeds, maybe? Is that possible?
I'm going to overclock this GF4 Ti card soon, so please let me know anything I can do to make it go as wild as possible.
Thanks a lot!
 

HonestJhon

Distinguished
Apr 29, 2001
2,334
0
19,780
hmm...
On my Athlon 900 (overclocked to 1.07 GHz, 8x133), with an MSI GeForce2 Pro (clocked to 225/451 on stock cooling), I got 4,302 3DMarks in 3DMark 2001 SE.
<A HREF="http://service.madonion.com/compare?2k1=3658673" target="_new">Here</A> is my compare URL...
It shows you what the benefits of overclocking are. Just make sure you have REALLY GOOD case ventilation, because if you don't, you will run into problems and might not be able to get it as high as I did.
I got an extra 300 points by overclocking the video card.
I still have some RAM sinks to throw on the card, but I need to cut them in half, and that would MAYBE let me push the RAM speed even higher...
Check the ns rating on the RAM chips themselves. I have 5 ns DDR; you MIGHT have faster - I think I remember there being some Pro cards out there with 4.5 ns RAM.
I suggest RivaTuner or the Coolbits registry hack to enable overclocking.
Also, the newest NVIDIA drivers, 29.42, lowered my 3DMark 2001 SE scores slightly, even when I overclock.
I hope that helps you out.
Oh yeah, RivaTuner can be used with all NVIDIA chips, from the TNT on, I think.
Good luck, and remember to take it up one notch at a time, test, and then up it another notch.
Only use the hardware overclocking available in RivaTuner if you really don't mind the risk; I used the software overclocking.


-DAvid

-Live, Learn, then build your own computer!-
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
Yes, it is clocked lower, at 250/444 core/mem. You should easily be able to get it to 250/500, though. It's highly unlikely you'll get it to Ti4600 levels unless you use some sort of cooling other than air. I've seen a Ti4200 perform better than a Ti4400, but not quite at Ti4600 levels. Use PowerStrip or RivaTuner, two great overclocking utilities.

My firewall tastes like burning. :eek:
 

kai_kai

Distinguished
Mar 29, 2002
40
0
18,530
Hey guys, thanks for all your input - it really clears up my questions. I finally used RivaTuner to overclock my GF2 GTS card to 200/400 and the GF4 to 250/500. So far I haven't noticed anything unstable or weird happening, so I'll let them run for a while longer on my two systems and post the results again if they still work well. Thanks again! =D
 

starvinmarvin

Distinguished
Jun 17, 2002
90
1
18,665
Just one or two points here. In a couple of other forums I've visited, nVidia's Coolbits registry tweak/hack has been claimed to actually work better than PowerStrip/RivaTuner/etc. at identical overclock settings!! I can't confirm that, but I can confirm that Coolbits has worked very reliably for my GF2 MX and my new GF4 Ti4200. All it does is add a new tab - Clock Settings - to the Display Settings window where you adjust your D3D and OpenGL settings. If you do a search online you can download Coolbits as a small file which you then double-click to merge into the registry, and the new tab shows up in Display Settings.
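For what it's worth, the Coolbits file that gets passed around is usually just a tiny .reg file along these lines. I'm going from memory here - the key path below is the one usually quoted for NVIDIA's Detonator drivers of that era, and the exact DWORD value can vary between driver releases - so treat it as a sketch and back up your registry before merging it:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
; 3 is the value usually quoted for unlocking the clock controls
"CoolBits"=dword:00000003

Double-clicking a file with that content adds the value; deleting the CoolBits value removes the extra tab again.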
As far as overclocking goes, the card will run HOTTER and that will SHORTEN ITS LIFE, but hey, we all love high benchmark scores (me too), so go for it if long-term stability and replacement cost don't worry you.

The real use for overclocking is to get demanding 3D games to run smoother, without being jerky/choppy/hiccuppy (is that a real word?). With your older GF2 card this could really help in some games. But where is the advantage of playing a game at 120 frames per second (overclocked) versus 104 frames per second (not overclocked)? If you can't tell any difference in gameplay quality, why cook the card? On the other hand, maybe you want that game to run with 4X anti-aliasing at a higher resolution... whoa! Your framerate just dropped from 104 fps down to about 50 fps (or whatever). NOW an overclock could maybe lift the framerate back to a smoother, more playable 60 or 62 fps.

My new GF4 Ti4200 128MB is a Gainward Golden Sample card. I have ONE GAME where using Gainward's "enhanced" setting (520 MHz memory clock instead of the standard 444 MHz) does improve gameplay on my P4 1.6GHz Dell; otherwise I stick with the "standard" settings. Gainward's own software window also has a cool digital zoom feature to enlarge any desired screen area - a nice feature for certain situations...

Regards
 

starvinmarvin

Distinguished
Jun 17, 2002
90
1
18,665
There is a nifty little program called FRAPS17.exe, or simply FRAPS (search online for the free download, or I can send it to you), which displays frames per second while you are playing a game. You can place the counter in any corner of the screen and check frame rates in just about any game. V. cool!!

Regards
 

kai_kai

Distinguished
Mar 29, 2002
40
0
18,530
OK, I overclocked my CPU to 2.4 GHz and it now runs rock solid - very stable, I think. But overclocking the video card (the GF4 Ti4200) makes the screen choppy, so for now I've put it back to the default speed of 250/444. My problem is this: a system with the CPU at 2.4 GHz and a GeForce4 Ti4200 only scored 7272 in 3DMark 2001 SE. Frankly, I'm really annoyed, because when I ran the 3D accelerator comparison, the average for a 2.4 GHz system with a GeForce4 Ti4200 is around 11200! That's about a 4000-point difference - how could I not be upset?! *calm down* All I did was install the 29.42 driver; I didn't even touch the display settings. Oh yes, I did enable 2x antialiasing before benchmarking.
http://service.madonion.com/compare?2k1=3768335
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
AA really drags performance down. Turn AA off and see what happens - the benchmark's default is no AA, so the scores you're comparing against probably have AA disabled.

What's the deal with lampshades, I mean it's a lamp, why would you want a shade? :smile:
 

starvinmarvin

Distinguished
Jun 17, 2002
90
1
18,665
Yup, got it, thanks! Great little prog, that FRAPS. It's great for checking out performance with various game settings so you can find the best setup, or for troubleshooting.
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
LMAO!

<b>Question:</b> Do you have a link for FRAPS?

<b>Answer:</b> Yup, got it, thanks !

That's what I get for being lazy!

For everyone else FRAPS can be downloaded at <A HREF="http://www.fraps.com" target="_new">WWW.FRAPS.COM</A>.

<b>I have so many cookies I now have a FAT problem!</b>
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
Well, I downloaded the program and tried running Counter-Strike on the new video card, and it constantly stays at a stable 74 FPS at 1024x768x32. I probably have my refresh rate at 75 Hz? Do you know how to raise it? The specs of my monitor (Trinitron 19") say it can go higher than that...

Unfortunately I suck at the game and keep dying after a few minutes... It's a toughie.

What's the deal with lampshades, I mean it's a lamp, why would you want a shade? :smile:
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I don't play online games (not much fun with a dial-up connection). Isn't 74 fps pretty good for Counter-Strike?

Windows uses the .inf file for your monitor to get its characteristics. The monitor should have shipped with a disk containing the most recent .inf file, which might be better than anything Windows has on file. If you haven't already installed that .inf file, you should.

You can also try PowerStrip which reads the PnP information directly from the monitor and sometimes finds extra capabilities. It's worth a shot.

If neither of the above helps and you are sure the monitor has the extra capability, you can fool Windows by selecting a different monitor in Display Properties. Choose one that has a higher refresh rate at the desired resolution. I don't think Windows needs to reboot when you change monitors; if you are prompted to, don't reboot. Confirm that the new refresh rate actually works. If it does, you will want to make sure Windows doesn't reselect your real monitor on the next restart. To do this, locate the option to disable auto-detection of Plug 'n Play monitors somewhere in Display Properties. Windows will then assume the new monitor is real and let you change properties accordingly.

Keep in mind, if you choose to fool Windows in this way be sure the monitor can actually do it. <b>Using the wrong refresh rate can damage the monitor</b>.

Lastly, you can actually edit the .inf file for your monitor so that the file includes the correct capabilities of the monitor. I've never tried this but I imagine you can cut and paste the information from another monitor's .inf file into yours.


<b>I have so many cookies I now have a FAT problem!</b>
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
Yeah, I went into PowerStrip and it said my refresh rate was 75 Hz, so I put it up to 85 Hz, because my monitor can take it. Thanks for the help! I'll see if I can do 85 FPS now.

What's the deal with lampshades, I mean it's a lamp, why would you want a shade? :smile:
 

baldurga

Distinguished
Feb 14, 2002
727
0
18,980
Disable vsync in DirectX and OpenGL. Otherwise the monitor refresh rate limits your graphics card. This also applies when running 3DMark2001, if I'm not mistaken.

Note: some strange effects can happen when vsync is disabled. In that case, just disable it for benchmarking purposes and enable it again when playing.

DIY: read, buy, test, learn, reward yourself!
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
Hey, the reason FRAPS shows a constant 74/75 fps when you run CS is that vsync is on. Vsync ties every frame to your monitor's refresh rate, and your monitor is obviously refreshing at 75 Hz, so the frame rate can never go above that. Use RivaTuner, or just nVidia's control panel, to force vsync off in OpenGL or D3D, whichever you're using for CS - for any Half-Life game, OpenGL is always faster.

The first LAN party I went to was at a PETA convention. They booted me when I shot a crab in HL!
 

HonestJhon

Distinguished
Apr 29, 2001
2,334
0
19,780
The weird effect is called tearing...
It looks like the screen is tearing.
Most of the time you will barely notice it, if at all...
So it isn't really an issue for most people.
I always have vsync disabled... :smile:


-DAvid

-Live, Learn, then build your own computer!-
 

Matisaro

Splendid
Mar 23, 2001
6,737
0
25,780
Quoting chuck232: "It's highly unlikely you'll get it to Ti4600 levels unless you use some sort of cooling other than air. I've seen a Ti4200 perform better than a Ti4400, but not quite at Ti4600 levels. Use PowerStrip or RivaTuner, two great overclocking utilities."

Most Ti4200s can easily get their cores to Ti4600 levels; the RAM is the issue. My 4200 is running at 330/580, which is 30 MHz more core than a Ti4600 and 70 MHz less RAM. It averages slightly under a Ti4600 in benchmarks.

:wink: The Cash Left In My Pocket,The BEST Benchmark :wink:
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
Exactly, it's not quite up to Ti4600 levels - wait I should rephrase that. It's not quite up to Ti4600 performance levels. There. That's a lot clearer.

:smile: Falling down stairs saves time :smile:
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Quoting chuck232: "Exactly, it's not quite up to Ti4600 levels - wait I should rephrase that. It's not quite up to Ti4600 performance levels. There. That's a lot clearer."
Still, pretty d*mn good at only 60% of the cost of a Ti4600.

<b>I have so many cookies I now have a FAT problem!</b>

Edited by phsstpok on 06/30/02 09:40 PM.