
Good Score in 3D Mark 2005?

September 3, 2005 9:06:58 PM

What do you all think a good score for this system would be in WinMark 2005? All of the latest drivers are loaded for the chipset and video card. No overclocking, all box settings. I just ran 3D Mark in its stock configuration. Oh yeah, only ONE video card, not SLI.

I ran it last night and only got like 7100. Seems low. If it is, maybe something is not right with my drivers or something.

2 GB (1 GB x2) CORSAIR VALUE SELECT PC3200 set up as dual channel

Edited by butthead on 09/04/05 06:56 AM.


September 4, 2005 12:10:56 AM

I'm not sure about that benchie. Try 3DMark or something, perhaps Sandra.

(='.'=) <A HREF="" target="_new">Welcome to the House of Horrors, welcome to the House of a 1000 Corpses</A>
September 4, 2005 2:55:59 AM

WinMark is 3DMark, as in 3DMark 2005. Sorry about the confusion.

So what do you think?
September 4, 2005 5:25:57 AM

That's low; you should be getting close to 8000 with that setup.

The know-most-of-it-all formerly known as BOBSHACK
September 4, 2005 5:28:07 AM

Are you running 2x 1 GB sticks or 4x 512 MB sticks? If you're running 4x 512, your RAM might be dropping down to DDR333 or DDR400 2T, which could be limiting your score. You might also want to check whether AA or AF is on in the NVIDIA control panel.

The know-most-of-it-all formerly known as BOBSHACK
September 4, 2005 6:52:16 AM

AF and AA are both set to application controlled.
I'm running 2 GB of Corsair Value Select RAM in 1 GB sticks in dual channel.
I can use the 3DMark 2005 profile. Would that help?

Edited by butthead on 09/04/05 06:55 AM.
September 4, 2005 9:13:42 AM

He just told you: check your RAM settings.

<b>It's a man's obligation to stick his boneration in a women's separation; this sort of penetration will increase the population of the younger generation.</b>
September 5, 2005 3:33:45 AM

Most people who are going for high bungholio marks turn AA and AF off and set video settings to the lowest possible.
September 6, 2005 7:30:30 PM

That poor (and yet overly expensive) CPU, being kicked in the nads by value RAM. Poor poor FX57. Maybe the fairy godbunny will give your RAM some heatspreaders and OC the timings.<pre><font color=blue> ∩_∩
(=¥=)</font color=blue> - Cedrik says be kind to your expensive CPU. Don't buy $#!77^ RAM unless you OC it.<font color=blue>
_Ū˘Ū_</font color=blue></pre><p>
:evil:  یί∫υєг ρђœŋίχ :evil: 
<font color=red><i>Deal with the Devil. He buys in bulk.</i></font color=red>
@ 197K of 200K!
September 7, 2005 3:47:57 AM

I've only gotten 3500 with my 3400+ and 6800LE.
September 7, 2005 7:44:08 AM

Well, I talked to a guy who has my exact MB and CPU, but he has some really good OCZ RAM, same as my config with 2x 1 GB sticks, and he also has the BFG GTX. He said he barely got to 7000 also. Then he set his RAM timings to factory specs instead of Auto and turned off PEG mode in the BIOS. He said he hit over 8000 last night.

I think RAM is the problem too. That problem is going to be behind me in about another 12 hours. I called my supplier, and the CS rep let me swap the FX57 for the X2 4800 and the Corsair Value Select RAM for the CORSAIR TWINX2048-3200C2PRO 2GB KIT (1GB x 2) 400MHZ MATCHED PAIR CL2 184-PIN DDR DIMM W/ACTIVITY LEDS & HEAT SPREADER. That RAM was $320.00 for the pair compared to the Value Select at $220. The best thing is that there was no 15% restocking charge, and I came out $100.00 ahead because of that stupid overpriced FX57, which cost me $1095.00 compared to the X2 4800 at $866.00. The FX really does scream when you use one program, though, I'll admit.

So tomorrow I'm gonna set the RAM timings to factory specs and install the new X2 4800 and the new RAM. Then if I don't like the score, I'll return the BFG 7800GT OC and get the GTX with the $100 I saved. I'll probably do that anyway.

New System as of the next time I write:

ASUS A8N Premium
Corsair PRO matched pair 2 GB RAM
X2 4800

Loaded for elephant this time, not pansy bear.

As for 3DMark scores, I couldn't care less. I just use them to make sure my hardware, all things being equal, is running like it should.
September 7, 2005 8:27:53 AM

Just so you know, 3DMark05 is a very GPU-intensive program. Changing to the slower CPU should not affect your score, except in the CPU score.
What I have always found odd is that AMD chips usually win the CPU score but lose the 3D score. So if AMD chips are worse for 3D, why do they do better for gaming?
I think of the score in number of trash cans. At the end of the day, you don't really know how full any of those cans are, or how much they stink.
September 7, 2005 11:08:55 AM

Couldn't have put it better :D 

<i>I pretend to know what I'm talking about.</i>
September 7, 2005 3:07:14 PM

<pre><font color=green> ∩_∩
(=¥=)</font color=green> - Cedrik says that only you can prevent stinking trash cans.<font color=green>
_Ū˘Ū_</font color=green>   And every environmentalist knows to go for the <A HREF="" target="_new"><b><font color=green>green</font color=green></b></A>!</pre><p> :evil:  یί∫υєг ρђœŋίχ :evil: 
<font color=red><i>Deal with the Devil. He buys in bulk.</i></font color=red>
@ 197K of 200K!
September 7, 2005 8:45:01 PM

You've thought way too much about this...


AMD: [64 3000+ (down)][2500+][2400+][2000+][1.3][366]
Intel: [X 3.0x3][X 2.8x2][P4 3.0x2][P4-M 2.4][P4 1.4x5][P4 1.3]

"...and i'm not gay" RX8 -Greatest Quote of ALL Time
September 8, 2005 3:05:43 AM

OK, well I didn't do more extensive testing with the FX57, but oh well. What I did was install the new X2 4800 with the new Corsair RAM, TWINX2048-3200C2PRO, a 2 GB matched memory pair consisting of two CMX1024-3200C2PRO memory modules at DDR400 (200 MHz, 2-3-3). Remember, I had the Corsair Value Select before. I left the latency at Auto in the BIOS. So everything is the same except the processor and RAM.

My average score running 3D Mark for both processors was as follows:

FX57: 7050
X2 4800: 6880

So like the guy said above, 3DMark is very "GPU" intensive, and we see what he expected right here. Doing a little basic arithmetic, we have about a 2.4% drop in performance from the FX57 to the X2. Whoopie doo. Considering I can now create a 4 GB image while I play Half-Life or work in Photoshop, I'm OK with that.
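For what it's worth, the drop quoted here is easy to verify with a couple of lines; a quick sketch using the two average scores above:

```python
# Percentage drop between the two average 3DMark05 scores reported above
fx57_score = 7050
x2_4800_score = 6880

drop_pct = (fx57_score - x2_4800_score) / fx57_score * 100
print(f"{drop_pct:.1f}% drop")  # roughly a 2.4% drop
```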

However, I'm beginning to think this BFG 7800GT OC is not nearly the card the GTX is. I called CS today; they said send it back, no restock, and we'll send you out the GTX.

So as long as my CC gets charged back for the FX57 and other stuff correctly, I'm as happy as Makaveli was when he hit puberty last week. Wusy hasn't hit it yet :( 

So hey, why don't you guys try this: run WinRAR hammering a 1 GB file, then open your favorite game and try playing. I'll take two FX53 CPUs with the new FX/X2 chip architecture any day over the FX57, especially since it's $240.00 cheaper.

But I can tell you something. When my anti-spyware and antivirus software starts scanning when I start Windows, I can see a difference in how long it takes for SpyBot to complete its scan. Not a huge difference, but I can tell. The FX57 was faster at that. It's also faster in WinRAR. But that will all change when software developers make the change to true multithreading.
September 8, 2005 7:28:34 AM

It's at 16 pipes, 5 vertex units,
300 MHz core and 700 MHz memory.
I have crappy memory, but I don't really care, because my last card was an FX5200 and it gets blown away.
September 8, 2005 9:30:35 AM

Too bad you're in Zee land, because I'm sure you'd be interested in finding the capabilities of the P4 engineering sample I'm selling in the Classifieds section.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
September 8, 2005 1:12:33 PM

But that will all change when software developers make the change to true multithreading.

Sorry to break it to you, but that's <i>really</i> not going to change any time soon. (If it ever changes at all.)

Multithreading a program takes a hell of a lot of work to keep the bugs out, requires a crapload of different systems to test for timing errors, makes code a hell of a lot harder to debug and maintain, and so far has an extremely small target market. Other than the few software developers that can actually see an advantage in all of this time, money, and effort for a significant portion of their target market, no one is bothering.

I'd say that you have at <i>least</i> two more years to go before multithreading software is even remotely common.<pre><font color=red> ∩_∩
(=¥=)</font color=red> - Cedrik says and even then it'll be by dragging most<font color=red>
_Ū˘Ū_</font color=red>   programmers in kicking and screaming.</pre><p> :evil:  یί∫υєг ρђœŋίχ :evil: 
<font color=red><i>Deal with the Devil. He buys in bulk.</i></font color=red>
@ 197K of 200K!
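As an illustration of the timing errors mentioned above: two threads incrementing a shared counter will silently lose updates unless every increment is guarded. A minimal Python sketch (the counter and thread count are made up for the demo):

```python
import threading

counter = 0
lock = threading.Lock()

def add(n):
    # each increment is guarded; drop the lock and the two threads'
    # read-modify-write sequences can interleave and lose updates
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=add, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 200000 with the lock; often less without it
```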
September 8, 2005 5:05:19 PM

What's that got to do with your 8500LE?

Where did you get 8500 from?
I have a 6800LE.
September 8, 2005 8:54:21 PM

Some apps are already multithreaded, such as Adobe Photoshop. But from what I have read, you are right. I have also read that linear multithreading is not as hard to code and debug as more complex multithreading, and I think this is what Adobe does, but I'm not sure about that. On the other hand, if multithreading makes a product faster and takes advantage of dual-core CPUs, then competition between companies will dictate that approach. Some form of multithreading is forthcoming, although it may be a couple of years as you state.

In any event, dual core IS the wave of the future, regardless of multithreaded apps. No wonder, too. I mean, I can do all that I've stated above and have no slowdowns.

With a standard test using Radial Blur in Adobe PS CS on a standard test image, for instance, I can do it in 40 seconds on average, while the best I could do with the FX57 was 64. Since I have onboard sound, I tested both with WinAmp running and playing music. The FX went to 149 seconds while the X2 stayed at 40. If the dual core cost more than the single core, then I could perhaps see buying single core; but the X2 4800 was, as I have stated, $240 or so less. I'm really happy with it. It also lets you think differently when working with both sound and video files, since I can have, for example, Acid Pro open and converting a WAV file while working in Adobe Photoshop at the same time with no slowdowns. Great time saver. If you have ever worked in a sound lab of sorts, you know how much time applying filters can take, especially if other programs are using clock cycles. Two FX53s are better than one FX57.
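The "two heavy jobs at once with no slowdown" effect is what process-level parallelism buys on a dual core. A rough sketch using Python's multiprocessing; `heavy_filter` is a made-up stand-in for something like a radial blur or a WAV conversion, not any real app's API:

```python
import multiprocessing as mp
import time

def heavy_filter(n):
    # CPU-bound busywork standing in for a real filter pass
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000, 2_000_000]

    start = time.perf_counter()
    serial = [heavy_filter(n) for n in jobs]
    serial_time = time.perf_counter() - start

    start = time.perf_counter()
    with mp.Pool(2) as pool:  # one worker per core
        parallel = pool.map(heavy_filter, jobs)
    parallel_time = time.perf_counter() - start

    # on a dual-core (or better) machine the pooled run finishes in
    # roughly half the serial time; the results themselves are identical
    assert serial == parallel
```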
September 9, 2005 1:40:44 PM

Of course, anyone could just use CTRL+SHIFT+ESC to set process priorities so that a single-CPU rig runs things smoothly. Multiple cores (or even just Intel's HT) do have their advantages. (It's one of the reasons my main PC is a NWC.) Raw performance just isn't one of them. To each their own, I guess.

Only time will tell how well programmers will take to multithreading. Dual processor boxes have existed forever, and they didn't change anything. Intel has had hyperthreaded P4s for a good while now, and that didn't change anything either. Will dual core boxes change much? We'll see. I'm doubting it though. Not for years and years anyway.

:evil:  یί∫υєг ρђœŋίχ :evil: 
<font color=red><i>Deal with the Devil. He buys in bulk.</i></font color=red>
@ 197K of 200K!
September 12, 2005 4:15:44 AM

Your score sounds like you're throttling to the standard 3D speeds....what are your loaded temps? If those are alright, can you take a SS of your advanced settings in Rivatuner and post it?

No thanks.
September 12, 2005 8:58:10 AM

OK, thanks for responding to this question. I'm not sure what you want me to do here. I installed RivaTuner, but there is no "advanced" setting. There is a "performance user" area, but it's all default and there's nothing to report.

Also, I'm not trying to OC anything here. All I'm trying to figure out is whether my system is running like it should at stock speeds using an X2 4800 CPU and a BFG 7800GT OC card.
September 12, 2005 1:36:54 PM

"Multithreading a program takes a hell of a lot of work to keep the bugs out"

Not to mention there's no benefit to making Word multithreaded. It won't ever be common; it just doesn't make logical sense.

<A HREF="" target="_new"> My Rig </A>
September 12, 2005 4:37:42 PM

Not to mention there's no benefit to making Word multithreaded. It won't ever be common; it just doesn't make logical sense.

You chose a <i>really</i> bad example here. Just off the top of my head, running the spelling and grammar check in a separate thread can make a <i>huge</i> difference when importing a document. (Especially when importing novels.)

Your argument only holds up <i>if</i> the program is just constantly writing down your key presses. But as programs like Word incorporate more and more realtime editing features, they can definitely use more processing power <i>and</i> more threads.

Now something light, like Notepad, <i>that's</i> not worth multithreading.

:evil:  یί∫υєг ρђœŋίχ :evil: 
<font color=red><i>Deal with the Devil. He buys in bulk.</i></font color=red>
@ 197K of 200K!
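For what it's worth, the separate-thread spell check is easy to sketch: a worker thread scans the document and hands misspellings back over a queue while the main thread stays free for typing. A toy Python version (the word list, document, and function names are all invented for the example):

```python
import threading
import queue

KNOWN_WORDS = {"the", "quick", "brown", "fox"}  # toy dictionary

def spell_check(words, results):
    # runs in a background thread so the editing thread stays responsive
    for i, word in enumerate(words):
        if word.lower() not in KNOWN_WORDS:
            results.put((i, word))
    results.put(None)  # sentinel: checking finished

doc = ["The", "quikc", "brown", "fxo"]
results = queue.Queue()
worker = threading.Thread(target=spell_check, args=(doc, results))
worker.start()

# the main thread drains results as they arrive (here, until the sentinel)
misspelled = []
while (item := results.get()) is not None:
    misspelled.append(item)
worker.join()

print(misspelled)  # [(1, 'quikc'), (3, 'fxo')]
```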
September 12, 2005 7:14:55 PM

Ahh, yeah, it's actually in the nV panel (not in RivaTuner; that can be used for LOD tweaking, which destroys IQ but will give you a point boost [not recommended]).

Desktop (right click) -> Settings tab -> Advanced -> GeForce 7800GTX tab -> Performance and quality settings -> View (change to Advanced settings) -> please list these from top to bottom; they need to be detailed (mine are 4x, 16x, HQ, NA, off, trilinear, on, off, single-display mode, off, off, off, off, supersampling, on, allow). Those settings are for the utmost in IQ; changing High Quality to High Performance and turning off AA and AF should get you up around 7300 (my stock score on a 4400+ and a BFG 7800GTX).

No thanks.
September 13, 2005 1:08:31 AM

As I said before, your scores are typical for your system when default settings are used. Most comparisons are done with settings at Performance, not default.
September 13, 2005 5:56:44 AM

Ah, OK. Yeah, it sounds about right, since the above user is getting 7300 with a GTX.
September 13, 2005 6:10:07 AM

Yeah, I know where that is. I thought you meant that, lol. Yep, everything is pretty much as yours is, except I have my slider set to the default setting, which is Quality, and I have a GT, not a GTX. But I'm swapping that tomorrow for the BFG GTX OC.

AA= application
Anis. Filt. = application
Image Setting = Quality
Color Profile = none
Vert Sync = application
Force Mipmaps = none
CTC = on
Ext. Limit = off
Hardware Acceleration = Single
Trilinear = on
Anis. mip filter = off
Anis. Sample Opt. = on
Gamma Correct AA = off
Transp. AA = off
Triple Buff. = off
Neg. LOD Bias = Allow
September 16, 2005 2:25:56 PM

You should see low 7000s with the GT and mid-to-high 7000s with the GTX OC at those settings....

No thanks.