I recently purchased a new system: a Sapphire Radeon X1900XT 256 MB, an E6300, a Gigabyte GA-965P-DS3, a WD hard drive, and 2 GB of 667 MHz RAM. It "works" fine, so to speak, cooling is OK, and I score a decent (I think?) 5050 in 3DMark06 at the default resolution. However, Everest shows a GPU clock of 500 MHz and a RAMDAC clock of 400 MHz, while the newest Catalyst shows a 500 MHz core clock and a 594 MHz memory clock. Not only do these two show different speeds, but they are also the wrong speeds for this card (it should be 625 and 1450). Other programs, such as ATITool, don't even display the clock speeds at all.
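(A side note on the 594 MHz reading, in case it helps: most monitoring tools report the physical memory clock, while card specs quote the DDR "effective" rate, which is twice that because GDDR3 transfers data on both clock edges. Even doubled, 594 MHz only comes to 1188 MHz, still below the rated 1450, which is consistent with the card sitting at its lower 2D/idle clocks rather than being misconfigured outright. A minimal sketch of the arithmetic, assuming the tools are reading the physical clock:)

```python
# Sketch: why a monitoring tool's memory-clock reading can look "wrong".
# GDDR3 transfers data on both clock edges, so the effective rate quoted
# in specs is double the physical clock most tools display.

def effective_ddr_rate(physical_mhz: float) -> float:
    """Return the DDR effective rate for a given physical clock."""
    return physical_mhz * 2

# Catalyst's 594 MHz reading corresponds to:
print(effective_ddr_rate(594))  # 1188.0 -- still below the rated 1450
```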
I hope someone can help me figure out what the hell is wrong... thanks.
Not always. I used an on-screen display while playing for a couple of days, and not once did the clock speeds go up to 625/1440 in-game. I actually had to go into ATI Tray Tools and set the auto-overclocking feature myself so the card would finally run at the correct clocks. That raised my fps and 3DMark score right away. I don't know if it's just my card or the Omega drivers I was using.
Well, since I already posted a topic... It's probably a very annoying question, but is 5050 a good score? (E6300 / 2 GB RAM, no overclock.) And, in any event, how can I make sure I get the proper clock speeds in 3D applications? Far Cry doesn't perform too well (it sometimes drops to 30-40 fps at 1280x1024 with 4xAA and all settings maxed, or perhaps that's what I should expect?)