Battlefield 3
Version 1.0.0.0, DirectX 11, 90-sec. FRAPS "Going Hunting"
Test Set 1: Medium Quality Preset, No AA, 4X AF, SSAO
Test Set 2: Ultra Quality Preset, 4X MSAA, 16X AF, HBAO
DiRT 3
Version 1.2, DirectX 11, Built-in Benchmark
Test Set 1: High Quality, No AA
Test Set 2: Ultra Quality, 8x AA
The Elder Scrolls V: Skyrim
Version 1.5.26.05, 25-sec. FRAPS
Test Set 1: High Preset, No AA, 8x AF, FXAA Enabled
Test Set 2: Ultra Preset, 8x AA, 16x AF, FXAA Enabled
StarCraft II
Version 1.4.3.21029, Tom's Hardware Map, 60-sec. FRAPS
Test Set 1: High, DX11, AAA, 4x AF, No DoF, No PhysX
Test Set 2: Very High, DX11, 4x MSAA, 16x AF, DoF, No PhysX
mayankleoboy1: Why are you not increasing the voltages on the GPU to get more clocks? Any enthusiast with a limited budget would want to maximize his core clocks with higher voltages; the card can be kept cool by increasing the fan speed. More noise for a gaming session is acceptable.
Dumping the bulk of our funding into graphics is sure to spell disaster throughout the media encoding and productivity benchmarks. But it's time to face the music.
Slomo4shO: Would have liked to see Diablo 3 and SC2 benchmarks for this build.

I can't give you exact fps rates, but my machine is very similar to this one (the only differences are the GPU, a 6950, and a Z68 board), and I get similar frame rates in all the tested games. So here's a rough idea of what this rig would probably get.

Diablo 3 maxes out at 60 fps, with occasional dips down to ~30 fps when getting mobbed on Hell difficulty. As for SC2, frame rates for me tended to be around 35 fps on average. Both games were maxed out at 1920x1080.
s3anister: Celeron G530 is what I'm rocking in my gaming rig. It is definitely a capable processor, surprising given the legacy behind anything labeled Celeron.

Ah, but think way back... Slot 1, 440BX, and the Celeron 300A? I had a 266 @ 412 MHz, a 300A @ 464 MHz, a 300A @ 450 MHz, and a 333 (which topped out at only an 83 MHz FSB). While not the first chips I had overclocked, those Slot 1 Celerons gave me the incurable OC bug! *dreams of G530K*
Amazing! I never thought an SBM machine would ever come this close to my own rig. It confirms for me that my drooling over $200+ graphics cards is not an impractical fantasy for my current rig. I've been dreaming of retiring my old GT 240 for a newer card and had the HD 7850 (or the comparable Nvidia counterpart, when it comes out) in mind, or even an HD 7770. I now feel justified, and my wife will go nuts over the PC part purchase, again.
I did notice one thing when I compared this build with my system: mine idles at 48-52 watts, too, and I use a 500 W S12II. I think right-sizing the PSU would add to the efficiency. A 350 W PSU is my bet for bringing the idle power draw closer to the 20%-of-rating mark where efficiency starts to pick up (as per 80 Plus requirements). I say 350 W because whoever gets this will likely want to upgrade the CPU to something beefier sooner or later. Nah, sooner!
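To make that arithmetic concrete, here is a minimal Python sketch of the load-fraction calculation behind the comment. The ~50 W idle figure comes from the comment above; the 80 Plus program specifies its efficiency targets at 20%, 50%, and 100% of rated load. The function name and exact wattages are illustrative, not from the article.

```python
def load_fraction(draw_watts: float, psu_rating_watts: float) -> float:
    """Fraction of the PSU's rated capacity that a given draw represents."""
    return draw_watts / psu_rating_watts

# ~48-52 W idle, as reported above. Note this is a wall-socket reading,
# so the DC-side load the PSU actually delivers is somewhat lower.
IDLE_DRAW_W = 50.0

# 80 Plus certifies efficiency at 20%, 50%, and 100% of rated load;
# a draw far below the 20% point sits outside the certified range,
# where efficiency typically falls off.
for rating in (500.0, 350.0):
    frac = load_fraction(IDLE_DRAW_W, rating)
    print(f"{rating:.0f} W PSU: idle is {frac:.0%} of rated load")

# Output:
# 500 W PSU: idle is 10% of rated load
# 350 W PSU: idle is 14% of rated load
```

Neither sits at the 20% test point, but the smaller unit gets the idle load meaningfully closer to it, which is the commenter's argument for right-sizing.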
Thanks, Paul, for tackling love and system-building with reckless abandon.
I've got a question: if I were to use a Phenom II X4 965 BE (3.4 GHz) for a gaming rig on a similar budget to this, would it bottleneck me in gaming and other applications?
mayankleoboy1: Why are you not increasing the voltages on the GPU to get more clocks? Any enthusiast with a limited budget would want to maximize his core clocks with higher voltages; the card can be kept cool by increasing the fan speed. More noise for a gaming session is acceptable.

As mentioned, it was mainly a matter of consistency with the past few builds. Dealing with fixed CPU clocks and memory frequency, I just haven't been too aggressive in previous efforts with Radeons, and thus didn't want to boost voltage here with the GTX. Trying to play it fair, that's all. Maybe once we revisit overclockable platforms, and are already dealing with increased noise, I'll get itchy to max out the GPU.
Thanks for the feedback, though. I'm actually surprised, given the balance of the system, that people would want to see aggressive GPU overclocking.
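For anyone who does want to experiment with a voltage-assisted GPU overclock, a sensible precaution is to log clocks, temperature, and fan speed while a stress test runs. Below is a minimal Python sketch using nvidia-smi's query interface (the utility ships with Nvidia's drivers); the field selection and 2-second polling interval are illustrative choices, and the actual voltage/clock adjustment would still happen in a vendor tool such as MSI Afterburner, which this sketch does not attempt.

```python
import subprocess
import time

# Telemetry fields exposed by nvidia-smi's --query-gpu interface.
# These are the values that matter when validating an overclock at
# a higher fan speed; older cards may report "[Not Supported]" for
# some fields (e.g., power.draw).
FIELDS = "clocks.gr,clocks.mem,temperature.gpu,fan.speed,power.draw"

def sample() -> str:
    """Return one CSV line of GPU telemetry from nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    # Poll every 2 seconds while a game or benchmark loop runs in
    # another window; stop with Ctrl+C and inspect the log for
    # throttling or runaway temperatures.
    print(FIELDS)
    while True:
        print(sample())
        time.sleep(2)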