Using Powerstrip I was able to play a bit with overclocking the MAXX but ran into some interesting issues. After a few minutes of testing, it seemed to run into frame-syncing problems no matter what setting I used. The memory handled the overclocking just fine, but adjusting the core speed caused this unfortunate problem. I guess we'll have to see what ATI does to fix it. Although they don't condone overclocking, I'm sure they won't want to upset those customers who do want to push the limits of their card. I'll have to talk with them about this and get back to you. Who knows, maybe ATI will release a driver with an overclocking utility that adjusts everything necessary to keep things "in sync" while overclocking. Of course, there's another possible explanation for this problem: it could be that Powerstrip is only overclocking one chip instead of both. Maybe Ashley, the magical man behind the great Powerstrip, can shed some light on this issue for us as well.
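To see why clocking only one of the MAXX's two chips could wreck frame pacing, here's a little thought experiment in code. This is purely my own sketch, not how Powerstrip or ATI's driver actually works: it models the MAXX's alternate-frame-rendering scheme, where the two chips take turns producing frames, and shows what happens to frame-to-frame delivery when one chip suddenly runs faster than the other. All the millisecond figures are made up for illustration.

```python
def afr_frame_intervals(ms_a, ms_b, frames=8, stagger=None):
    """Frame-to-frame presentation intervals (ms) for two chips that
    alternate frames, as on the Rage Fury MAXX. ms_a/ms_b are how long
    each chip takes to render one frame."""
    if stagger is None:
        stagger = ms_a / 2          # ideal half-frame head start for chip B
    ready = [0.0, stagger]          # earliest time each chip can finish setup
    cost = [ms_a, ms_b]
    last_present, intervals = 0.0, []
    for i in range(frames):
        chip = i % 2                # even frames on chip A, odd on chip B
        done = ready[chip] + cost[chip]
        ready[chip] = done
        present = max(done, last_present)   # frames must appear in order
        intervals.append(present - last_present)
        last_present = present
    return intervals

# Matched chips: after the first frame, delivery settles to an even cadence.
print(afr_frame_intervals(20.0, 20.0))
# One chip overclocked (as if the utility only touched a single core):
# intervals become uneven -- the visible "frame-syncing" stutter.
print(afr_frame_intervals(15.0, 20.0))
```

The point of the toy model: since frames must be shown in order, the faster chip ends up waiting on the slower one, so you gain little speed and get jerky pacing in the bargain, which lines up with what the frame-sync glitches looked like in testing.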
I expect to see some pretty kick-butt numbers in any of the tests at higher resolutions, thanks to the raw fill-rate power of the MAXX. Unfortunately, strong performance at low resolutions won't be seen until ATI has a little more time to optimize their driver. High-color, high-resolution settings are where I expect the MAXX to pull the rug out from under the GeForce SDR. In theory the DDR GeForce should fall to the MAXX too, but my gut feeling is that it won't. One last note: keep in mind that in T&L tests, the MAXX will only do as well as the CPU it's coupled with. In our tests we stick with a very common CPU, the PIII 550, which won't fare well against the dedicated T&L engine in the GeForce cards. Let's put the MAXX to the test.
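Before we do, the fill-rate argument can be made concrete with some back-of-envelope math. The specs below are the commonly quoted figures of the era, not measurements from this review, so treat them as assumptions: each Rage 128 Pro runs a 125 MHz core with 2 pixel pipelines, and the GeForce 256 runs 120 MHz with 4 pipelines.

```python
# Theoretical peak fill rate in megapixels/sec: core clock (MHz) times
# pixel pipelines times number of chips. Period-quoted specs, assumed.
def fill_rate_mpix(core_mhz, pipelines, chips=1):
    return core_mhz * pipelines * chips

maxx    = fill_rate_mpix(125, 2, chips=2)   # dual Rage 128 Pro
geforce = fill_rate_mpix(120, 4)            # GeForce 256 (SDR or DDR)
print(maxx, geforce)   # 500 vs 480 Mpix/s on paper
```

Note that the SDR and DDR GeForce boards share the same core, so their theoretical fill rate is identical; the DDR board's extra memory bandwidth is what lets it keep feeding those pipelines at 32-bit color, which is exactly why my gut says it will hold up against the MAXX even where the SDR board doesn't.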