That last bit of Nvidia fanboyism wasn't helpful, mousemonkey. At least ATI is honest and doesn't fudge drivers the way Nvidia did with Crysis (i.e. not displaying the water the way it's meant to be played just to get a few extra fps).
Quanger, benchmarks can be tricky guides and results will vary. It depends on what CPU you have as well as what GPU. Note that most of the benchmarks by Tom's, Anandtech etc. are done using quad cores. Sometimes it's the Q6600, but more recently the latest Intel EE.
FRAPS benchmarks are averages anyway, and the numbers can vary from session to session; Tom's and Anandtech rarely average results across several sessions. [H] tries to do "real world" game tests, but they often pick better drivers and settings for their favored Nvidia cards than for ATI.
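To show what averaging across sessions means in practice, here's a minimal sketch. The fps numbers are made up for illustration, not real benchmark data:

```python
# Averaging FRAPS-style results across several sessions instead of
# trusting a single run. All fps values below are invented examples.
sessions = [
    [46.2, 44.8, 45.5],  # per-run average fps, session 1
    [39.9, 41.3, 40.6],  # session 2, e.g. a more demanding area
]

def session_average(runs):
    """Mean fps across the runs recorded in one session."""
    return sum(runs) / len(runs)

per_session = [session_average(s) for s in sessions]
overall = sum(per_session) / len(per_session)
print(per_session, overall)
```

A single session can land well above or below the overall figure, which is why one published number from one cut scene doesn't tell you much about real gameplay.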
I'm CPU limited until I get a new LCD monitor this Friday, but I still get up to 60 fps at 1024 x 768 in The Witcher. It drops down to 19 fps in some intense combats, though it averages 30-40. Anandtech measured 46 fps in The Witcher using the first in-game cut scene at 1680 x 1050. I've noticed that I get more fps in cut scenes than I do in gameplay where there are quite a few monsters in a group.
That shows me that Crossfire is recognized in that game, whereas with LOTR Online I get 42 fps max and it can drop to 20 fps in DX10 mode under Vista. When I get to play on the new 20" LCD, I won't be as CPU limited. When I get a Phenom 9750 in May on a Crossfire board, then I definitely won't be CPU limited.
As for AF, I found that I could only get 20 or so fps in LOTR Online with 16x AF, yet a lower AF setting looks just as good, with very sharp detail. So if you don't like less than 50 fps, cut down on the AF. The next generation of ATI cards is supposed to do better with AA and AF. Lastly, some games just don't recognize Crossfire or SLI at all.
So you are probably doing okay, given that you don't have a quad core and you are getting real world gameplay. Note that your Athlon X2, while better than mine, is still a weak CPU by today's standards. If you can get a decent average fps with everything enabled at 1680 x 1050, then what's to complain about?