That's exactly what I was thinking. It's like the system hit a speed bump at that resolution for a nanosecond and screwed stuff up. I have no idea if they tried, but maybe they should have run that benchmark twice or a few more times. I'm not implying that Intel should have won; it's just odd that at higher resolutions AMD takes more ground and wins one, but then at the next higher resolution it loses?
The IMC being more about latency than bandwidth sounds like a very likely cause... but if you compare the results on the 8800 GTS, the tables turn, so I suspect it has more to do with the tricky nature of benchmarking Oblivion, or other margin-of-error factors, than anything particularly telling.
IMO, the article actually supports the old adage that the CPU isn't very important for gaming. While the frame rates at 1024x768 shot up on the Conroe, they were all well over 100 anyway... except in the one test where the FX-60 was able to keep up. I haven't scanned all of the results thoroughly, but I doubt there's any case where upgrading to a Conroe affected the playability of the game.
This doesn't totally shoot down CPU upgrades, though. There is the belief (one I personally agree with) that while CPUs may not matter much for average frame rates, they are critical for "smooth" gameplay. I'd be very curious to see the results with only minimum frame rates displayed. Also, even with averages shown, the FX-60/8800 GTX/outdoor Oblivion test was clearly CPU-limited at 1024x768, not even clearing the 60 fps mark. This suggests that CPU demand isn't far behind GPU demand, and that much of the next generation of games will take advantage of a Conroe.
EDIT: All that said, I'm very happy with my 30% overclocked 4200+ and 8800 GTX combo and feel no need to upgrade my CPU.