How are people noobs? Everyone knows 3DMark06 is CPU-reliant, so I linked a review of his CPU and others in the same market niche from Intel and AMD. That provides real-world data, not a synthetic score.
I should have mentioned that resolution affects CPU bottlenecks. At higher resolutions the workload shifts toward the GPU, so a slower CPU matters less and the card can reach its full potential. The 9600GSO is a weak card compared to an 8800GT, but it's not all that bad for games. It's not like it's an 8400GS or a 3650.
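To see why the bottleneck shifts with resolution, here's a toy frame-time model: a sketch only, with made-up numbers (the 18 ms CPU time and 12 ms GPU time are hypothetical, not measurements). It assumes CPU work per frame is roughly constant while GPU work scales with pixel count.

```python
# Toy CPU-vs-GPU bottleneck model. All timings are illustrative assumptions,
# not benchmark data. GPU time is assumed to scale linearly with pixel count.

def fps(cpu_ms, gpu_ms_base, width, height, base=(1280, 1024)):
    """Frame rate limited by the slower of CPU and GPU work per frame."""
    pixel_scale = (width * height) / (base[0] * base[1])
    frame_ms = max(cpu_ms, gpu_ms_base * pixel_scale)
    return 1000.0 / frame_ms

# Hypothetical rig: CPU needs 18 ms/frame, GPU needs 12 ms at 1280 x 1024.
for res in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    print(res, round(fps(18, 12, *res), 1))
```

In this toy setup the fps stays pinned at the CPU's limit until around 1600 x 1200, after which the GPU takes over as the bottleneck, which is the point: a weak CPU hides a card's potential mostly at lower resolutions.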
It has 96 shaders, more than the 9600GT's 64 but fewer than the 8800GT's 112:
http://www.guru3d.com/article/geforce-9600-gso-386-mb-review-point-of-view/2
These benchmarks aren't that bad:
http://www.guru3d.com/article/geforce-9600-gso-386-mb-review-point-of-view/8
It seems playable up to 1600 x 1200. I'd say his Pentium E2220 is more of a bottleneck in that build than the 9600GSO is; the CPU is the weaker link for budget gaming.
Look at World in Conflict. Not too bad either:
http://www.guru3d.com/article/geforce-9600-gso-386-mb-review-point-of-view/10
50 fps at 1600 x 1200.
3DMark06 came in at 8746 in that round of tests. Note that they used an X6300 Extreme Conroe-core CPU, so the card clearly can do better when paired with a faster CPU.
He can probably overclock for better performance, or just run a higher resolution if he doesn't want to push his Allendale past the stock 2.4 GHz. Note that he didn't mention what resolution he games at, though 3DMark06's default resolution is 1280 x 1024.
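For a rough sense of what an overclock buys him, here's a back-of-envelope sketch: if a game is fully CPU-bound, fps scales at best roughly linearly with clock speed. The 50 fps baseline and 3.0 GHz target are hypothetical numbers, and real gains will be smaller once the GPU becomes the limit.

```python
# Rough upper bound on fps gain from a CPU overclock, assuming the game is
# fully CPU-bound. Baseline fps and target clock are hypothetical examples.

def cpu_bound_fps(base_fps, stock_ghz, oc_ghz):
    """Best-case fps after overclock, assuming linear scaling with clock."""
    return base_fps * oc_ghz / stock_ghz

# E2220 at its stock 2.4 GHz pushed to a hypothetical 3.0 GHz:
print(cpu_bound_fps(50, 2.4, 3.0))  # 62.5
```

So a 25% clock bump gives at most 25% more fps, and only in the CPU-bound case; at GPU-bound resolutions it does little.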