We're not really talking about 'optimizations' here; there's no framerate difference. It's just how each driver rasterizes the image.
For the most part, the rasterizers will deliver an image close to the reference, but certain games render differently on ATI and Nvidia hardware. You can see it in screenshots; some games show a remarkable difference.
Serious Sam 2 and Oblivion are examples of this. The resulting images are very different, but it's hard to say which is 'right' or 'wrong' unless you have a DirectX reference-rasterized image to compare against.
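Since eyeballing two screenshots only gets you so far, a quick per-pixel diff makes the divergence obvious. Here's a minimal sketch using Pillow, assuming you have two same-resolution captures of the exact same frame (the filenames ati_shot.png, nvidia_shot.png, and diff.png are hypothetical); the same approach works for comparing either capture against a reference-rasterizer image.

```python
# Minimal sketch: per-pixel diff of two screenshots of the same frame.
# Both images must be the same resolution for difference() to work.
from PIL import Image, ImageChops

a = Image.open("ati_shot.png").convert("RGB")
b = Image.open("nvidia_shot.png").convert("RGB")

diff = ImageChops.difference(a, b)  # absolute per-channel difference

if diff.getbbox() is None:
    # getbbox() returns None when the diff image is all black,
    # i.e. the two captures are pixel-identical.
    print("Images are pixel-identical.")
else:
    # Amplify small differences so subtle rasterization
    # discrepancies are visible to the eye.
    boosted = diff.point(lambda v: min(v * 8, 255))
    boosted.save("diff.png")
    print("Differences written to diff.png")
```

Capturing the exact same frame in a real game is the hard part; a timedemo or a static scene with the camera locked is usually the practical way to get comparable shots.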