I recently bought a 5830; it fulfils all my gaming needs and was pretty cheap for the performance.
My question is: the 5830 is pretty much just a cut-down version of the 5850, right? And the bottleneck that causes most of the performance difference, as noted in many reviews, is that the number of ROPs is cut roughly in half, cutting off a huge chunk of performance (those bastards!).
ROPs, from whatever I've read and could gather, are the back ends, a sort of pixel pipeline dealing with AA, resolution and that kind of stuff. I game/work on an ancient monitor with a max resolution of 1200x1028 (or whatever those numbers exactly are).
I don't mind reducing the resolution even further, nor am I a big fan of AA.
So at lower resolutions, would the 5850 and 5830 perform closer to each other than at higher resolutions with the same detail settings? Since there are fewer pixels to render, the bottleneck (the ROPs) shouldn't be the limiting factor anymore; rather, the stream cores should be, because they're the ones that work on image/detail quality. Or am I missing some other role ROPs play?
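For a rough sanity check on that reasoning, here's a back-of-envelope sketch comparing the pixel fill rate a given resolution/frame rate demands against each card's theoretical ROP throughput. The ROP counts and clocks are the commonly quoted specs (16 ROPs @ 800 MHz for the 5830, 32 ROPs @ 725 MHz for the 5850), and the overdraw factor is a pure assumption, so treat this as a crude estimate only:

```python
# Back-of-envelope ROP fill-rate comparison. Specs and overdraw factor
# are assumptions, not authoritative numbers.

def pixel_fill_rate_gpix(rops: int, core_mhz: int) -> float:
    """Theoretical peak, assuming one pixel per ROP per clock."""
    return rops * core_mhz * 1e6 / 1e9

def required_fill_gpix(width: int, height: int, fps: int,
                       overdraw: float = 3.0) -> float:
    """Pixels written per second, with an assumed average overdraw factor."""
    return width * height * fps * overdraw / 1e9

hd5830 = pixel_fill_rate_gpix(16, 800)   # 12.8 Gpix/s
hd5850 = pixel_fill_rate_gpix(32, 725)   # 23.2 Gpix/s

for w, h in [(1280, 1024), (1920, 1200)]:
    need = required_fill_gpix(w, h, fps=60)
    print(f"{w}x{h} @ 60 fps needs ~{need:.2f} Gpix/s; "
          f"5830 headroom {hd5830 / need:.1f}x, "
          f"5850 headroom {hd5850 / need:.1f}x")
```

By this crude measure both cards have enormous raw fill-rate headroom at 1280x1024, which suggests plain pixel output alone shouldn't separate them much at that resolution; in practice the ROPs also do blending, depth testing and AA resolves, so real games won't follow this number exactly.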
Thank you! 1280x1024... I always get confused when it comes to resolutions; I think I'm numerically dyslexic or something lol.
Anyway, yes, that was what I was talking about! So basically the entire purpose of ROPs is to throw pixels on the screen?
In that sense, graphics cards with a higher ROP count would only make sense for those who game at higher resolutions!
I'd also appreciate it if someone would shed some light on HOW exactly ROPs contribute when it comes to AA. I've searched and looked around, but Wikipedia and most sites I found have very limited information on the subject.
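From what I understand (take this with a grain of salt), with multisample AA the ROPs have to write and later resolve several colour/depth samples per pixel instead of one, so 4x MSAA roughly quadruples the sample work the ROPs do even though the shader work per pixel barely changes. A tiny sketch of that scaling (the overdraw factor is just an assumption):

```python
# Rough sketch: MSAA multiplies the samples the ROPs must write per pixel.
# The overdraw factor is an assumed average, not a measured value.

def rop_samples_per_frame(width: int, height: int,
                          msaa_samples: int = 1,
                          overdraw: float = 3.0) -> int:
    """Colour samples the ROPs process for one frame."""
    return int(width * height * msaa_samples * overdraw)

base = rop_samples_per_frame(1280, 1024)                  # no AA
msaa4 = rop_samples_per_frame(1280, 1024, msaa_samples=4)  # 4x MSAA
print(msaa4 / base)  # 4.0: 4x MSAA roughly quadruples ROP sample work
```

That's why cutting the ROP count in half hurts most exactly when high resolution and high AA levels are combined, and matters far less with AA off at a modest resolution.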
Even links would be very much appreciated. Cheers!