I used to work at Sun in the '90s and sold lots of engineering workstations for $30,000+. I've been looking for some common stats between today's GPUs and the metrics we used back in the 'dark ages' of 100 MHz CPUs and 4 MB graphics cards.
I'm trying to answer a general question: how much faster is a ~$500 card today vs. a top-shelf system from back then? We looked at vectors/sec and triangles/sec, but those don't seem to be tested or measured anymore...