Difference between 65 and 55nm core in Geforce 9800M GTS

Ultimate_Destructo

Distinguished
Jan 22, 2010
108
0
18,690
I read that the 512MB 9800M has a 65nm core, but to keep the 1024MB 9800M's TDP at 75W, Nvidia shrank the core to 55nm, since the extra memory apparently produces a fair amount of heat. This has got to make a difference for overclocking. I would assume that if heat or power were an issue, the 55nm one could have its core and shaders overclocked further than the 65nm one, while the 65nm one could have its memory overclocked further.
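The intuition about the die shrink can be sketched with the usual rough model of dynamic power, P ≈ C·V²·f: a smaller process lowers effective capacitance and core voltage, which cuts power at the same clock (or buys clock headroom at the same TDP). All the numbers below are made up for illustration, not Nvidia specs:

```python
# Very rough dynamic-power model: P (W) = C_eff * V^2 * f.
# Every constant here is a hypothetical value chosen for illustration.

def dynamic_power(c_eff, volts, freq_mhz):
    """Dynamic power of a chip, in watts, under the C*V^2*f model."""
    return c_eff * volts ** 2 * freq_mhz

# Hypothetical 65nm part: effective capacitance 0.08, 1.05 V core, 600 MHz
p_65 = dynamic_power(0.08, 1.05, 600)

# Hypothetical 55nm shrink: lower capacitance and voltage, same clock
p_55 = dynamic_power(0.07, 1.00, 600)

print(f"65nm: {p_65:.1f} W, 55nm: {p_55:.1f} W, saving {p_65 - p_55:.1f} W")
# With these made-up inputs the shrink saves roughly 11 W, in the same
# ballpark as the 10-15 W difference reported later in this thread.
```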

As for me, I am using the stock 120W PSU and can't do much overclocking, so should I just overclock the core and shaders slightly but leave the memory alone? (I have the 55nm one.)
 
LOL, that PSU is s^it, my old Pentium 1 box has a better one, and I'd be surprised if anyone really bothers to answer this thread. The only difference is power consumption, since I own both the 65nm and 55nm boards: one uses around 10-15W less than the other and has a 20MHz advantage at max clocks.
 

notty22

Distinguished


Why? xD ... Why are you trying to run a video card so far outside its recommended specifications? Why are you asking about theoretical o/c situations when you don't even have the correct PSU for normal operation?
 

Ultimate_Destructo

Distinguished
Jan 22, 2010
108
0
18,690


Lol. I undervolted my CPU, which frees up the extra power I need for minor overclocking. Also, earlier laptops in the same series used the 65nm card with the same PSU. I am also planning to buy a 150W or 180W PSU eventually to replace mine when I upgrade to the Core 2 Duo T9900, which draws 10W more, so I don't care if I'm pushing my current PSU. I also feel that laptop graphics cards can be overclocked further than desktop ones if you have proper cooling, because laptop clocks are set on the assumption that you don't.
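The budget argument above can be put as back-of-the-envelope arithmetic. Only the 120W adapter, the 75W GPU TDP, and the "10W more" for the T9900 come from this thread; the current CPU TDP, the undervolt saving, and the draw of the rest of the system are assumptions for illustration:

```python
# Rough power-budget sketch for this laptop. Figures marked "assumed"
# are illustrative guesses, not measured values.
PSU = 120                 # stock adapter (W), from the thread
GPU = 75                  # 9800M GTS TDP (W), from the thread
CPU_NOW = 25              # assumed TDP of the current CPU (W)
CPU_T9900 = CPU_NOW + 10  # T9900 draws ~10 W more, per the post
REST = 20                 # assumed screen, drives, chipset, fans (W)
UNDERVOLT = 5             # assumed saving from the CPU undervolt (W)

# Headroom today, with the undervolt applied
headroom_now = PSU - (GPU + CPU_NOW - UNDERVOLT + REST)

# Headroom after the T9900 upgrade, on the stock adapter
headroom_t9900 = PSU - (GPU + CPU_T9900 + REST)

print(f"now: {headroom_now} W spare, with T9900: {headroom_t9900} W spare")
# With these assumptions the stock adapter has only a few watts spare
# today and goes negative after the CPU upgrade, which is why a 150W or
# 180W replacement PSU makes sense.
```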