Solved

Nvidia Geforce 240 GT Overclocking

October 8, 2010 10:18:33 AM

Note: My posts irritate some people, so if you have high blood pressure, don't read on... :non: 

Hi friends,
Just two weeks ago I got a new Nvidia GeForce GT 240 (DDR3/GDDR3) with 1GB VRAM. The vendor is MSI; if you want to look it up, the model number on the box is 'MSI VN240GT MD1G'...

Now, on point...
I got it for Rs. 5500, about $120. It has been going fine with my Pentium Dual Core E5300 2.6 GHz (200*13), OC'ed to 3.43 GHz (266*13)... Its performance in the latest titles (excluding DX11 titles, since it only supports DX 10.1) at high settings up to 1280*1024 is OK (it even gets 8832 3DMark06 marks at 1280*1024), but performance falls through the floor at 1680*1050...
You guys must be wondering by now what the hell I want... so I must tell you:
1. I want to OC this GPU... if anyone has personal experience with this GPU, please share it.
2. I OC'ed it to the following settings:
Core clock from 550 MHz to 575 MHz
Shader clock from 1340 MHz to 1500 MHz (it runs up to 1600 MHz but is unstable)
Memory clock from 790*2 to 900*2 = 1800 MHz...
3. But I checked Wikipedia for the stock settings of my card; they were:
For DDR3: 1800 MHz memory clock (I OC'd to the same)
For GDDR3: 2000 MHz
For GDDR5: I don't remember, but it was over 3200 MHz
The other GPU and shader settings were the same...
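For what it's worth, the double-data-rate math behind those memory numbers can be sanity-checked in a few lines of Python (the clocks are the ones quoted above; the function name is just illustrative):

```python
# Effective DDR memory rate = base memory clock x 2 transfers per cycle.
def effective_ddr_mhz(base_mhz, pumps=2):
    """Effective transfer rate in MHz for a double-pumped memory bus."""
    return base_mhz * pumps

stock = effective_ddr_mhz(790)        # 790 * 2 = 1580 MHz effective
overclocked = effective_ddr_mhz(900)  # 900 * 2 = 1800 MHz effective

gain_pct = (overclocked - stock) / stock * 100
print(f"memory OC gain: {gain_pct:.1f}%")  # about 13.9%
```

So the 790-to-900 memory OC above is roughly a 14% bump in effective bandwidth.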

Now I want to know: what is the difference between DDR3 & GDDR3? :ouch: 
And one more thing: what are the recommended OC settings for this card? And last thing: are you fine after reading this essay...? :sol: 
Please help,
Krishan
October 8, 2010 3:38:29 PM

I have an EVGA GT 240, GDDR5 512mb...

Memory clocks are a bit confusing. If you check MSI Afterburner, though, that's generally the number to go by. GDDR5 says 3400 MHz in most spec sheets, but it's actually 1700 MHz (x2).

DDR3 and GDDR3 are basically the exact same thing, just the G means it's dedicated to the video card while the non G type is probably shared with the PC, so you often see that in Laptops. GDDR5 is just the next iteration of DDR memory. It's not much faster than 3, but it is more stable (something about resending packets until it works, while DDR3 would just get an error).

Anyway, I tried OCing it a bit but it's hard to say where it's stable because mine is a PhysX card so I could only test it with Fluid Mark. It still stresses it out a lot but it's not quite the same as when rendering.

IIRC, the core clock was ok up to 650 (linked to shader, so I don't know what the shader was at) and memory was like 1750 or something. But, you can probably do better than that. You already got the memory up really high.

Basically, each card is a bit different, so you just have to put in the time to figure your card out. Overclock it and then run FurMark to test it out. Watch the temps (GPU-Z temps, so you can see VRM and VRAM too) and make sure they don't go above 100-110C. It should run much lower though, like 80C tops. So yeah, overclocking is really a matter of what you can get the card to do. Overclock, test, repeat. If you get artifacts, lower the core speed (or increase voltage if you have that feature). If it starts to stutter a lot, or crashes, or goes to a blank/colored/grey screen (GSOD), then it's probably the memory, so drop that clock.

IMO, go in increments of 25mhz and then you can fine tune when you get instability.
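The "overclock, test, repeat" routine above can be sketched as a small loop. This is only an illustration: real clock control goes through a vendor tool (e.g. MSI Afterburner), so the stress test here is a simulated stub and all names are made up:

```python
COARSE_STEP = 25   # MHz, the coarse increment suggested above
FINE_STEP = 5      # MHz, for fine-tuning once instability appears

def find_stable_clock(start_mhz, limit_mhz, is_stable):
    """Raise the clock in coarse steps until is_stable() fails,
    then fine-tune upward from the last known-good clock."""
    good = start_mhz
    clock = start_mhz + COARSE_STEP
    while clock <= limit_mhz and is_stable(clock):
        good = clock
        clock += COARSE_STEP
    # Fine-tune between the last good clock and the failing one.
    clock = good + FINE_STEP
    while clock <= limit_mhz and is_stable(clock):
        good = clock
        clock += FINE_STEP
    return good

# Stub standing in for a real FurMark run: pretend the card
# starts artifacting above 663 MHz.
def sim_stress(clock_mhz):
    return clock_mhz <= 663

print(find_stable_clock(550, 700, sim_stress))  # prints 660
```

In practice `is_stable` is you watching FurMark for artifacts and GPU-Z for temps, not a function call, but the search pattern is the same.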
October 8, 2010 4:10:49 PM

wolfram23 said:
I have an EVGA GT 240, GDDR5 512mb...

Memory clocks are a bit confusing. If you check MSI Afterburner, though, that's generally the number to go by. GDDR5 says 3400 MHz in most spec sheets, but it's actually 1700 MHz (x2).

DDR3 and GDDR3 are basically the exact same thing, just the G means it's dedicated to the video card while the non G type is probably shared with the PC, so you often see that in Laptops. GDDR5 is just the next iteration of DDR memory. It's not much faster than 3, but it is more stable (something about resending packets until it works, while DDR3 would just get an error).

Anyway, I tried OCing it a bit but it's hard to say where it's stable because mine is a PhysX card so I could only test it with Fluid Mark. It still stresses it out a lot but it's not quite the same as when rendering.

IIRC, the core clock was ok up to 650 (linked to shader, so I don't know what the shader was at) and memory was like 1750 or something. But, you can probably do better than that. You already got the memory up really high.

Basically, each card is a bit different, so you just have to put in the time to figure your card out. Overclock it and then run FurMark to test it out. Watch the temps (GPU-Z temps, so you can see VRM and VRAM too) and make sure they don't go above 100-110C. It should run much lower though, like 80C tops. So yeah, overclocking is really a matter of what you can get the card to do. Overclock, test, repeat. If you get artifacts, lower the core speed (or increase voltage if you have that feature). If it starts to stutter a lot, or crashes, or goes to a blank/colored/grey screen (GSOD), then it's probably the memory, so drop that clock.

IMO, go in increments of 25mhz and then you can fine tune when you get instability.




Thanks for the reply.
("DDR3 and GDDR3 are basically the exact same thing, just the G means it's dedicated to the video card while the non G type is probably shared with the PC") ... does this mean my GPU uses video RAM from my system RAM (I have 2GB of system RAM)? But in the screen resolution options in the video adapter properties (Win 7) I can see it says:
Dedicated video memory: 1024 MB
Shared video memory: 761 MB
Total video memory: 1785 MB
Please clear this up?

My max temp at stock settings, 550/1340/1580 (GPU/shader/memory clocks), is 55C, and 60C at 570/1500/1800 MHz (GPU/shader/memory clocks)... You said yours is GDDR5 (1700*2), but mine is (900*2) since it's DDR3... not as high as yours. Is that enough? I will try bumping the shader from the already OC'd 1500 to 1600 and the core clock from 575 to 625 or somewhere near that, but one thing I lack is any voltage control; it's grayed out in MSI Afterburner...
Also, one thing I would like to know: does your GT 240 have an auxiliary power connector? Mine does not; it draws 75W from the PCIe bus only. Some people call this a bottleneck when OCing GPUs without auxiliary power connectors...
At what price did you grab it? I paid USD 120 for mine... isn't that too much, since on Newegg the GT 240 is available at 80-85 USD? But they don't ship to India. What a pity; shopkeepers loot us, especially in small towns... not in the big cities of India...

Plz reply thanks once again......
October 8, 2010 4:16:57 PM

Shared Video Memory: 761mb... so that much of your system RAM can get shared with the GPU.

For the voltage, have you opened the options and tried to enable voltage control? If it works, just keep in mind that 0.1V is a pretty large jump. 0.05V would be a good amount to increase if it's needed.

That said, IMO just keep pushing the core + shader till you get issues. You can certainly unlock them if you want but I don't really see much point in that. It's good to just keep them linked, at least until you start to get issues then maybe you can fine tune it a bit from there.

Best solution

October 8, 2010 9:09:04 PM

WRONG. GDDR3 is graphics-specific DDR that has specifically higher speed ratings at lower voltages and CAS rates, and is generally 128-bit per chip (basically "ultra premium" memory), whereas standard DDR is normal-spec RAM that generally runs at higher voltages, lower speeds, and higher CAS ratings, and is generally 64-bit per chip.
October 9, 2010 4:06:43 AM

hella-d said:
WRONG. GDDR3 is graphics-specific DDR that has specifically higher speed ratings at lower voltages and CAS rates, and is generally 128-bit per chip (basically "ultra premium" memory), whereas standard DDR is normal-spec RAM that generally runs at higher voltages, lower speeds, and higher CAS ratings, and is generally 64-bit per chip.



Yes, this must be it... otherwise, why are there Samsung memory chips on my GPU if it gets its memory from the system, despite being a dedicated GPU?
October 9, 2010 4:09:57 AM

wolfram23 said:
Shared Video Memory: 761mb... so that much of your system RAM can get shared with the GPU.

For the voltage, have you opened the options and tried to enable voltage control? If it works, just keep in mind that 0.1V is a pretty large jump. 0.05V would be a good amount to increase if it's needed.

That said, IMO just keep pushing the core + shader till you get issues. You can certainly unlock them if you want but I don't really see much point in that. It's good to just keep them linked, at least until you start to get issues then maybe you can fine tune it a bit from there.



I tried that too in MSI Afterburner (checked the enable voltage settings), but I am still unable to change it (voltage is still grayed out). What is the reason?
It's the main limiting factor in my case...
October 11, 2010 6:50:20 AM

Last I checked, GDDR3 is actually based off DDR2 (so only double pumped), while GDDR5 is based off DDR3 (and is quad pumped).
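On that quad-pumped reading, the per-pin transfer rates work out like this (the clocks are the ones mentioned earlier in the thread; the function is just for illustration):

```python
# Effective transfer rate = base memory clock x transfers per clock cycle.
# GDDR3 (DDR2-based) moves 2 transfers per clock; GDDR5 moves 4.
def effective_rate_mhz(base_mhz, transfers_per_clock):
    return base_mhz * transfers_per_clock

ddr3_gt240 = effective_rate_mhz(900, 2)   # the OP's card OC'd to 900 MHz
gddr5_gt240 = effective_rate_mhz(850, 4)  # a GDDR5 card at an 850 MHz base

print(ddr3_gt240)   # 1800
print(gddr5_gt240)  # 3400, matching the spec-sheet figure quoted earlier
```

That would explain how GDDR5 spec sheets reach 3400 MHz effective from a much lower base clock.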

April 26, 2011 6:13:53 AM

Best answer selected by khicharkumar.
April 28, 2011 12:31:56 AM

hella-d said:
WRONG. GDDR3 is graphics-specific DDR that has specifically higher speed ratings at lower voltages and CAS rates, and is generally 128-bit per chip (basically "ultra premium" memory), whereas standard DDR is normal-spec RAM that generally runs at higher voltages, lower speeds, and higher CAS ratings, and is generally 64-bit per chip.


:) 

BTW, I've updated my machine:

AMD Phenom II X3 740 (stock 3 GHz, with 4th core unlocked) on MSI 790-G45 @ 3.4 GHz | 6GB DDR2-800 CAS4 | GTX 460 SE 1GB @ 850/1700/1.9, plus a 9600GSO @ 233/966/800 for PhysX & CUDA | dual RAID-0 10K Raptors + 500GB WD Green as supplemental storage | dual monitors: Dell 20" S-IPS panel & 19" HDTV | Windows 7 Professional 64-bit

P.S. (LOL) Does anyone want a ThinkPad T43? For sale (pics and vid on my Facebook): Hella.Dizzo
