Just bought an ECS 9800 GTX+ and GPU-Z says it has a 65nm core. WTF?
Idle temperatures are usually ~50 °C, but under load (Crysis GPU test) they climb to just under 80 °C, which is HIGHER than the original 9800 GTX!
And yes, I have a properly cooled system.
GPU-Z info:
Name: NVIDIA GeForce 9800 GTX/9800 GTX+
GPU: G92
Revision: A2
Technology: 65nm
Die Size: 330mm(2)
Release Date: Apr 01, 2007
Transistors: 754M
BIOS Version: 62.92.62.00.60
Device ID: 10DE - 0612
Subvendor: Elitegroup (1019)
ROPs: 16
Bus Interface: PCI-E x16 @ x16
Shaders: 128 Unified
DirectX Support: 10.0/SM4.0
Pixel Fillrate: 11.8 GPixel/s
Texture Fillrate: 47.4 GTexel/s
Memory Type: GDDR3
Bus Width: 256 Bit
Memory Size: 512 MB
Bandwidth: 70.4 GB/s
Driver Version: nv4_disp 6.14.11.7783 (ForceWare 177.83) / XP
GPU Clock: 740 MHz - Memory: 1100 MHz - Shader: 1836 MHz
NVIDIA SLI: Disabled
Is this 65 or 55nm?
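For what it's worth, the derived numbers in that GPU-Z dump are internally consistent with the clocks it reports, so at least the readout isn't garbage. A quick sanity-check sketch (the TMU count of 64 is my assumption for G92, it isn't shown in the dump):

```python
# Recompute GPU-Z's derived figures from the reported clocks and unit counts.

ROPS = 16          # render output units (from the GPU-Z dump)
TMUS = 64          # texture units; 64 for G92 is an assumption, not in the dump
CORE_MHZ = 740     # GPU clock (from the dump)
MEM_MHZ = 1100     # memory clock; GDDR3 transfers twice per clock
BUS_BITS = 256     # memory bus width (from the dump)

pixel_fill = ROPS * CORE_MHZ / 1000               # GPixel/s
texture_fill = TMUS * CORE_MHZ / 1000             # GTexel/s
bandwidth = MEM_MHZ * 2 * (BUS_BITS // 8) / 1000  # GB/s (DDR, bits -> bytes)

print(f"Pixel fillrate:   {pixel_fill:.1f} GPixel/s")    # 11.8, matches GPU-Z
print(f"Texture fillrate: {texture_fill:.1f} GTexel/s")  # 47.4, matches GPU-Z
print(f"Bandwidth:        {bandwidth:.1f} GB/s")         # 70.4, matches GPU-Z
```

So the tool is reading the clocks correctly; the 65nm-vs-55nm question comes down to what silicon is actually on the card, not a calculation error.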