Hi,
Normally I would check the total wattage of a power supply to determine whether it's enough to power a PC, including the most power-hungry components, the CPU and GPU.
I've been reading online and I see posts talking about the amps on the 12V rail (I'm assuming the 12V rail is what powers the CPU and GPU) and whether it's enough to supply the GPU. I don't really understand all this.
I'm using an Antec EA750W, and let's say I get a GTX 770, which according to Nvidia requires a minimum recommended system power of 650W, with a card power of 230W:
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-770/specifications
But there's no mention of amps anywhere.
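From what I've gathered, watts and amps are related by power = voltage × current, so amps on the 12V rail would just be watts divided by 12. Here's a quick sketch of that math (a simplification on my part; it assumes the card pulls its full 230W from the 12V rail, when in reality a small portion can come from the slot's 3.3V supply):

```python
# Convert a wattage figure to amps on the 12V rail.
# Based on P = V * I, so I = P / V with V = 12.
def amps_on_12v(watts):
    return watts / 12.0

# GTX 770 card power (230W) drawn entirely from 12V (an assumption):
print(round(amps_on_12v(230), 1))  # ~19.2 A

# Nvidia's 650W whole-system recommendation, if it were all on 12V:
print(round(amps_on_12v(650), 1))  # ~54.2 A
```

So if that's right, the PSU label's "amps on +12V" number just needs to comfortably exceed what the card (plus CPU) would draw at 12V. Is that how people are using it?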
I just need a better understanding of this; I'm no computer scientist.
Also, what about single-rail vs. multi-rail? Should I be worrying about that too?