I know this is a dead horse in many ways, but just for my own clarity (and sanity):
The Radeon HD 5850 requires 75 W per PCI-E power connection.
(It's assumed you understand this load belongs on the 12 V rail.)
75 W / 12 V = 6.25 A
So with 2 PCI-E connections to 1 HD 5850, that means:
6.25 A x 2 (for both connections to the one HD 5850) = 12.5 A (150 W).
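The arithmetic above can be sketched quickly (the 75 W-per-connector figure is the PCI-E 6-pin spec limit, not a measured draw; note the card can also pull up to 75 W from the PCIe slot itself, which the connector math alone doesn't count):

```python
# Per-connector power budget (6-pin PCI-E spec limit, not a measured draw)
watts_per_connector = 75.0
rail_voltage = 12.0  # PCI-E power connectors are fed from the 12 V rail

amps_per_connector = watts_per_connector / rail_voltage  # 75 / 12 = 6.25 A
connectors = 2  # two 6-pin connectors on one HD 5850

total_amps = amps_per_connector * connectors    # 12.5 A through the connectors
total_watts = watts_per_connector * connectors  # 150 W through the connectors

print(amps_per_connector, total_amps, total_watts)
```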
01.) So one HD 5850 only really consumes (or at least is rated by ATI to need) 12.5 amps through its connectors?
02.) If that's true, then why are there people saying it needs 26 amps (some saying as high as 40 A)? Is this just a peak thing?
My whole reason for asking is that I have the 18 A per 12 V rail kind of setup, and feeding the HD 5850 from two of those 18 A rails would give me 36 A, more than enough, right?
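A quick sanity check on that rail math (assuming the two 18 A rails really can be combined across the card's two connectors, one rail per connector; the 12.5 A draw is the connector-only figure from the post, not total card power):

```python
rail_amps = 18.0   # per-rail 12 V rating from the post
rails_used = 2     # one rail feeding each PCI-E connector

available_amps = rail_amps * rails_used  # 36 A combined across both rails

card_connector_draw = 12.5  # A, from the 2 x 75 W connector math above
headroom = available_amps - card_connector_draw  # amps to spare

print(available_amps, headroom)
```

On paper that leaves plenty of headroom, though whether both rails can actually be loaded independently depends on how the PSU wires its connectors to its rails.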
If I'm understanding this correctly, thank God, I'll stop beating this horse; otherwise, shoot me.