1x 6pin to 8pin adapter question

dfk

For a GPU that requires 2x 8-pin, and a Seasonic S12II 620W that only has 1x 8-pin and 1x 6-pin PCIe connector, can you use the 8-pin + 6-pin straight, without an adapter?

I know this would mean the GPU only gets 75W from the 6-pin, so is that OK, just with the card in a reduced performance mode?

If using a 1x 6-pin to 8-pin adapter, do the extra pins carry an actual GND, or is the adapter just there to trick the GPU into drawing 150W? And how do I know whether the wire gauge on that PSU cable/adapter can handle that draw?

6-pin:
+12V    GND
+12V    Sense
+12V    GND

8-pin:
+12V    GND
+12V    Sense
+12V    GND
Sense   GND

It seems the 8-pin only adds a GND and a sense pin, so does that mean the 6-pin is able to deliver the same power as the 8-pin?
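To put rough numbers on that, here is a hedged back-of-the-envelope sketch. The 12V rail, the even split across three +12V wires, and the ~8A per-contact figure (a typical Mini-Fit Jr style rating) are assumptions, not something from this thread or your PSU's datasheet:

```python
# Rough per-pin current at the PCIe connector power ratings.
# Assumptions: 12V rail, load split evenly over three +12V wires,
# ~8A per contact as a typical Mini-Fit Jr style rating (check your hardware).
VOLTAGE = 12.0          # volts
PINS_12V = 3            # both 6-pin and 8-pin PCIe carry +12V on three pins
ASSUMED_PIN_RATING = 8  # amps per contact (assumed)

for label, watts in [("6-pin (75W rating)", 75), ("8-pin (150W rating)", 150)]:
    total_amps = watts / VOLTAGE
    per_pin = total_amps / PINS_12V
    print(f"{label}: {total_amps:.2f} A total, {per_pin:.2f} A per +12V pin "
          f"(assumed contact rating {ASSUMED_PIN_RATING} A)")
```

On those assumptions, both ratings sit well under the per-contact limit; since the 8-pin's extra two pins are ground/sense, the higher 150W figure is mostly a matter of the spec and margin rather than extra +12V conductors.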

 
Solution
You need all the pins filled in; you can't leave any empty. The problem with those 6-pin to 8-pin adapters is that wires get split, which means there is a faster current on the primary "root" wires (before the split), which can cause them to get too hot. What is your GPU, anyway?
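For a rough sense of what that concern amounts to in numbers, here is a hedged sketch: if an adapter lets the card treat the PSU's single 6-pin cable as an 8-pin, the same three +12V wires sized around a 75W rating may be asked for up to 150W, and resistive heating in a wire scales with the square of the current (P = I²R). The even split across the wires is an assumption:

```python
# Current per +12V wire when the same 6-pin cable is asked for 75W vs 150W,
# and the relative resistive heating (P = I^2 * R scales with current squared).
VOLTAGE = 12.0
WIRES_12V = 3  # the PSU-side 6-pin still only has three +12V wires

def amps_per_wire(watts):
    return watts / VOLTAGE / WIRES_12V

i_native = amps_per_wire(75)    # card treats the cable as a 6-pin
i_adapted = amps_per_wire(150)  # adapter lets the card treat it as an 8-pin

print(f"per-wire current: {i_native:.2f} A -> {i_adapted:.2f} A")
print(f"relative heating in each wire: x{(i_adapted / i_native) ** 2:.1f}")
```

Doubling the current quadruples the heat dissipated in each wire, which is the real worry behind the "split wires" wording.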
 


'Faster current'? wtf is that.
 

dfk

GPU is a 980 Ti.

If so, which would be the preferable option?
1. dual Molex to 8-pin adapter
2. 1x 6-pin to 8-pin adapter

Is it not safe to use the 8-pin + 6-pin straight from the PSU without an adapter? The 6-pin should be backward compatible; it's just that the card will only draw 75W from it since it's a 6-pin, right?
 


I only think it's proper to say "faster" instead of "more" since current is a rate. Kind of like "faster velocity" instead of "more velocity".

Though actually I should have used the term "higher" since current is also dependent on wire thickness.
 
No, current is not dependent on wire thickness (strictly it is, but only very marginally, since the wires can be treated as contributing negligible resistance in this case); current density is what depends on wire thickness.
Why do you think it is proper to go against the way every electrical engineer would say it?
Higher current, greater current, more current: all acceptable. "Faster" might be acceptable in an AC situation if there is a change in frequency, although simply saying higher or lower frequency is more correct.
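As a side note on that distinction, here is a hedged sketch of current density (J = I / A). The 18 AWG gauge is an assumption about a typical PSU PCIe cable, not something stated in this thread:

```python
# Current density J = I / A: the same current in a thinner wire gives a
# higher current density (and more heating per unit volume of copper).
AWG18_AREA_MM2 = 0.823   # cross-section of 18 AWG copper (assumed gauge)
AWG20_AREA_MM2 = 0.518   # thinner 20 AWG, for comparison

current_a = 150 / 12.0 / 3   # ~4.17 A per +12V wire at the 8-pin rating

for label, area in [("18 AWG", AWG18_AREA_MM2), ("20 AWG", AWG20_AREA_MM2)]:
    print(f"{label}: J = {current_a / area:.1f} A/mm^2 at {current_a:.2f} A")
```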
 

Zerk2012



With that power supply I would use either of the options you listed and not think twice.
 


I'm not saying that current is dependent on wire thickness; I'm saying that the velocity of charge flow is dependent on the current plus the wire thickness. I worded it a bit wrong initially. More current doesn't necessarily mean faster-moving charge.

A ton of electrical engineers say a ton of stuff wrong (oh, here I go). They say current flows instead of charge flows. They say a ton of stuff that gets even themselves confused half the time. They say stupid things like "discharging capacitors" when the discharged capacitors still have the same amount of charge as the charged ones (rather than saying something like de-energizing the capacitor). And they throw around the word "electricity" like it actually means something. A lot of them don't even understand how transistors work, even though they understand the mathematics and how to configure them.

If the current in a river started speeding up, would you say there is more or faster current? I'd say faster.

Edit: I'm wrong, the current in the river doesn't speed up, the water does. Well, I guess it could go either way, but why do people love to talk about current and hate to talk about water?
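To put a hedged number on that side debate: the drift velocity of charge in a wire is v_d = I / (n·q·A), so more current in the same wire does mean faster-drifting charge, just at tiny speeds. The 18 AWG area and the per-wire currents below are assumptions carried over from the PCIe figures above:

```python
# Electron drift velocity v_d = I / (n * q * A) in a copper conductor:
# more current in the same wire means faster-drifting charge, but the
# speeds involved are fractions of a millimetre per second.
N_COPPER = 8.5e28        # free electrons per m^3 in copper
Q_ELECTRON = 1.602e-19   # elementary charge, coulombs
AREA_18AWG = 0.823e-6    # m^2, assumed 18 AWG conductor

for current in (2.08, 4.17):  # ~per-wire amps at the 75W and 150W ratings
    v_d = current / (N_COPPER * Q_ELECTRON * AREA_18AWG)
    print(f"I = {current:.2f} A -> drift velocity ~{v_d * 1000:.2f} mm/s")
```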
 


The S12II 620W?
 

dfk

Distinguished
Dec 23, 2013
341
0
18,860


ok thanks.

I just want to understand, for curiosity's sake, why it would not be OK to use the 8-pin + 6-pin straight, since the 2 extra pins are GND and the 6-pin should still let the card draw up to 75W, just 75W less than an 8-pin.

It will be getting 150W from the main 8-pin as well as 75W from the motherboard PCIe slot. The second 6-pin then gives 75W, so a total of 300W is available to the card.

Using a 6-pin to 8-pin adapter would mean the GPU senses it can draw 150W from that second 6-pin (now 8-pin), bringing it up to 375W available. But is it really safe with this adapter trick?
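As a quick sanity check on that arithmetic, here is a hedged sketch; the 75W/150W figures are the spec ratings already mentioned, and the ~250W figure is the ballpark given later in the thread for a 980 Ti under gaming load:

```python
# Power budget available to the card vs. what it is told it can draw.
slot = 75         # motherboard PCIe slot rating
eight_pin = 150   # native 8-pin from the PSU
six_pin = 75      # native 6-pin from the PSU
adapted = 150     # what the card *thinks* the adapted 6-pin can supply

print("straight 8+6 :", slot + eight_pin + six_pin, "W")   # 300 W
print("with adapter :", slot + eight_pin + adapted, "W")   # 375 W (card's view)
print("typical 980 Ti gaming draw: ~250 W")
```

Note the adapter only changes what the card believes is available; the PSU-side 6-pin wiring is physically the same either way.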
 


Really, any wire can theoretically carry a given value of current; it's all just about how much it heats up and whether it would (hopefully not) burn up. Higher power through those wires means more heat. Increased heat in the wires adds resistance and can cause the voltages to drop some.

In reality, the 980 Ti is about a 250W card, and some of that energy comes from the PCIe slot on the motherboard: 75W from the slot plus 150W from each 8-pin cable makes a theoretical 375W of capability, which the 980 Ti will not come near. Using a 6-pin to 8-pin converter cable, I can't say for sure how hot it will get; I guess that would depend on how the 980 Ti distributes its draw across its different input wires.

I'd say you'd probably be fine. Those 75W and 150W ratings can also be taken with a grain of salt; they are not hard cutoffs, as if the whole world were going to flip upside down the moment power on that one cable reaches 151W.
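To put a rough, hedged number on the heating and voltage-drop point, here is a sketch assuming about 0.5m of 18 AWG copper per conductor (≈0.021 Ω per metre); none of these figures come from the thread, and real cables, crimps, and connector contacts add resistance not modelled here:

```python
# Rough I^2*R dissipation and voltage drop in one +12V conductor,
# assuming ~0.5 m of 18 AWG copper (~0.021 ohm/m).
OHMS_PER_M = 0.021
LENGTH_M = 0.5
R = OHMS_PER_M * LENGTH_M   # ~0.0105 ohm per conductor

for label, amps in [("6-pin share (75W)", 75 / 12 / 3),
                    ("8-pin share (150W)", 150 / 12 / 3)]:
    heat_w = amps ** 2 * R
    drop_v = amps * R
    print(f"{label}: {heat_w:.2f} W dissipated, {drop_v:.3f} V drop per wire")
```

On those assumptions the per-wire losses stay small, which is consistent with "you'd probably be fine", but a poorly made adapter with thin wires or bad crimps changes the picture.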
 

Zerk2012



No problem, you're fine, believe me. For just gaming it draws in the 250-watt area.
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-7.html
 
Solution

dfk



:D So theoretically, if I can't get the adapter soon, I can still run the 8-pin + 6-pin from my PSU and provide up to 300W to the 250W 980 Ti, which should be enough as long as I'm not doing heavy stress like benchmarking or overclocking, right?