How to increase voltage to a USB port

Sean_34

Hey, so when I want to use my front I/O array to charge my phone, it is painfully slow. Is there any way to see the default voltage/amperage of a USB port on your system and then increase it?
 

MasterWigu

No, that is defined by the hardware; you can't change it. Some motherboard manufacturers release programs to increase charging speeds, so see if yours has one for your mobo.
 

USAFRet



Buy a dedicated charger that plugs into the wall.
 

Sean_34

I obviously own wall chargers, but that's not what my question pertains to. Also, if it were "defined" by the hardware, then you wouldn't have Android chargers that charge at 3 A, at 2 A, or those awful ones at 1.2 A. USB is simply an array of contacts: a connector made up of two power and two data contacts soldered onto copper wire. It is the source that regulates power delivery to a load (source, power, load: the three basic needs of an electric circuit). I could apply 15 V DC or 115 V AC through a simple USB cable; ignoring the fact that the latter would probably melt the thin conductors, the point remains that nothing about the connector itself decides any voltage or amperage.

So, again, is there a way to tell my mobo to increase the output voltage to a USB port on my front I/O?

I'd imagine it's similar to going into the BIOS and increasing the voltage multipliers for your processor and GPU.
 
As it's an external port, you could just unplug the power and ground wires from the board and wire them to the red 5 V and black ground wires on any Molex connector, for amperage limited only by when the wires catch fire.

The USB 2.0 spec is only 5 V at 0.1 A (0.5 W), which the device can negotiate up through enumeration to 0.5 A (2.5 W). Dedicated Charging Port (DCP) phone chargers rely on either a short between the D+ and D- lines (Android) or fixed voltages on those lines (Apple, because they want to be different) as the signal that the device may increase its draw to 1.5 A or even more. That's why phone/tablet chargers are either Apple- or Android-compatible.
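
As a minimal sketch of that D+/D- trick (purely illustrative; real detection happens in the phone's charger IC, and the thresholds and current figures below are simplified assumptions, not exact BC 1.2 values):

```
# Toy model of how a device might classify a USB port from the D+/D- lines.
def classify_port(dp_shorted_to_dm: bool, dp_volts: float, dm_volts: float) -> str:
    if dp_shorted_to_dm:
        # Dedicated Charging Port (Android-style): D+ shorted to D-
        return "DCP: draw up to 1.5 A (or more) without enumerating"
    if dp_volts > 0.5 and dm_volts > 0.5:
        # Apple-style charger: fixed divider voltages on the data lines
        return "Apple charger: 1 A or 2.1 A depending on the divider values"
    # Plain data port: must enumerate and negotiate
    return "Standard port: 0.1 A default, up to 0.5 A after negotiation"

print(classify_port(True, 0.0, 0.0))    # a typical Android wall charger
print(classify_port(False, 2.0, 2.7))   # a typical Apple-style charger
print(classify_port(False, 0.0, 0.0))   # a normal PC USB 2.0 data port
```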

BTW, the USB 3.0 spec is 0.15 A to 0.9 A, so a little better.
 

USAFRet



You'd imagine incorrectly.
 
You probably don't want to increase the voltage, although USB-charged devices do seem to be incredibly robust.

The new USB Power Delivery spec will probably only ever be seen in USB 3.1 and requires compatible devices on both ends plus a special cable, but it can supply up to 20 V at 5 A for 100 W charging of laptops.
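
To put the power budgets from this thread side by side (just multiplying the volts and amps already quoted):

```
# Power budgets of the USB modes mentioned in this thread (P = V * I).
profiles = {
    "USB 2.0 default (5 V, 0.1 A)":    5.0 * 0.1,
    "USB 2.0 negotiated (5 V, 0.5 A)": 5.0 * 0.5,
    "USB 3.0 negotiated (5 V, 0.9 A)": 5.0 * 0.9,
    "DCP charger (5 V, 1.5 A)":        5.0 * 1.5,
    "USB PD maximum (20 V, 5 A)":      20.0 * 5.0,
}
for name, power in profiles.items():
    print(f"{name}: {power:g} W")
```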

If you were going to modify your powered device anyway, I suppose it could be made to rectify 120 V AC, like this custom Power-over-Ethernet cable.

Reminds me of this fellow who made a number of such adapters.
 


The USB specification is not defined by the hardware; the hardware is defined by the USB specification. The whole idea of a specification is to provide a standard that everybody can use to ensure interoperability. What would happen if you increased the voltage on YOUR USB port, then plugged in a device expecting only the standard? It would create a dangerous situation, and if the device were damaged, who would be liable for fixing it?
 
Actually, I thought it was well known that virtually no USB ports in the real world follow the spec:

What Your Mom Didn't Tell You About USB
With any standard, it's interesting to see how actual practice diverges from the printed spec or how undefined parts of the spec take shape. Though USB is, with little doubt, one of the best thought out, reliable, and useful standards efforts in quite some time, it has not been immune to the impact of the real world. Some observed USB characteristics that may not be obvious, yet can influence power designs, are:
•USB ports do NOT limit current. Though the USB spec provides details about how much current a USB port must supply, there are mile-wide limits on how much it might supply. Though the upper limit specifies that the current never exceed 5 A, a wise designer should not rely on that. In any case, a USB port can never be counted on to limit its output current to 500 mA, or any amount near that. In fact, output current from a port often exceeds several amps, since multi-port systems (like PCs) frequently have only one protection device for all ports in the system.
That said, increasing the voltage would probably be a bad idea and difficult to implement, as it would have to be either boosted up from 5 V or bucked down from 12 V with a voltage regulator.
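
To put rough numbers on that: for idealized converters, a buck's output is V_out = D × V_in and a boost's is V_out = V_in / (1 − D), where D is the duty cycle. A sketch, ignoring all losses:

```
# Idealized duty cycles a regulator would need to make a hotter USB rail
# from the voltages actually present in a PC (losses ignored).
def buck_duty(v_in: float, v_out: float) -> float:
    """Ideal buck converter: V_out = D * V_in."""
    return v_out / v_in

def boost_duty(v_in: float, v_out: float) -> float:
    """Ideal boost converter: V_out = V_in / (1 - D)."""
    return 1 - v_in / v_out

print(f"12 V -> 5.5 V buck:  D = {buck_duty(12.0, 5.5):.2f}")   # ~0.46
print(f"5 V  -> 5.5 V boost: D = {boost_duty(5.0, 5.5):.2f}")   # ~0.09
```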

The only BIOS options I've ever seen for USB are to keep the ports powered when the computer is off, to toggle them between SuperSpeed, Hi-Speed, and Full-Speed, and to emulate legacy PS/2 keyboards and mice.
 
Solution

Sean_34



Exactly what I was trying to get at: the form factor of the connector doesn't force a certain amount of power to flow, so there are, in fact, "mile-wide" limits. I was just hoping the regulator on the motherboard that monitors that power was fitted with a way to alter it. Your point about having to step the voltage up or down makes sense, and I always forget the formulas for converting amps and volts and such, so I'm not prepared to advocate doing it.
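
For reference, the conversions in question are just the power law, P = V × I; a quick sanity check with the figures from this thread:

```
# P = V * I: the conversion between volts, amps, and watts.
def watts(volts: float, amps: float) -> float:
    return volts * amps

print(watts(5.0, 0.5))   # 2.5  -- a stock USB 2.0 port
print(watts(5.0, 2.0))   # 10.0 -- a typical 2 A wall charger
print(watts(5.0, 3.0))   # 15.0 -- the 3 A charger mentioned earlier
```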

Thanks for the reply.
 

Sean_34



Now, that's a lot more than I was planning on, but you've piqued my interest, and I've been looking for a reason to buy a soldering iron, haha. It seems he did this to power a Raspberry Pi, though, or perhaps I'm misreading.

Wouldn't doing this draw the 5 V and the full 24 A without any regulation? It's the voltage that charges a battery, but I feel like that much amperage would do some damage, even if I can't remember exactly how at this point.
 
Some USB ports are said to be specifically for (fast?) charging, but I don't know whether they do it by increasing voltage, amperage, or both.

A slight increase of voltage should be OK for your other standard devices; I'm thinking less than 0.2 V. An increase in amperage, though, is governed by the physical hardware: as some of you know, carrying more current (amps) requires a beefier cable (or trace). I'd say software can increase voltage but not amperage.

I know, lots of mumbo-jumbo, but hey, if it were so obvious you wouldn't have to ask, right?
 
It's the device that determines how many amps are drawn. Just because your car battery can supply "1000 cranking amps" (at 32°F) does not mean either the starter or the car ever actually uses 1000 A; either would catch fire if it did for any length of time. That's why dropping a wrench across the terminals is so dangerous: the dumb device (the wrench, a dead short) actually will pull something like 1000 A, which is enough to weld it to the terminals! Then the battery may explode.
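
In Ohm's-law terms, I = V / R: the load's resistance, not the source's rating, sets the current. A sketch (the resistances here are illustrative guesses, not measured values):

```
# The load sets the current drawn from a source: I = V / R.
def amps(volts: float, load_ohms: float) -> float:
    return volts / load_ohms

print(amps(12.6, 0.05))    # ~250 A   -- a cranking starter motor (assumed R)
print(amps(12.6, 0.001))   # ~12600 A -- a wrench as a dead short (assumed R);
                           # in practice the battery's internal resistance
                           # caps this near its cranking rating
```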

The USB cable is made of ~28 AWG wire, so it would melt like a fusible link if you tried to pull more than 10 A through it for more than a few seconds. The horrendous voltage drop means a good power supply may not even notice as the insulation burns.
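
Rough numbers for that, assuming a 1 m cable (0.213 Ω/m is the published resistance of 28 AWG copper):

```
# Voltage drop and heating in a 1 m USB cable with 28 AWG conductors.
OHMS_PER_METER_28AWG = 0.213                    # 28 AWG copper
round_trip = 2 * 1.0 * OHMS_PER_METER_28AWG     # power + ground, 1 m each

for current in (0.5, 2.0, 10.0):
    drop = current * round_trip                 # V = I * R
    heat = current ** 2 * round_trip            # P = I^2 * R
    print(f"{current:>4} A: {drop:.2f} V dropped, {heat:.1f} W heating the wire")

# At 10 A that's ~4.3 V of drop and ~43 W dissipated in the cable itself --
# the fusible-link behavior described above.
```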

If you actually read my link to the tests of cheap USB chargers, you'll note the counterfeit iPhone charger puts out a very noisy 6 V, and yet, unlike the stories about dead people, we do not hear about an epidemic of dead iPhones; so yes, the devices are probably robust enough to handle some extra voltage. The question is where you are going to easily get that kind of voltage in a PC.

In the olden days, PSUs had pots inside that you could adjust to, say, raise the 5 V line for all attached devices. You can still put a resistor on the sense wire of the VRM IC. But even if you cranked it up to 5.5 V, that would only charge your phone 10% faster. Is the risk to all your other devices worth the benefit?
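
That 10% figure is just the voltage ratio, since at the same charge current the power scales with the voltage. A quick check, assuming the phone draws the same current either way:

```
# Payoff of raising the 5 V rail to 5.5 V, assuming a fixed charge current.
v_stock, v_modded, current = 5.0, 5.5, 2.0
speedup = (v_modded * current) / (v_stock * current) - 1
print(f"{speedup * 100:.0f}% more charge power")   # 10% -- the figure above
```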