LED voltage versus current question, should be easy

Grasping DC is taking me a bit of time, old dogs learning new tricks and all.

I have a 24v supply going to a boost converter which brings the voltage up to 30v, which then goes to an LED.

My question is: does the wattage of the PSU matter if it is too HIGH?
If I put a 1000W PSU on the end, does that matter? Because LEDs don't have resistance, it is kind of confusing to me.

I don't think it would work like that.... the LED only takes the amps it needs, right? Just like every other electronic device out there?

 

TJ Hooker

Titan
Ambassador

No, you need some sort of current limiting component (e.g. a resistor in series). Although some LED products may have something like that built in, I don't know. LEDs (and diodes in general) have an exponential voltage-current relationship. As voltage increases past the turn-on voltage, current will increase rapidly and burn out the LED.

To calculate the required resistor value: take the voltage you will be supplying to the diode and subtract the rated turn-on voltage of the diode. Take that value and the current rating of the diode and plug them into R = V/I to get the required resistance. Make sure the resistor is rated to dissipate enough power (P = IV).
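For example, plugging some made-up numbers into those two formulas - a 5v supply driving a small 3.2v / 20mA indicator LED, nothing to do with the LEDs in this thread - here's a rough sketch in Python:

```python
# Rough sketch of the resistor sizing above. The supply voltage and the
# LED ratings are made-up illustration numbers, not specs from this thread.
supply_v = 5.0   # volts supplied to the LED branch (assumed)
led_vf = 3.2     # rated turn-on / forward voltage of the LED (assumed)
led_i = 0.020    # rated LED current, 20 mA (assumed)

r = (supply_v - led_vf) / led_i    # R = V / I for the series resistor
p_r = (supply_v - led_vf) * led_i  # P = I * V dissipated in the resistor

print(f"series resistor: {r:.0f} ohm")         # 90 ohm
print(f"resistor power: {p_r * 1000:.0f} mW")  # 36 mW
```

Swap in your own supply voltage and LED ratings to size your own resistor.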

What's the purpose of the boost converter?

Having a higher wattage PSU than required won't hurt anything.
 


OK, let's give the whole picture.

I have a 360W PSU that can push 24v; it has adjustment and I can get it to push 30v. I want to use it to power three 100W LEDs. At 32v the LEDs use around 60W of power each, are a good brightness, and aren't so hot they melt.

This is where my boosters come in, bringing the 30v up to 32v or so.

I will be putting a resistor on my LEDs, but only because everything tells me I need to.

If the resistor only adds 5 ohms, then how much current is it really reducing? I guess that is kind of what I'm trying to work out. If I run the LEDs at constant voltage, then like you say, the current should also be constant. It's only when voltage is increased that it goes up...

But if we figure my 360W 24V PSU, it must push 15 amps, or be capable of it. But that many amps isn't what goes through the LED - like I said, at 32v the LED uses 60W. So it can't be using even 2 amps.
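(Sanity-checking my own arithmetic with P = V * I, treating the LED as a plain 60W load at 32v - which I know isn't the whole story:)

```python
# Quick check of those numbers, treating the LED as a simple 60 W load at 32 V.
psu_watts, psu_volts = 360, 24
led_watts, led_volts = 60, 32

print(psu_watts / psu_volts)  # 15.0 A - what the PSU is *capable* of pushing
print(led_watts / led_volts)  # 1.875 A - what one LED actually draws at 32v
```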

So that is what I don't get, I guess! If the amps on the power supply don't matter - and, as you say, the wattage doesn't matter - then why do I need to reduce the current for each LED? It isn't a big deal - I'm more trying to understand why. Even if the LED was set to 45v and immediately melted itself, what good would a resistor do? It only adds a few ohms to the circuit.

This is all pretty confusing, I should have taken sparky classes in college.


 

TJ Hooker

Titan
Ambassador

The voltage across an LED isn't actually constant. What I said is that a small change in voltage will correspond to a large change in current due to the exponential relationship. This means that you can usually get away with assuming a constant voltage, especially for rough calculations. Without knowing the characteristics of the LEDs you're working with, I can't really say how much a 5 ohm resistor would decrease current. Do you have a link to a spec sheet or something?

But if we figure my 360W 24V PSU, it must push 15 amps, or be capable of it. But that many amps isn't what goes through the LED - like I said, at 32v the LED uses 60W. So it can't be using even 2 amps.

So that is what I don't get, I guess! If the amps on the power supply don't matter - and, as you say, the wattage doesn't matter - then why do I need to reduce the current for each LED? It isn't a big deal - I'm more trying to understand why. Even if the LED was set to 45v and immediately melted itself, what good would a resistor do? It only adds a few ohms to the circuit.
When I said PSU wattage didn't matter, I just meant having more wattage than you need isn't a bad thing. Same as if you were buying a PC PSU.
You need to limit current because of how sensitive the diode current is to changes in voltage. If your output voltage is a little higher than you think it is (or your diode's turn-on voltage is a little lower than you think it is due to random variations in manufacturing), current (and therefore power) can increase substantially. Although if your LEDs are rated for 30V, then I'm pretty sure they must consist of many LEDs in series, which I think would reduce the risk somewhat.
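To put rough numbers on that sensitivity, here's a sketch using an idealized exponential diode model. The parameters below are made up purely to show the shape of the curve - they are not the characteristics of your COBs:

```python
import math

# Idealized exponential diode model, I = I_S * exp(V / N_VT). N_VT and I_S
# are assumed values, chosen only so this example LED draws 20 mA at 3.0 V.
N_VT = 0.052                        # ideality factor * thermal voltage (assumed)
I_S = 0.020 / math.exp(3.0 / N_VT)  # fitted so that I(3.0 V) = 20 mA

def led_current(v):
    return I_S * math.exp(v / N_VT)

for v in (3.00, 3.05, 3.10, 3.20):
    print(f"{v:.2f} V -> {led_current(v) * 1000:7.1f} mA")
# Each extra 50 mV multiplies the current by about 2.6x; by 3.2 V this
# "20 mA" LED is drawing close to an amp.
```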
 
You need to limit current because of how sensitive the diode current is to changes in voltage. If your output voltage is a little higher than you think it is (or your diode's turn-on voltage is a little lower than you think it is due to random variations in manufacturing), current (and therefore power) can increase substantially. Although if your LEDs are rated for 30V, then I'm pretty sure they must consist of many LEDs in series, which I think would reduce the risk somewhat.

They are 100W COBs, so yes, many diodes in series. But from what I've read, actually running the things at 100W would mean instant death. I have confirmed that at 32v they use 60W.
I don't have the resistors in front of me, and I didn't use them during testing.

Considering the LEDs and the constant voltage, I'm probably rather safe. If anything went wrong it would mean a DROP in voltage, not an increase.

But it seems like the idea isn't that the resistor conditions the current for the LED. The resistor is there so that if something weird happens and all of a sudden my LEDs are getting 35v, they literally can't pull the insane current they would need to melt themselves... But if they DID get 35v and suddenly wanted to pull 100W, wouldn't the resistor just... overheat and burn up?

Sometimes it seems like people are using resistors to go from, say, 3v to 1.5v, which is the opposite of what I'm trying to do. I want a constant voltage of 32v, which is being provided using DC-DC units. The other thing I don't get: if they need a lower voltage for their LEDs, wouldn't a DC-DC step-down be a much better option (like we talked about with the fans in the other thread)? Or better yet, starting with a lower voltage and stepping it up?

 
I've been reading up on it a lot and I'm starting to get it!!

Basically I'm just using transistors - MOSFETs - to do the exact same thing as the resistors do.
However, I am doing one naughty thing where the voltage is very SIMILAR across both sides of the device. But again this proves to be OK because I'm stepping UP not down AND my boosters supply constant voltage. I can switch from a 12v source to a 24v source on the fly and my voltage never moves from 30.0v according to my multimeter, so I think my step-ups are working perfectly.

The idea is that if you have a 9v source and you drop it down to 3v using a resistor, and the source then goes to 9.5v, the resulting voltage for the LED is NOT 3.5v. The resistor causes that .5v change to become a minuscule change in voltage on the LED side of the resistor, and the LED continues to operate!
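(Here's a little numeric sketch of that idea - a made-up exponential LED model that draws 20mA at 3v, with a 300 ohm resistor sized for a 9v source. The exact numbers are invented, but they show the effect:)

```python
import math

# Made-up exponential LED model (draws 20 mA at 3.0 V) plus a 300 ohm series
# resistor sized for a 9 V source: (9 V - 3 V) / 20 mA = 300 ohm. Illustration
# only - not real LED data.
N_VT = 0.052
I_S = 0.020 / math.exp(3.0 / N_VT)
R = 300.0

def led_voltage(v_source):
    """Find the LED voltage where the resistor and the LED carry the same current."""
    lo, hi = 0.0, v_source
    for _ in range(60):                   # simple bisection
        v = (lo + hi) / 2
        i_led = I_S * math.exp(v / N_VT)  # current the LED wants at voltage v
        i_res = (v_source - v) / R        # current the resistor can deliver at voltage v
        if i_led < i_res:
            lo = v
        else:
            hi = v
    return v

for vs in (9.0, 9.5):
    v = led_voltage(vs)
    print(f"source {vs:.1f} V -> LED sits at {v:.3f} V, {(vs - v) / R * 1000:.1f} mA")
# The extra 0.5 V lands almost entirely on the resistor: the LED voltage only
# rises by a few millivolts, and the current by under 10%.
```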

I figured it out i think!

Thank you, TJ. You should be more than adept. There are a lot of hard questions on Toms and they get flooded with crappy answers. You should post more, man!

 

TJ Hooker

Titan
Ambassador
Some of what I'm saying came from the perspective of thinking of the little discrete LEDs you might see on a PCB when I first read "LED" in your post. Looks like it may not be relevant to the type of thing you're talking about here. Are you just using a multimeter to measure voltage and current of the LED to get power?

Considering the LEDs and the constant voltage, I'm probably rather safe. If anything went wrong it would mean a DROP in voltage, not an increase.
Hmm, not sure what your thought process here is.
Edit: Reading your subsequent post, it looks like you've got it figured now /edit

But it seems like the idea isn't that the resistor conditions the current for the LED. The resistor is there so that if something weird happens and all of a sudden my LEDs are getting 35v, they literally can't pull the insane current they would need to melt themselves... But if they DID get 35v and suddenly wanted to pull 100W, wouldn't the resistor just... overheat and burn up?
If the LED started to draw more current, then the extra current would also flow through the resistor (because it's in series), which would cause it to drop more voltage across it, thereby lowering the voltage across the LED. Which would in turn cause the current to drop. Let's say you were running the LED with 2 A, supply at 32 V, with a 1 ohm resistor in series. This would mean 2 V across the resistor (so 4 W), 30 V across the LED. Let's say the supply then went up to 33 V. If we assume the LED voltage is roughly constant, then the resistor voltage is now 3 V, meaning 9 W. Certainly manageable with an appropriate resistor. Although if you're expecting significant deviations in supply voltage, you wouldn't necessarily just use a resistor.
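Here's that same arithmetic as a quick script, with the LED treated as a fixed 30 V drop (the constant-voltage approximation, as above):

```python
# Same numbers as the paragraph above, with the LED modeled as a fixed 30 V drop.
R = 1.0       # ohm, series resistor
LED_V = 30.0  # approximate LED voltage, assumed constant

for supply_v in (32.0, 33.0):
    v_r = supply_v - LED_V  # voltage left over for the resistor
    i = v_r / R             # current through resistor and LED
    p_r = v_r * i           # power burned in the resistor
    print(f"{supply_v:.0f} V supply -> {i:.0f} A, {p_r:.0f} W in the resistor")
# 32 V supply -> 2 A, 4 W in the resistor
# 33 V supply -> 3 A, 9 W in the resistor
```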

As to why people would use a resistor over a DC-DC converter? Far simpler and cheaper.

Edit:
Thank you, TJ. You should be more than adept. There are a lot of hard questions on Toms and they get flooded with crappy answers. You should post more, man!
Thanks! I actually have a B. Sc. in electrical engineering, but I haven't really had a chance to use a lot of what I learned in school since I graduated (2015), especially circuit level stuff like this. So whenever I see a question like this I always leap to answer it. And it often results in me trying to remember stuff I haven't thought about in a while, or doing some reading to brush up on a subject, so hopefully it's helping me hold on to at least a few things I learned, haha.