Electronics-Lab.com Community

Simple question on proper diode/LED voltage supply



I am setting up a small circuit that will use LEDs. The source power supply is marked 12 volts, but actually delivers 13.2 V when tested. The LEDs in question are rated 2.1 V and 2.4 V (red and yellow).

Here's my question: how critical are the voltages in terms of reasonable LED life and proper brightness? Based on my calculations, to get the 13.2 V down to 2.1 V for the red, I need a resistor of ~530 ohms. To get the 13.2 V down to 2.4 V needs 450 ohms. OTOH, if I went with the spec on the power supply of 12 V, I'd be thinking 470K and 400 ohms.

Should I just put in 470K resistors and call it a day? Or do I need to carefully get as close as reasonably possible (+/- 10 ohms) to get proper brightness and reasonable life from the LEDs?

Thanks,



You're looking at it the wrong way: you need to limit the current; the LED's characteristics determine the voltage.

What's the maximum current rating of the LEDs?

What's the recommended operating current?

The current needs to be less than the maximum rating whatever happens.

You can also connect the LEDs in series to save power but the voltages need to be added together.

470k is too high a value; the LEDs will hardly light. I don't know where you got it from. 400R sounds a little low, as it would give a current of about 28mA, which might be too high; I'd go for 1k. Can you calculate the current given a supply voltage of 13.2V and an LED voltage of 2.1V?
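To make that concrete, here is a minimal Python sketch of the calculation being asked for. It's just Ohm's law applied to the figures quoted in this thread (13.2 V supply, 2.1 V and 2.4 V LEDs, a 1k resistor), not values from any particular datasheet.

```python
# Plain Ohm's law for a resistor in series with one or more LEDs:
# the resistor drops whatever is left of the supply after the LED
# forward voltage(s), and that drop sets the current.

def led_current(supply_v, led_forward_v, resistor_ohms):
    """Current through the series resistor and LED(s): I = (Vs - Vf) / R."""
    return (supply_v - led_forward_v) / resistor_ohms

# Single red LED (2.1 V) on the measured 13.2 V supply with a 1k resistor:
print(led_current(13.2, 2.1, 1000))        # 0.0111 A, i.e. about 11 mA

# Red + yellow in series: the forward voltages add (2.1 V + 2.4 V = 4.5 V):
print(led_current(13.2, 2.1 + 2.4, 1000))  # 0.0087 A, about 8.7 mA
```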


Whoops - typo above! I meant 470 ohms, not 470K ohms.

I think there are some minor errors in my earlier calculations, but after correcting them I still end up with the same rough numbers. I'm taking the 13.2 V supply voltage and subtracting the LED's operating voltage of 2.1 V. That gives me a drop of 11.1 V. I then take 11.1 V divided by the current draw of 0.020 A (R = V/I) and I get 11.1/0.020, or roughly 560 ohms.

It looks like you are suggesting taking the 13.2 volts and dividing by 0.02 amps, to get 660 ohms. Would that be a good starting value?


No, I calculated the resistor value to give about 10mA at 12V.

R = (12 - 2.1)/0.01 = 990 ohms, which will be difficult to get hold of, so I suggested 1k. If the voltage increases to 13.2V, the current will be 11.1mA.

You could use 560R for about 20mA, which means the LED will be brighter but won't last as long.

The series resistor value isn't critical as long as the current is below the maximum rating and is high enough for the LED to be bright enough. Resistors are made in standard values, E24 being the most common (look up preferred values on Wikipedia), which means you'll find it hard to get hold of 400R, 660R, 450R, etc., so you'll need to choose the nearest value. Be careful about rounding down, because the current will end up higher than intended; round up if it's more convenient.
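As a rough illustration of that "round up to the nearest preferred value" step, here is a small Python sketch. The E24 table is the standard one, but the 12 V / 2.1 V / 10 mA figures are simply the ones used above.

```python
import math

# E24 preferred values (one decade); real resistors come in these
# multiplied by powers of ten.
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def series_resistor(supply_v, led_forward_v, target_current_a):
    """Exact series resistor for a target LED current: R = (Vs - Vf) / I."""
    return (supply_v - led_forward_v) / target_current_a

def next_e24_up(r_ohms):
    """Round up to the next E24 value, so the current never exceeds the target."""
    decade = 10 ** math.floor(math.log10(r_ohms))
    for v in E24:
        if v * decade >= r_ohms:
            return v * decade
    return E24[0] * decade * 10  # r_ohms was above 9.1 in this decade

r = series_resistor(12.0, 2.1, 0.010)  # 990 ohms for 10 mA from 12 V
print(r, next_e24_up(r))               # 990.0 -> 1000, i.e. the 1k suggested above
```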

