LesCir Posted July 29, 2010

I am setting up a small circuit that will use LEDs. The power supply is marked 12 volts, but actually delivers 13.2 V when tested. The LEDs in question are rated 2.1 V and 2.4 V (red and yellow). Here's my question: how critical are the voltages in terms of reasonable LED life and proper brightness? Based on my calculations, to get the 13.2 V down to 2.1 V for the red, I need a resistor of ~530 ohms; to get the 13.2 V down to 2.4 V needs 450 ohms. OTOH, if I went with the 12 V spec on the power supply, I'd be thinking 470K and 400 ohms. Should I just put in 470K resistors and call it a day? Or do I need to carefully get as close as reasonably possible (+/- 10 ohms) to get proper brightness and reasonable life from the LEDs? Thanks.
Hero999 Posted July 29, 2010

You're looking at it the wrong way: you need to limit the current, and the LED's characteristics determine the voltage. What's the maximum current rating of the LEDs? What's the recommended operating current? The current needs to stay below the maximum rating whatever happens. You can also connect the LEDs in series to save power, but their voltages then add together. 470k is too high a value; the LEDs will hardly light, and I don't know where you got it from. 400R sounds a little low, as it would give a current of 28mA, which might be too high; I'd go for 1k. Can you calculate the current given a supply voltage of 13.2V and an LED voltage of 2.1V?
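Hero999's closing question (what current flows for a 13.2 V supply and a 2.1 V LED with a given resistor) is just Ohm's law across the resistor. A quick sketch, not from the thread; the function name and the example resistor values are illustrative:

```python
def led_current(v_supply, v_led, r_ohms):
    """Current through an LED in series with a resistor: I = (Vs - Vf) / R."""
    return (v_supply - v_led) / r_ohms

# 13.2 V supply, 2.1 V red LED, with the 1k resistor Hero999 suggests
print(f"{led_current(13.2, 2.1, 1000) * 1000:.1f} mA")  # 11.1 mA
```

The same call with 400 ohms gives roughly 28 mA, which is where the "might be too high" figure in the post comes from.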
LesCir Posted July 30, 2010

Whoops, typo above! I meant 470 ohms, not 470K ohms. There were some minor errors in my earlier calculations, but after correcting them I still end up with roughly the same numbers. I take the 13.2 V supply voltage and subtract the LED's operating voltage of 2.1 V, which gives a drop of 11.1 V. I then divide that by the current draw (R = V/I), 11.1/0.020, and get about 555, call it 560 ohms. It looks like you are suggesting taking the 13.2 volts and dividing by 0.02 amps, to get 660 ohms. Would that be a good starting value?
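LesCir's corrected arithmetic (subtract the LED's forward voltage from the supply, then divide the remaining drop by the target current) can be sketched as follows; this is an editorial illustration, not code from the thread:

```python
def series_resistor(v_supply, v_led, i_amps):
    """Series resistor for a single LED: R = (Vs - Vf) / I."""
    return (v_supply - v_led) / i_amps

# Red LED (2.1 V) at 20 mA from the measured 13.2 V supply
print(round(series_resistor(13.2, 2.1, 0.020)))  # 555 ohms
```

Note that dividing the full 13.2 V by the current, as in the 660-ohm figure, would size the resistor as if no LED were in the circuit; the forward voltage has to be subtracted first.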
Hero999 Posted July 30, 2010

No, I calculated the resistor value to give about 10mA at 12V: R = (12 − 2.1)/0.01 = 990 ohms, which will be difficult to get hold of, so I suggested 1k. If the voltage increases to 13.2V, the current will be 11.1mA. You could use 560R for 20mA, which means the LED will be brighter but won't last as long. The series resistor value isn't critical as long as the current is below the maximum rating and high enough for the LED to be bright enough. Resistors are made in standard values, E24 being the most common (look up preferred values on Wikipedia), which means you'll find it hard to get hold of 400R, 660R, 450R etc., so you'll need to choose the nearest value. Be careful about rounding down, though, because the current will then be higher than calculated; round up if it's more convenient.
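Hero999's advice about preferred values and rounding up can also be sketched in code. The E24 mantissa list is the standard one; the helper function is a made-up name for illustration:

```python
import math

# E24 preferred-value mantissas (one decade of the standard series)
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def next_e24_up(r_ohms):
    """Round a resistance up to the nearest E24 value.

    Rounding up keeps the LED current at or below the target,
    per Hero999's advice in the thread.
    """
    decade = 10 ** math.floor(math.log10(r_ohms))
    for m in E24:
        if m * decade >= r_ohms - 1e-9:
            return m * decade
    return 10 * decade  # fell off the top of this decade

print(round(next_e24_up(990)))  # 1000 -> the suggested 1k
print(round(next_e24_up(555)))  # 560  -> the 560R option
```

This matches the thread's numbers: 990 rounds up to 1k, 555 to 560R, and the awkward 400R/450R/660R values land on 430, 470, and 680 respectively.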
audioguru Posted July 30, 2010

Ordinary LEDs are rated at 20mA. With 20mA or 25mA they last 100,000 hours (about 11 years). After 100,000 hours they might be slightly dim. Their max current is 30mA or 40mA.
LesCir Posted July 31, 2010

Thanks boys. I'll start at 1K and see how bright they are. I only need reasonable brightness, and these will probably last a lifetime with those parameters.