i need help wiring up pieces of a 12V LED strip in series to run on a 24V 150mA supply. i want to add resistors to help keep the strings evenly balanced, though i don’t know if that is necessary, especially given the resistor already in each LED series unit; from looking around the internet, i don’t think it is. still, i suspect it matters, and i am not entirely sure why, but i think it keeps the power distribution even and hopefully keeps burnout from occurring as easily.
i am also puzzled because i found a video that said to use a supply rated for 20% more watts than your LED strip. i don’t like that approach though.
i don’t know what the strip is rated for anyway, only that each unit of the set is 3 regular 3014 LEDs, likely FM-3014WLS-460T-R80 (or the 70 CRI one, FM-3014WLS-460R-R70, which has the same specs) from Nationstar (it’s the warm white LED strip): ~3 V (3.2 V forward) at 30 mA (35 mA max), in series with a 100 Ω resistor (the code on the surface-mount resistor on the board is “101”).
each LED is a 0.1 W diode, and a strip is 12 V at 0.1 W per LED, if i understand correctly. so one strip of 18 LEDs (6 units of 3, wired in parallel) looks like 1.8 W, in series with another 1.8 W strip of 18 LEDs. but this is wrong, due to the resistor in every unit. maybe it could be approximated (though this isn’t really the point of the question; i want to understand) as 18/4 ≈ 4.5 units, so say 4 units of 3 LEDs each with a 100 Ω resistor, which is maybe 1.8 W / 16 things ≈ 0.1125 W per thing. that’s rubbish, but it also suggests there is room to dump energy in resistors before the positive and negative leads, and maybe between the strips, since they are linked in series?
so i am asking for help with these calculations, which are over my head, and to understand how to do them. (i would eventually love to learn about zener diodes, but this is a logical baby step, and i also don’t have any zener diodes :)
i am not sure that will even work. for example, i have no clue what resistance each LED actually presents, since its effective resistance changes with the voltage and current it is passing. (and i don’t have the volt-amp curve even if i could use it; the datasheet doesn’t include one, or a wavelength efficiency chart)
so without that data there is only the 100 Ω to work with (unless i use 6 Ω per LED for the heck of it, because i saw that value in a circuit-lab LED model; but assuming i don’t:)
if a theoretical unit were set to 12 V, then V = I × R gives 12 V = I × 100 Ω, so I = 0.12 A, which is high, and V × I = P gives 1.44 W for 3 LEDs and a 100 Ω resistor.
i don’t know what i am getting wrong here.
a 0.1 W LED would imply that 1.44 W − (3 × 0.1 W) = 1.14 W goes into the resistor.
that doesn’t make sense; it’s way more energy than the LEDs. i must be doing something wrong.
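to be explicit, this is the arithmetic i did, as a quick script (just my own numbers from above, nothing extra from the datasheet):

```python
# my (apparently wrong) arithmetic for one 12 V unit, exactly as written above
Vs, R = 12.0, 100.0
I = Vs / R              # 12 V = I * 100 ohm -> 0.12 A
P = Vs * I              # V * I -> 1.44 W for the whole unit
P_res = P - 3 * 0.1     # minus three 0.1 W LEDs
print(f"{I:.2f} A, {P:.2f} W total, {P_res:.2f} W left for the resistor")
# prints: 0.12 A, 1.44 W total, 1.14 W left for the resistor
```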
i had figured that 3.2 V + 3.2 V + 3.2 V leaves 2.4 V for the resistor to drop, and 3 V + 3 V + 3 V leaves 3 V for the resistor.
i also tried V = I × R for the resistor.
case 1 (3.2 V LEDs): 2.4 V = 0.03 A × R gives R = 80 Ω; or, keeping R = 100 Ω, 2.4 V = I × 100 Ω gives I = 0.024 A (24 mA).
case 2 (3 V LEDs): 3 V = 0.03 A × R gives R = 100 Ω; and 3 V = I × 100 Ω gives I = 0.03 A.
this agreed with an equation i found, I_LED = (V_source − 3 × V_LED) / R: so I_LED = (12 V − 3 × 3.2 V) / 100 Ω = 0.024 A; solving 0.03 A = (12 V − 3 × V_LED) / 100 Ω gives V_LED = 3 V; and, even more disappointing, 0.035 A = (12 V − 3 × V_LED) / 100 Ω gives V_LED ≈ 2.833 V, which the curve that should have been on the datasheet would presumably rule out. the 100 Ω resistor really seems to put a damper on everything, so i don’t know what adding further ones will do.
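putting those attempts into a quick script (this assumes the equation i found and the datasheet-ish numbers above; the helper names are just mine):

```python
# I_LED = (V_source - 3 * V_LED) / R, and the same equation rearranged
Vs, R, n = 12.0, 100.0, 3

def i_from_vf(vf):            # current implied by a forward voltage
    return (Vs - n * vf) / R

def vf_from_i(i):             # forward voltage implied by a current
    return (Vs - i * R) / n

print(f"{i_from_vf(3.2):.3f} A")    # prints: 0.024 A
print(f"{i_from_vf(3.0):.3f} A")    # prints: 0.030 A
print(f"{vf_from_i(0.035):.3f} V")  # prints: 2.833 V
```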
But anyway, please help me understand the errors in my calculations and understanding.
Sorry, but your very long text is hard to read. Using resistors to regulate current for a strip is a waste of time and power and will generate a lot of heat.
If you have 12 V LEDs then use a 12 V power supply. If you have a 24 V DC supply then you need a voltage regulator to drop the voltage to 12 V. A buck converter would be the most efficient way.
Assuming that your strip's units are designed to operate at 12 V and 30 mA, comprising 3 series LEDs and a current-limiting resistor, you probably don't need to add another resistor.
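To see why, you can compute the current the existing on-board 100 Ω resistor already sets. A sketch, assuming 12 V across each unit and the ~3.0–3.2 V per-LED figures from your question:

```python
# Current set by the on-board 100 ohm resistor in each 3-LED unit,
# assuming 12 V across the unit (figures taken from the question).
Vs, R, n = 12.0, 100.0, 3
for vf in (3.0, 3.2):
    i = (Vs - n * vf) / R
    print(f"Vf = {vf} V -> I = {i * 1000:.0f} mA")
# prints: Vf = 3.0 V -> I = 30 mA
#         Vf = 3.2 V -> I = 24 mA
```

The resistor is already doing the current limiting; an extra one would only drop the LEDs further below their design point.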
If the 12V strings are well matched, meaning that they draw the same current when 12V is applied to them, then you can simply connect two strips in series and apply 24V. Connect the negative end of strip "A" to the positive end of strip "B"; connect your 24V supply to the positive end of strip "A" and the negative end of strip "B".
Putting two 12V strips in series effectively gives you a single 24V strip with the same current demand. So, your 150mA power supply should be able to drive 5 such strips, but you would be operating the supply right at its specified maximum current.
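As a quick budget check (a sketch assuming each series pair draws the same ~30 mA that one 12 V unit draws on its own):

```python
# Supply budget for series pairs on a 24 V, 150 mA supply,
# assuming ~30 mA per series pair (the worst-case figure above).
supply_ma = 150
pair_ma = 30
pairs = supply_ma // pair_ma      # parallel pairs the supply can feed
print(pairs, "pairs ->", pairs * pair_ma, "mA of the 150 mA budget")
# prints: 5 pairs -> 150 mA of the 150 mA budget
```

Note there is zero headroom at 5 pairs; 4 pairs would leave a more comfortable margin.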