
Driving LED with NPN transistor from I/O pin

Score: +4 / −0

I'm trying to understand a circuit for driving an LED found on a board I purchased.

Below is the circuit driven by an I/O pin (HS2) of a small 3.3V processor.

The HS2 pin is driven by the I/O with these specs:

V(high) = 0.8 · Vsupply = 2.64 V

V(low) = 0.1 · Vsupply = 0.33 V

The I/O pin can be set to drive up to 28 mA, but I am guessing the design is meant to pull the LED current from Vsupply rather than through the controlling I/O pin (HS2), and that is the reason for using the NPN transistor (S8050) to drive the LED.

The LED is an SMD 3528 white LED for which I do not have specs, but testing shows it lights up nicely at 40 mA and does fine at 11 mA, with 3.24 V across it.

I am wondering if the reason for this design is that this bright white LED typically wants about a 3 V drop, and if a current-limiting resistor were placed in series with the LED, the resistor might drop too much and bring the voltage across the LED below 3 V.

How is the 1k/10k voltage divider working to set the current through the flash LED?

Does this circuit seem like a good one to duplicate if I wanted to create additional LEDs driven by other pins on the processor?

When the transistor is on, is it setting the current, or will the current vary per the gain of the transistor?

I see how this circuit is acting as a switch, but is it also fixing the current through the LED? Or is the current through the LED dependent on the gain of the transistor?


Answer (Score: +4 / −0)

First, let's redraw the circuit a little more clearly, with logical flow left to right. This also protects the answer from possible changes to the question.

As you say, Q1 is a switch to control the LED. Most likely, a separate transistor was used to keep the LED current out of the microcontroller. If the micro was driving the LED directly, the LED current would have to go either thru the micro's Vdd or Vss pins. Those have a fixed current budget, which might have been tight in this design. Using a separate transistor to take the current is a reasonable thing to do.

R1 limits the I/O pin and base currents when the I/O pin is high. At 3.3 V, and assuming 700 mV for the B-E drop of the transistor, 2.6 mA will flow thru R1, most of which goes into the base.
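As a sanity check, the base current works out like this (values from the answer: a 3.3 V pin level, an assumed 0.7 V B-E drop, and R1 = 1 kΩ):

```python
# Base current through R1 when the I/O pin drives high.
V_IO = 3.3    # I/O pin high level, volts
V_BE = 0.7    # assumed base-emitter drop, volts
R1 = 1000.0   # base resistor, ohms

def base_current(v_io=V_IO, v_be=V_BE, r1=R1):
    """Current through the base resistor, in amps."""
    return (v_io - v_be) / r1

print(f"I_base = {base_current() * 1000:.1f} mA")  # 2.6 mA
```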

I'm not sure why R2 is there. Maybe there was a reason to make sure Q1 was really off and not picking up stray signals when the micro is powering up and the I/O pin is still in high impedance before the firmware sets it up. Still, this smells more like something done for religious reasons.

The rest of the circuit is a bad idea unless this is a high volume cheap product where reliability doesn't matter much. Something like a toy would fit in this category, for example.

The basic problem is the LED current can vary widely, and even be out of spec. You measured this one sample, and found it to draw 11 mA with 3.24 V across it. The next one could easily draw twice or half that due to part variations alone.

This LED is being voltage-driven, which is bad. Once the diode starts substantially conducting, small changes in the voltage cause large changes in the current. The voltage to get the right current changes with temperature and across parts. Generally there is no single voltage, even if you could accurately guarantee one, that results in high enough current to light well, but not so high as to be out of spec. Note that at high temperature, the C-E drop of the transistor will be lower, and the LED current as a function of voltage will be higher. And, what is the variation of the "3.3 V" supply across units?

The reason for this kind of drive is probably that it was just simpler, and the loss of reliability and repeatability in this design wasn't worth more cost and board space to address.

The easiest way to control the LED more tightly is with a higher supply voltage, like 5 V. That leaves some room for a resistor in series to control the current. That may not have been available, and the reliability wasn't worth the cost.
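With a 5 V rail, sizing that series resistor is simple arithmetic. A sketch using the question's measured LED figures (3.24 V at 11 mA); treating 11 mA as the target current is an assumption:

```python
# Series resistor for an LED on a 5 V rail.
V_SUPPLY = 5.0    # volts
V_LED = 3.24      # measured LED forward voltage, volts (from the question)
I_TARGET = 0.011  # amps; the question's measured 11 mA operating point

def series_resistor(v_supply=V_SUPPLY, v_led=V_LED, i=I_TARGET):
    """Resistance that drops the excess supply voltage at the target current."""
    return (v_supply - v_led) / i

print(f"R = {series_resistor():.0f} ohms")  # 160 ohms
```

The resistor then absorbs part-to-part Vf variation: a small shift in Vf changes the current only modestly, instead of wildly as in the voltage-driven circuit.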

The HS2 pin is driven by the I/O with specs of V(high) = 0.8 · Vsupply = 2.64 V, and V(low) = 0.1 · Vsupply = 0.33 V.

You need to look at the datasheet again. First, make sure these are the specs for the output voltages, not the input voltages. Second, these levels are probably at maximum current. Most likely, you are getting much closer to 3.3 V and 0 V in reality. In particular, when the pin is driving low, it must be close to 0 V since it's not sinking any current.

Does this circuit seem like a good one to duplicate if I wanted to create additional LEDs driven by other pins on the processor?

Not unless you understand the consequences of possibly running the LED out of spec, and are willing to put up with them.

When the transistor is on, is it setting the current, or will the current vary per the gain of the transistor?

The transistor gain is not limiting the current in this design. You have 2.6 mA base current. Even at 30 mA LED current, that only requires a gain less than 12. I didn't look up your transistor, but cheap small signal transistors can easily be found to guarantee a gain of 30 or more in this case.
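The gain-margin claim is easy to verify (the 30 mA worst-case LED current is the answer's assumed figure):

```python
I_BASE = 0.0026  # base current set by R1, amps (from the answer)
I_LED = 0.030    # assumed worst-case LED current, amps

def required_beta(i_led=I_LED, i_base=I_BASE):
    """Minimum current gain needed for the transistor to stay saturated."""
    return i_led / i_base

print(f"minimum beta = {required_beta():.1f}")  # 11.5
```

Any transistor with a guaranteed gain of 30 or more is therefore well into saturation, so gain variation does not set the LED current here.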

The problem is that the LED current will vary across LEDs, over temperature, and with small variations in the supply voltage.


Answer (Score: +1 / −1)

No need to power the LED with 5 V; 3.3 V should suffice!

Before I come to the problem of regulating the current with only 3.3 V, I want to present some simple general circuits. Technically speaking, the following current sink/source is perhaps not as well known as it should be:

The current is set by I = (V_zener − 0.7)/R1. As drawn, this is a current sink driving an LED at 20 mA, but you can obtain the dual circuit, a current source, by using a PNP in place of the NPN.

This current sink/source is not only simple, it is also essentially FLOATING; that means you can replace the ground connection with any node in your circuit. This is especially useful for limiting the current on the high side (away from ground), even at high voltage (say up to 1000 V), provided you choose a transistor with a suitable voltage rating. Notice also that this circuit can be used as a current source or a current limiter. The fact that it is essentially floating seems not to be well known; even The Art of Electronics offers no other approach than a high-voltage depletion-mode MOSFET for this purpose.

The above current source has two drawbacks. First, it is not very precise, and the current sunk may vary somewhat with temperature and other factors. Second, you need a supply voltage at least a bit larger than the zener voltage, and it is difficult to find zener diodes with a zener voltage below 3.3 V. Nevertheless, the circuit is still valuable for powering LEDs whenever the supply voltage may vary a lot, as in the schematic above, where the supply is allowed to vary from 6 to 30 V for an LED current of 20 mA.
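The zener sink's arithmetic can be sketched as follows; the 4.7 V zener is an assumed example value (the answer's schematic component values are not shown here), while the 20 mA target is from the answer:

```python
# Zener-referenced current sink: I = (V_zener - 0.7) / R1,
# so the emitter resistor that sets a given current is R1 = (V_zener - 0.7) / I.
V_Z = 4.7         # assumed zener voltage, volts (illustrative)
V_BE = 0.7        # base-emitter drop, volts
I_TARGET = 0.020  # 20 mA target, from the answer

def sink_resistor(v_z=V_Z, v_be=V_BE, i=I_TARGET):
    """Emitter resistor that sets the sink current, in ohms."""
    return (v_z - v_be) / i

print(f"R1 = {sink_resistor():.0f} ohms")  # 200 ohms
```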

Now, the first drawback above can be easily overcome thanks to the inexpensive TL431 voltage reference:

As above, the current limit is set by I = (V_ref − 0.6)/R1 = 1.9/R1. You then have a stable current source that works with a supply voltage from V_ref = 2.5 V up to 36 V (but the LED needs an additional 3 V of drop, so this will work from about 6 V to 36 V with an LED). With the component values above, the current sunk is exactly 20 mA.
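The TL431 version's set point follows directly from that formula; solving 1.9/R1 = 20 mA gives R1 = 95 Ω:

```python
# TL431-referenced current sink: I = (V_ref - 0.6) / R1.
V_REF = 2.5  # TL431 reference voltage, volts
V_BE = 0.6   # base-emitter drop assumed by the answer, volts

def tl431_current(r1, v_ref=V_REF, v_be=V_BE):
    """Sink current set by the emitter resistor r1 (ohms), in amps."""
    return (v_ref - v_be) / r1

print(f"I = {tl431_current(95.0) * 1000:.1f} mA")  # 20.0 mA
```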

Now let us return to the OP's original problem of regulating a 3 V LED with only a 3.3 V supply. As pointed out by Lorenzo Donati in his comments (acknowledgments), my solutions above cannot work because there is not enough room for the 3 V drop of the LED. Olin has pointed out that using a 5 V supply to provide some headroom is the adequate solution. Nevertheless, just to show it can be done, I provide below a solution with a 3.3 V supply.

The first solution is a current sink built with a 2N3904 and an op-amp.

The TLC2272 is one of my favorites, but you can probably do the same with a common LM358.

The current is set with I_reg = V(in+)/R1, where V(in+) is set with the voltage divider (or better, in other circumstances where the supply may vary, with a voltage reference or a Zener plus a divider).

The R1 resistor should be low in value to reduce its voltage drop as much as possible.

In the above schematic, the current sunk is 17 mA according to my simulations, which is probably sufficient in most cases.
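The op-amp sink's set point can be sketched numerically. The divider and sense-resistor values below are assumptions chosen to land near the answer's simulated 17 mA, since the schematic's actual values are not shown here:

```python
# Op-amp current sink: the op-amp forces the voltage across the sense
# resistor R1 to equal V(in+), so I_reg = V(in+) / R1.
V_SUPPLY = 3.3  # volts
# Assumed divider values (illustrative, not from the answer's schematic):
R_TOP = 180e3   # ohms, top of the divider
R_BOT = 10e3    # ohms, bottom of the divider
R1 = 10.0       # low-value sense resistor, ohms

def set_point(v_supply=V_SUPPLY, r_top=R_TOP, r_bot=R_BOT, r1=R1):
    """Regulated sink current, in amps."""
    v_in_plus = v_supply * r_bot / (r_top + r_bot)
    return v_in_plus / r1

print(f"I_reg = {set_point() * 1000:.1f} mA")  # close to the answer's 17 mA
```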

This is still not optimal, as the NPN has a small but still too-high voltage drop at saturation.
If you want to be really pedantic, it is possible to use my favorite MOSFET, the IRF7204. It is good to know about: it can switch with only 3 V of gate drive.

Here is the schematic for a current source driving the LED (note that the input terminals of the op-amp have been inverted):

This time, the voltage available for the LED is almost equal to the supply voltage, which provides even more room for driving an LED at its optimal power.


Answer (Score: +0 / −0)

I have tried to drive a series of SMD LEDs with constant current. If an LED is rated for 3.3 V, I run it at around 3 V. It doesn't get hot even without a heatsink (~60 °C), so thermal runaway is only a problem under extreme and unlikely circumstances. This also results in higher power efficiency, theoretically. It is a more cost-effective solution than using a constant-current source and a heat sink. But if you include the time to solder hundreds of SMDs... What could be the downside of this approach, given that it is not popular?


Answer (Score: +0 / −0)

If you choose suitable ultrabright LEDs, there is no need to use 40 mA. I have over 10 thousand LEDs, all over 16,000 mcd at 20 mA, which is painfully bright as a close-range indicator but works well 100 m away. They are also all matched in brightness and Vf within 10%, but I ordered these specially for a high-volume user. Otherwise, you can get very wide tolerances from batch to batch from unknown sources.

Most uC outputs now use advanced 74ALCxx-type CMOS drivers, which have a nominal driver impedance of around 22 ohms ±50% (Ro = Vol/Io).

This is more than the internal resistance of a small 5 mm LED (15 ohms) or a 50 mW SMD LED, so you can drive it directly from the uC with a series R, provided you know your LDO is 1 or 2% accurate.

Using your LED measurement of 3.24 V @ 11 mA for this simulation.

I would choose ultrabright white 5 mm LEDs (16 cd @ 20 mA), run them at 5 mA, and add a series R of something like 50 ohms. But I would do a tolerance stackup calculation.
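That tolerance stackup can be sketched as a worst-case corner calculation. The Ro band is from this answer; the supply, Vf, and resistor tolerance bands below are illustrative assumptions:

```python
# Worst-case LED current for I = (Vdd - Vf) / (Ro + R), evaluated at the
# tolerance corners. Bands are (min, max); Ro is from the answer, the
# rest are assumed for illustration.
VDD = (3.3 * 0.98, 3.3 * 1.02)  # assumed 2% LDO
RO = (22 * 0.5, 22 * 1.5)       # driver impedance, 22 ohms +/- 50%
VF = (2.9, 3.1)                 # assumed LED forward-voltage spread, volts
R = (50 * 0.95, 50 * 1.05)      # assumed 5% series resistor, ohms

def current_range(vdd=VDD, ro=RO, vf=VF, r=R):
    """(min, max) LED current in amps over all tolerance corners."""
    i_min = (vdd[0] - vf[1]) / (ro[1] + r[1])
    i_max = (vdd[1] - vf[0]) / (ro[0] + r[0])
    return i_min, i_max

lo, hi = current_range()
print(f"I = {lo * 1000:.1f} .. {hi * 1000:.1f} mA")
```

The wide spread around the 5 mA target is exactly why the stackup is worth doing before committing to a resistor value.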

Most good 5 mm white LEDs these days will be 2.9 V at 11 mA, so I don't know what you have; then 3.0 to 3.1 V at the 20 mA rating.
