
How To Power An LED

While very basic, powering an LED is probably one of our most commonly asked questions. Here are a few simple steps to get started with LEDs on a breadboard. This tutorial also covers the math behind choosing a current limiting resistor.

A Few Considerations:
• While there are special LED driver chipsets out there, we are going to keep this to basic components.
• We are going to stick to low-power LEDs. Ultra-bright LEDs may require other components.

How It Works:
An LED (Light Emitting Diode) is a basic semiconductor device that emits light when powered. Like any diode, it works as a one-way door for electricity. Without getting too far into the physics of it, when current travels through the LED, light is emitted by way of electroluminescence. LEDs are very efficient compared to incandescent lighting and have found their way into just about every piece of electronics, so it is good to know how to use them!

The Parts Needed:

[intro picture]

This tutorial requires a few things:

The Diagram

This handy little diagram shows where each of the parts go. Don’t worry if it looks a little overwhelming, we will be going through this step by step!

[led diagram]



As mentioned above, an LED is a diode that emits light. Diodes act as a one-way door for electricity and only allow current to pass in one direction. While not the most difficult issue to solve, it is nice to know how to connect an LED so that it works the first time, especially when it is being soldered into a circuit! Standard LEDs like the ones we are using in this tutorial (and carry in the shop) always have a longer lead and a shorter lead. The longer lead is the Anode and always connects to the positive side of your circuit. The shorter lead is the Cathode and always goes to the ground / negative side of your circuit. Keep this in mind when inserting the LED into the breadboard. In the breadboard diagram above, the Anode is the pin with a kink right underneath the LED.

Current Limiting Resistor

LEDs have a Forward Voltage and a Current rating. Simply hooking up an LED to our battery pack would likely cause it to get quite hot and eventually fail. The reason it gets hot and fails is that the battery has a higher voltage than the LED requires. The current that flows through an LED increases exponentially with the voltage across it, so even a small increase beyond the Forward Voltage of the LED will result in a huge increase in current (as well as a bright flash, some heat, and a dead LED). That is why we need to use a current limiting resistor.

Unfortunately there is a little math involved, since the required resistor will change depending on the input voltage, the LED, and the number of LEDs in series. We will tackle that in the next section.

Calculating The Required Resistor

The formula we will be working with is quite simple – we just need to plug in a few values.

  • LED Forward Voltage – Typically found on the LED datasheet (or our product page)
  • LED Current – Also found on the LED datasheet (or our product page)
  • Input Voltage – This is the voltage of our power supply (the batteries in this case)

To calculate the resistance in Ohms, we simply subtract the LED Forward Voltage from the Input Voltage and divide the result by the LED Current (in Amps, not milliamps!): R = (Input Voltage - LED Forward Voltage) / LED Current.

So with our Green LED: if our Input Voltage is 6.0V (4 AA batteries at ~1.5V each), our LED Forward Voltage is 2.1V (found on the product page), and our LED Current is 20mA (also found on the product page), then it would look like this:

6.0V - 2.1V = 3.9V        //Battery voltage minus LED Forward Voltage.
3.9V / 0.02A = 195 Ohms   //Resulting Voltage divided by LED Current (Be sure to convert your 20mA to a value in amps)

Our ideal resistor would be 195 Ohms. Since a 195 Ohm resistor isn’t very common, we go up to the next highest common value, which is a 220 Ohm resistor. If you don’t happen to have one lying around, going a little higher typically isn’t going to hurt anything.

Next, let’s figure out the Red LED: 6.0V input, an LED Forward Voltage of 1.85V, and an LED Current of 20mA, so:

6.0V - 1.85V = 4.15V
4.15V / 0.02A = 207.5 Ohms

Again, a 207.5 Ohm resistor isn’t exactly common, so we would go to the next highest common value of resistor which is 220 Ohms.
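The calculation above is easy to capture in a short script. This is a minimal sketch, not part of the original tutorial: the function name `led_resistor` is our own, and the list of standard values is a typical E12-series hobby assortment we are assuming you might have on hand.

```python
# Common E12-series resistor values (Ohms) found in typical hobby kits (assumed)
E12 = [150, 180, 220, 270, 330, 390, 470, 560, 680, 820, 1000]

def led_resistor(v_in, v_forward, i_led_ma):
    """Return (ideal resistance in Ohms, next highest standard value)."""
    # R = (Input Voltage - LED Forward Voltage) / LED Current (in Amps)
    ideal = round((v_in - v_forward) / (i_led_ma / 1000.0), 2)
    standard = next(r for r in E12 if r >= ideal)
    return ideal, standard

print(led_resistor(6.0, 2.1, 20))   # Green LED -> (195.0, 220)
print(led_resistor(6.0, 1.85, 20))  # Red LED   -> (207.5, 220)
```

Both LEDs land on the same 220 Ohm standard value, matching the worked examples above.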

Calculating The Required Resistor – Part 2

So we have figured out the resistance value of the resistor we need, but there is one more thing to consider with resistors: their power rating (how much power they can dissipate before they get too hot!), which is measured in Watts. Most common resistors are rated at 1/4 Watt, and that will generally cover most applications, but let’s do the math just to be sure.

We need to calculate how many Watts the resistor will have to “burn off”. To do this we need to do a little more math; so let’s start with the Green LED again:

First we need to know how much current the LED will be drawing. Our ideal resistor of 195 Ohms would draw exactly 20mA, but since we are using a 220 Ohm resistor instead, the LED will actually draw a little less. To figure that out, we will just reverse the equation we used above. Our known values are:

  • the resistor at 220 Ohms
  • the Input Voltage from the batteries at 6.0V
  • the LED Forward Voltage which is 2.1V

So when we rearrange the equation to get the actual LED current, it looks like this:

6.0V - 2.1V = 3.9V                       //Battery Voltage minus LED Forward Voltage
3.9V / 220 Ohms = 0.01772A (or 17.7mA)   //Resulting Voltage divided by the Resistor we will be using 

So the total current drawn through the circuit will be 17.7mA. Good to know, since that same current has to travel through the resistor, but it isn’t quite what we are looking for: we need to figure out how many Watts the resistor dissipates. To do that, we multiply the current by the voltage across the resistor. Since the LED “consumes” 2.1V of the 6.0V we start with, the resistor only deals with the rest, 3.9V in this case. The math looks like this:

6.0V - 2.1V = 3.9V                           //Battery Voltage minus LED Forward Voltage
3.9V * 17.7mA = 69.03mW or 0.06903 Watts.    //Resulting Voltage multiplied by the total current consumed

Since our resistor is rated at 250mW (1/4 W) and it only has to dissipate 69.03mW, that will work! The math for our Red LED looks like this:

6.0V - 1.85V = 4.15V
4.15V / 220 Ohms = 0.01886A (or 18.86mA)

6.0V - 1.85V = 4.15V
4.15V * 18.86mA = 78.28mW or 0.07828 Watts.

So that one will also work! The easiest way to think of this is: as the difference between the Input Voltage and the LED Forward Voltage increases, or as the LED current increases, the resistor has to dissipate more power, and at some point a resistor with a higher power rating will be required.
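The power check above can be sketched the same way. Again this is our own illustration (the function name `resistor_power` is not from the tutorial); the component values are the ones used in the examples, and 0.25 W is the rating of a standard 1/4 W resistor.

```python
def resistor_power(v_in, v_forward, r_ohms):
    """Return (actual current in mA, power dissipated by the resistor in W)."""
    v_resistor = v_in - v_forward   # voltage the resistor has to drop
    current = v_resistor / r_ohms   # Ohm's law: I = V / R
    power = v_resistor * current    # P = V * I
    return round(current * 1000, 2), round(power, 4)

QUARTER_WATT = 0.25  # rating of a common 1/4 W resistor

for name, vf in [("Green", 2.1), ("Red", 1.85)]:
    ma, watts = resistor_power(6.0, vf, 220)
    verdict = "OK" if watts <= QUARTER_WATT else "too hot!"
    print(f"{name} LED: {ma} mA, {watts} W -> {verdict}")
```

Both LEDs come in around 69 to 78 mW, comfortably under the 250 mW rating, matching the hand calculations above.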

So Where Does This Resistor Go?


Ok, now that all of the math is out of the way, let’s talk about something a lot easier: how to wire it all together! The only thing that really matters is that the LED’s Anode is connected to the positive (power) side and the LED’s Cathode is connected to the negative (ground) side. Since the resistor is only being used to limit current through the circuit, it can actually be located on either side of the LED. Placing the resistor on the positive (Anode) side of the LED has no different effect from placing it on the negative (Cathode) side. So don’t sweat it, just pick a side!

We have connected the Green LED with the resistor on the Cathode side, while the Red LED has the resistor on the Anode side of the circuit. The red wire brings the power in and the grey wire connects to ground; note the resistor on a different side of each LED!

Have A Question?
If you have any questions, or need further clarification please post in the comments section below; this way future users of this tutorial can see the questions and answers!

Parts Used In This Tutorial:

