A device’s power source and power management are among the most important parts of any electronic product. After all, without a stable voltage source that can supply enough current for the circuit to work properly, nothing else matters. In the last decade there has been an explosion of electronic products with embedded batteries, so knowing the different battery types and how to select the correct one for your application is more important than ever.
There is a very good website called Battery University, which I highly recommend if you want to dive deep into batteries and understand them fully. One of their most useful resources is the following table, which compares the different battery technologies and lists important design parameters, such as internal resistance and operating temperature, that can later be used in a worst-case analysis (WCA):
A sustainability aside: think twice before selecting a NiCd battery, as they have been banned in the EU for their high concentration of cadmium, which does not comply with the RoHS directive explained in the sustainable electronics article:
Ni–Cd batteries contain between 6% (for industrial batteries) and 18% (for commercial batteries) cadmium, which is a toxic heavy metal and therefore requires special care during battery disposal. In the United States, part of the battery price is a fee for its proper disposal at the end of its service lifetime. Under the so-called “batteries directive” (2006/66/EC), the sale of consumer Ni–Cd batteries has now been banned within the European Union except for medical use, alarm systems, emergency lighting, and portable power tools. This last category has been banned effective 2016. Under the same EU directive, used industrial Ni–Cd batteries must be collected by their producers in order to be recycled in dedicated facilities.
Cadmium, being a heavy metal, can cause substantial pollution when discarded in a landfill or incinerated. Because of this, many countries now operate recycling programs to capture and reprocess old batteries.
A battery needs to be charged according to its voltage, current capacity and chemistry. Connecting the battery straight to the power source could damage it and considerably shorten its cycle life. For this reason, a circuit must be designed to manage the battery charging; alternatively, a battery charger IC can be a viable option.
There are different ways of charging a battery, and each method suits a different chemistry. A common parameter in battery charging is the C rate, which expresses the charge or discharge current relative to the battery’s rated capacity. For example, charging a 600 mAh battery at 0.3C means the charger will output a constant 180 mA:
Iout = 0.3C × 600 mAh = 180 mA
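The C-rate arithmetic above can be captured in a tiny helper. This is just an illustrative sketch (the function name is mine, not from any charger IC library):

```python
def charge_current_ma(capacity_mah: float, c_rate: float) -> float:
    """Charge current in mA for a battery of the given capacity at a given C rate.

    The C rate is a multiple of the rated capacity: for a 600 mAh battery,
    1C corresponds to 600 mA, 0.3C to 180 mA, and so on.
    """
    return capacity_mah * c_rate

# The example from the text: a 600 mAh battery charged at 0.3C.
print(charge_current_ma(600, 0.3))  # 180.0
```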
The following table compares different charging methods with different battery types and C rates:
This is an amazing and very informative video from Afrotechmods that explains the concepts of Ah and C rate in more detail and with great clarity:
Even though a battery is a DC source, its output still needs to be regulated in order to reduce ripple caused by spurious current bursts and to isolate it from the rest of the electronics in the circuit. A typical approach is to use a voltage regulator, which produces a steady voltage capable of dealing with supply ripples. Voltage regulators are mainly divided into two categories:
A linear regulator operates by using a voltage-controlled current source to force a fixed voltage to appear at the regulator output terminal. The control circuitry must monitor (sense) the output voltage, and adjust the current source (as required by the load) to hold the output voltage at the desired value.
Linear regulators subdivide into Low Drop Out (LDO) and Standard. The main difference between both is dropout voltage, which is defined as the minimum voltage drop required across the regulator to maintain output voltage regulation. A critical point to be considered is that the linear regulator that operates with the smallest voltage across it dissipates the least internal power and has the highest efficiency. The LDO requires the least voltage across it, while the Standard regulator requires the most.
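The efficiency point above can be made concrete with a quick calculation. An ideal linear regulator drops the difference between input and output voltage across its pass element at the full load current, so that difference is burned as heat. A minimal sketch, using illustrative values of my own choosing:

```python
def linear_regulator_stats(v_in: float, v_out: float, i_load: float):
    """Power dissipated in, and efficiency of, an ideal linear regulator.

    The pass element drops (v_in - v_out) at the load current, so the
    dissipated power grows with the input-output differential. Quiescent
    current is ignored for simplicity.
    """
    p_dissipated = (v_in - v_out) * i_load  # watts lost as heat
    efficiency = v_out / v_in               # ideal-case efficiency
    return p_dissipated, efficiency

# Regulating a 5 V rail down to 3.3 V at 500 mA:
p, eff = linear_regulator_stats(5.0, 3.3, 0.5)
print(f"{p:.2f} W dissipated, {eff:.0%} efficient")  # 0.85 W dissipated, 66% efficient
```

This is why an LDO, which tolerates a smaller input-output differential, can run more efficiently than a Standard regulator in the same application.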
A switching regulator converts the DC input voltage to a switched voltage applied to a power MOSFET or BJT switch. The filtered power switch output voltage is fed back to a circuit that controls the power switch on and off times so that the output voltage remains constant regardless of input voltage or load current changes. Typical topologies for switching regulators include the buck, which converts a higher input voltage into a lower output voltage and a boost which converts a lower input voltage into a higher output voltage.
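For intuition, the ideal (lossless, continuous-conduction) transfer functions of these two topologies are Vout = D·Vin for a buck and Vout = Vin/(1−D) for a boost, where D is the fraction of each cycle that the power switch is on. A small sketch with illustrative values, not tied to any specific regulator IC:

```python
def buck_vout(v_in: float, duty: float) -> float:
    """Ideal buck converter output: steps the input voltage down."""
    return v_in * duty

def boost_vout(v_in: float, duty: float) -> float:
    """Ideal boost converter output: steps the input voltage up."""
    return v_in / (1.0 - duty)

# At 50% duty cycle a buck halves 12 V to 6 V,
# while a boost doubles a 3.7 V Li-ion cell voltage to 7.4 V.
print(buck_vout(12.0, 0.5))   # 6.0
print(boost_vout(3.7, 0.5))   # 7.4
```

In a real regulator the control loop adjusts D continuously so the output stays fixed as the input voltage or load current changes.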
The following table taken from Maxim Integrated Application note 751 compares Linear with Switching regulators:
In this short article, different battery technologies were compared and the basic charging principle was explained. We also talked about power regulation and the two main ways to regulate a voltage source in a circuit. The next article will focus on an example of how to design a simple battery charger and power regulation circuit for a portable product.
If you liked what you read please share your email address with me and I will keep sending you special content to help you take your design from concept to product 🙂
C. Simpson, “Linear and Switching Voltage Regulator Fundamentals,” Application Note SNVA558, Texas Instruments
Analog Devices, “Understanding How a Voltage Regulator Works,” 2009
Maxim Integrated, “Application Note 751: Linear Regulators in Portable Applications,” 2002