Current Monitoring Devices for Battery Management Systems in Electric Vehicles

A battery management system (BMS) is an integral component of a modern battery pack. A BMS helps optimize a battery pack's running conditions to maximize performance and safety by monitoring variables such as temperature, voltage, and current. Here, we will explore why current monitoring is an important parameter in battery pack safety and go over a couple of methods to see how they compare for applications in electric vehicles (EVs).

The flow of electricity in and out of a battery pack is an important parameter when considering the safety of the pack. As Davide Andrea put it in “Lithium-Ion Batteries and Applications”, a BMS that is not allowed to interrupt the battery current, either directly or indirectly, does not actually protect its battery and is therefore pointless. When looking at lithium-ion batteries (LIBs), we must consider that lithium-ion cells have a certain window of safe operating voltage. Exceeding the voltage limits of a lithium-ion cell in either direction leads at best to cell degradation and at worst to a compromised pack.

The voltage of a lithium-ion cell depends on which electrode is housing the lithium and how much lithium is stored there. Because of the time it takes lithium to move through the electrolyte and the electrodes, the instantaneous voltage of a cell while charging or discharging does not reflect the actual state of the cell. When charging a cell, for example, if we apply a current until the cell reaches 4.3 V and then stop the current, we will see the cell potential relax below 4.3 V. This effect is known as voltage hysteresis, and it is a good example of how a BMS could benefit from considering more variables than pack voltage alone.
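This relaxation can be sketched with a simple series-resistance cell model: the terminal voltage reads high by I×R while charging and falls back to the open-circuit voltage at rest. All values below, including the 50 mΩ internal resistance, are hypothetical.

```python
# Minimal series-resistance cell model: the terminal voltage during charge
# overshoots the open-circuit voltage (OCV) by I*R, so the cell "relaxes"
# below the charge cutoff once the current stops. Values are illustrative.

def terminal_voltage(ocv, current, r_internal):
    """Terminal voltage of a cell modeled as OCV + I*R.

    current > 0 means charging, so the terminal voltage reads high;
    current < 0 (discharging) reads low.
    """
    return ocv + current * r_internal

# Hypothetical cell: OCV of 4.20 V, 50 mOhm internal resistance, 2 A charge.
ocv = 4.20
r = 0.050
i_charge = 2.0

v_charging = terminal_voltage(ocv, i_charge, r)  # reads 4.30 V at terminals
v_rested = terminal_voltage(ocv, 0.0, r)         # relaxes back to 4.20 V

print(f"while charging: {v_charging:.2f} V")  # 4.30 V
print(f"after resting:  {v_rested:.2f} V")    # 4.20 V
```

This is why a cutoff based on instantaneous terminal voltage alone stops the charge early: the cell's true state of charge corresponds to the rested voltage, not the reading under load.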

A proper BMS is required to know the ranges of voltage, current, and temperature necessary for optimal cell performance; these ranges differ greatly with the choice of cell chemistry. In our charging example above, a proper BMS would account for voltage hysteresis and allow more charge to be stored before stopping the current. In the case of EVs, this more careful handling of cycling would result in more range from a LIB of the same health.
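One common way chargers realize this idea is constant-current/constant-voltage (CC-CV) charging: rather than cutting off the first moment the terminal voltage hits the limit, the voltage is held at the limit while the current tapers. A rough sketch under strong simplifying assumptions: a hypothetical 2 Ah cell, the same series-resistance model as before, and a deliberately simplified linear OCV curve.

```python
# Illustrative CC-CV charge loop using a simple OCV + I*R cell model.
# Instead of stopping the first time the terminal voltage hits the limit,
# the constant-voltage phase holds the limit and lets the current taper,
# so more charge is stored. All numbers are hypothetical.

def cc_cv_charge(capacity_ah, r_internal, v_limit, ocv_of_soc,
                 i_cc=2.0, i_cutoff=0.1, dt_h=0.01):
    """Return the total charge (Ah) delivered by a simple CC-CV loop."""
    soc = 0.0         # state of charge, 0..1
    charged_ah = 0.0
    while True:
        ocv = ocv_of_soc(soc)
        # Constant current until the terminal voltage (OCV + I*R) would
        # exceed v_limit; after that, the current is whatever holds the
        # terminal voltage exactly at v_limit.
        i = min(i_cc, (v_limit - ocv) / r_internal)
        if i < i_cutoff:          # taper finished: charge complete
            break
        charged_ah += i * dt_h
        soc = min(1.0, soc + i * dt_h / capacity_ah)
    return charged_ah

# Hypothetical linear OCV curve from 3.0 V (empty) to 4.2 V (full).
ocv_linear = lambda soc: 3.0 + 1.2 * soc

total = cc_cv_charge(capacity_ah=2.0, r_internal=0.05, v_limit=4.2,
                     ocv_of_soc=ocv_linear)
print(f"charge delivered: {total:.2f} Ah")
```

With these toy numbers the loop delivers close to the full 2 Ah, whereas stopping at the first 4.2 V reading would have left the taper charge on the table.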

Another safety consideration related to monitoring current is the cell's operating temperature range. Extreme temperatures can lead to rapid cell failure or even thermal runaway, which can be catastrophic. Lithium-ion cells have inherent internal impedance, so moving current through a cell results in Joule heating. This Joule heating, also known as resistive or Ohmic heating, means that cells heat up during charging and discharging.
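The scale of this heating can be estimated from I²R and the cell's heat capacity. A back-of-the-envelope sketch with hypothetical cell parameters (30 mΩ impedance, 45 g mass, a specific heat of roughly 900 J/(kg·K)), ignoring all cooling:

```python
# Back-of-the-envelope Joule (I^2 * R) heating of a cell. All values are
# illustrative, not taken from any datasheet.

def joule_power(current_a, r_internal_ohm):
    """Heat generated inside the cell, in watts."""
    return current_a ** 2 * r_internal_ohm

def temp_rise_rate(power_w, mass_kg, specific_heat_j_per_kg_k):
    """Adiabatic temperature rise rate in K/s (ignores all cooling)."""
    return power_w / (mass_kg * specific_heat_j_per_kg_k)

# Hypothetical 18650-class cell: 30 mOhm, 45 g, c_p ~ 900 J/(kg*K).
p = joule_power(current_a=10.0, r_internal_ohm=0.030)   # 3.0 W
rate = temp_rise_rate(p, mass_kg=0.045, specific_heat_j_per_kg_k=900.0)

print(f"heat: {p:.1f} W, rise: {rate * 60:.1f} K/min")  # ~3.0 W, ~4.4 K/min
```

Because power scales with the square of current, doubling the current quadruples the heat, which is why current limits matter so much for thermal safety.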

This heating is of course not limited to the individual cell, and it therefore spreads throughout the cells in a battery pack. Different cells are rated to handle different levels of current, and aside from the manufacturer's rated limits, we must consider the heating from running the cells at different currents. A proper BMS needs to be able to vary the current limits based on manufacturing specifications and running conditions.
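Such variable limits are often implemented as a derating rule over temperature. A minimal sketch of this idea, with hypothetical breakpoints and a hypothetical 50 A rated charge limit (a real BMS would use the cell datasheet's values and typically smoother interpolation):

```python
# Sketch of a temperature-based current-limit derating rule, the kind a
# BMS might apply on top of the manufacturer's rated limit. Breakpoints
# and the rated limit are hypothetical.

RATED_CHARGE_LIMIT_A = 50.0

def charge_current_limit(cell_temp_c, rated_a=RATED_CHARGE_LIMIT_A):
    """Allowed charge current (A) as a function of cell temperature."""
    if cell_temp_c < 0.0 or cell_temp_c > 55.0:
        return 0.0                    # outside the safe window: no charging
    if cell_temp_c < 10.0:
        return 0.2 * rated_a          # cold: heavy derating
    if cell_temp_c > 45.0:
        return 0.5 * rated_a          # hot: moderate derating
    return rated_a                    # normal range: full rated current

for t in (-5, 5, 25, 50, 60):
    print(f"{t:>3} degC -> {charge_current_limit(t):.0f} A")
```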

With these considerations in mind, where does one begin when choosing a current sensing device for a proper BMS? The accuracy and reliability of a current sensing device depend on factors such as the connections, filters, converters, and isolation from other signals. Another factor is the current sensor type itself. Here we will look at two popular types of current sensors: the Hall-effect sensor and the current shunt. Both have their benefits and disadvantages, but in the case of EVs, some applications may be better suited to one or the other.

A current shunt is essentially a resistor placed in the current path; we measure the voltage drop across it, and if the value of the resistance is known, we obtain the current from a rearrangement of Ohm’s law (V=IR).

I = V/R

The resistance of the current shunt is low, so as not to waste too much energy by running the current through it. However, this leads to the first shortcoming of the current shunt. Because the battery current runs through a resistance, we inevitably lose energy as heat from Joule heating. The energy lost is small, but in some applications it should be taken into consideration: Joule heating of the sensor can add unwanted heat to the system when using this method of current sensing.
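Both the measurement and the loss fall out of Ohm's law. A small sketch, assuming a hypothetical 100 µΩ shunt (a typical order of magnitude for high-current shunts, but not any specific part):

```python
# Shunt measurement via Ohm's law, plus the I^2 * R heat the shunt itself
# dissipates. The 100 micro-ohm value is hypothetical.

R_SHUNT_OHM = 100e-6   # 100 micro-ohms (illustrative)

def shunt_current(v_measured_v, r_shunt_ohm=R_SHUNT_OHM):
    """Pack current from the voltage drop across the shunt: I = V / R."""
    return v_measured_v / r_shunt_ohm

def shunt_loss(current_a, r_shunt_ohm=R_SHUNT_OHM):
    """Heat dissipated in the shunt itself: P = I^2 * R."""
    return current_a ** 2 * r_shunt_ohm

v_drop = 0.020                      # 20 mV measured across the shunt
i_pack = shunt_current(v_drop)      # 200 A
p_loss = shunt_loss(i_pack)         # 4 W lost as heat in the shunt

print(f"pack current: {i_pack:.0f} A, shunt loss: {p_loss:.1f} W")
```

Note the trade-off hidden in the numbers: a smaller resistance wastes less power but also shrinks the millivolt-level signal the measurement electronics must resolve.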

The accuracy of the current shunt is relatively insensitive to temperature, and one of the best properties of the current shunt is its value. Current shunts are inexpensive, simple, and come in a variety of sizes. The simplicity of the shunt gives it high reliability, and accuracy of roughly 0.1–0.5% is achievable in some cases. Another huge advantage of the current shunt’s simplicity is that it requires no external power source. This is great for EVs, since no energy is wasted powering the device.

Hall-effect sensors operate with a slightly more complex mechanism; to understand this system we must first look at the Hall effect itself. The Hall effect is the appearance of a voltage difference across an electrical conductor when a magnetic field is applied perpendicular to the current flowing through it. In a Hall-effect current sensor, the pack current generates a magnetic field proportional to that current, so measuring the Hall voltage that arises as current flows from the pack through the sensor enables accurate measurement of the pack current.
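The measurement chain is linear from pack current to magnetic field to Hall voltage, so recovering the current is a matter of dividing by a calibration constant. A sketch with hypothetical constants (neither value comes from a real part's datasheet):

```python
# Sketch of the Hall-effect measurement chain: the pack current produces a
# magnetic field proportional to it, and the Hall element converts that
# field into a small voltage. Both constants are hypothetical calibration
# values for an idealized, perfectly linear sensor.

FIELD_PER_AMP_T = 2.0e-5          # tesla of field per amp of pack current
HALL_SENSITIVITY_V_PER_T = 100.0  # volts of Hall output per tesla

def hall_output(pack_current_a):
    """Hall voltage for a given pack current (ideal linear sensor)."""
    b_field = FIELD_PER_AMP_T * pack_current_a
    return HALL_SENSITIVITY_V_PER_T * b_field

def pack_current(hall_voltage_v):
    """Invert the chain: recover pack current from the Hall voltage."""
    return hall_voltage_v / (FIELD_PER_AMP_T * HALL_SENSITIVITY_V_PER_T)

v = hall_output(150.0)            # 0.30 V for 150 A
print(f"hall output: {v:.2f} V, recovered: {pack_current(v):.0f} A")
```

In a real sensor the field-per-amp constant is set by a magnetic core around the conductor, which concentrates the field onto the Hall element.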

The Hall-effect sensor has the advantage of being isolated from the high-current path. Because the current isn’t being forced through a device like a shunt, there isn’t significant energy loss in the form of heat. Unlike the shunt, however, the Hall-effect sensor requires external power to operate. Depending on the application, the energy cost of operating the Hall-effect sensor can still be much lower than the energy a shunt loses to Joule heating.

The design of the Hall-effect sensor comes with some unfortunate shortcomings that need to be considered. External magnetic fields can skew the results of Hall-effect sensors, so they must be shielded from outside magnetic fields during installation. The signal is also vulnerable to temperature drift, meaning that fluctuations in temperature can affect the accuracy of the measurements. These shortcomings can make the Hall-effect sensor difficult to implement in some applications and can increase the cost of installation.
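When the drift has been characterized, a first-order correction can be applied in BMS firmware. A sketch assuming a hypothetical −0.1 %/K gain drift and a 25 °C calibration point (real parts specify these coefficients, and often higher-order terms, in their datasheets):

```python
# Sketch of first-order temperature compensation for a Hall sensor's gain
# drift. The drift coefficient and reference temperature are assumptions
# for illustration, not values from any real part.

GAIN_DRIFT_PER_K = -0.001   # -0.1 % gain change per kelvin (assumed)
T_REF_C = 25.0              # temperature at which the sensor was calibrated

def compensate(raw_current_a, sensor_temp_c):
    """Correct a raw Hall reading for linear gain drift with temperature."""
    gain = 1.0 + GAIN_DRIFT_PER_K * (sensor_temp_c - T_REF_C)
    return raw_current_a / gain

# A true 100 A current read at 45 degC appears as 98 A under this drift
# model; dividing out the gain recovers the true value.
raw = 100.0 * (1.0 + GAIN_DRIFT_PER_K * (45.0 - T_REF_C))   # 98.0 A
print(f"raw: {raw:.1f} A, compensated: {compensate(raw, 45.0):.1f} A")
```

This only works as well as the drift model: it requires a temperature sensor near the Hall element and a well-characterized coefficient, which is part of the installation cost mentioned above.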

There are two main types of Hall-effect sensors: the open-loop and closed-loop varieties. In the closed-loop design, an additional compensation coil surrounds the magnetic core. Closed-loop sensors are larger, more expensive, and use much more power, but they are more accurate, quicker to react, and less prone to magnetic drift. In automotive systems, both variations could fit certain applications; however, open-loop sensors are frequently employed because automotive applications are cost-sensitive.

As discussed above, accurate and reliable current sensing is critical for the utilization of LIBs in EVs. Both of the current sensor types discussed come with their own lists of advantages and disadvantages, which means that in the EV field there is no “one size fits all” current sensing solution. It is important to exercise care when deciding which current sensing technology to use in a given application. Numerous factors need to be taken into account, including cost, the operating environment, and the accuracy needed.