By Chris Turner, Director of Battery Technology and Lon Schneider, Director of Electrical Engineering
OEMs and design engineers commonly treat the power supply, battery pack, and charger as independent subsystems rather than incorporating them into the design process as a unified system. This approach to product development often creates challenges that lead to larger, less efficient rechargeable subsystems. These subsystems are interdependent, and developing them as a single system reduces complexity, supports use of the latest technologies, maximizes performance, reduces product development risk, addresses regulatory considerations early in development, and increases speed to market.
Additionally, the total system approach creates opportunities to design products that deliver maximum power and efficiency within tight sizing requirements, while still allowing the architecture to be simplified by integrating battery and load management functions into a single microcontroller.
Total System Approach: Data Server/Storage Industry
The total system approach to design is critical for the IT industry as industry needs and regulations demand increasingly scalable, efficient, and reliable power system solutions. It provides engineering teams with the freedom to adapt their solutions to the needs of the application. Data storage and server applications are examples of applying the total system development approach. Not only are there power system related concerns, but also the interaction with the host device is critical to meeting the long-term needs of these applications.
Many data-storage and server applications share fundamental requirements: they are backup applications in which the battery is cycled infrequently and, as a result, is held near full charge (high voltage) throughout its life.
Additionally, the server environment is known for its steadily elevated temperatures (ranging from 35°C to 55°C). The combination of high voltage and elevated temperature is detrimental to the life of a lithium-ion battery and can cause rapid degradation. Add to this the data-storage-server market's expectation of at least three years of service from the battery before replacement; in some cases, up to five years is being requested. A total system approach is necessary to mitigate this degradation over time and optimize the battery for long calendar life.
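The effect of temperature on calendar life can be illustrated with the common rule of thumb that degradation rate roughly doubles for every 10°C rise. This is only a first-order sketch; real cells require measured aging data from the vendor or long-term testing, and the function below is a hypothetical illustration rather than a predictive model.

```python
# Illustrative calendar-aging comparison using the rough rule of thumb
# that lithium-ion degradation rate doubles for every 10 degC increase.
# Not a substitute for measured cell data.

def relative_aging_rate(temp_c: float, ref_c: float = 25.0) -> float:
    """Aging rate relative to a room-temperature (25 degC) baseline."""
    return 2.0 ** ((temp_c - ref_c) / 10.0)

# Under this rule, a battery held at a 45 degC server ambient ages
# roughly four times faster than one held at 25 degC.
```

This kind of back-of-envelope comparison is why the 35°C to 55°C server environment, combined with a three-to-five-year life target, forces the mitigation techniques described below.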
Cell chemistry selection is the first important consideration in these applications, and making that decision requires data from months, if not years, of testing. Certain lithium-ion chemistries such as lithium iron phosphate (LFP) have proven very durable in data storage environments, and LFP is becoming the de facto choice for many data server applications. Other, more conventional lithium-ion variants have also proven durable, and specific knowledge of them is critical to meeting the needs of this market. The choice between conventional lithium-ion and LFP is often driven by the discharge-rate needs of the data storage device, with higher power requirements typically leading to LFP.
From the system design point of view, charge management techniques are also very important. In these applications, the charge voltage should be lowered to extend the life, but the frequency of re-charge also needs to be seriously considered. The decisions on recharge frequency are made in concert with the customer’s host device development team to ensure that the runtime needs of the device are met should the AC power temporarily be lost.
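A charge-management policy of this kind can be sketched as a simple decision function. The voltage and state-of-charge thresholds below are hypothetical examples chosen for illustration, not vendor recommendations; real values come from cell qualification data and the host device's runtime requirements.

```python
# Sketch of a life-extending charge policy for a backup battery.
# All thresholds are hypothetical illustration values.

REDUCED_CHARGE_V = 4.05        # per-cell target, lowered from a 4.20 V full
                               # charge to extend calendar life
RECHARGE_THRESHOLD_SOC = 0.90  # recharge only after self-discharge drops
                               # the pack below this state of charge

def charger_action(soc: float, ac_present: bool) -> str:
    """Decide charger behavior from state of charge and AC status."""
    if not ac_present:
        return "discharge"     # battery is backing up the load
    if soc < RECHARGE_THRESHOLD_SOC:
        return f"charge to {REDUCED_CHARGE_V} V/cell"
    return "rest"              # avoid holding the cell at high voltage
```

The recharge threshold is the design lever negotiated with the host device team: a lower threshold further reduces time spent at high voltage, but shrinks the guaranteed runtime if AC power fails just before a recharge.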
Fuel gauge development is also critical to proper operation once the battery is installed. Backup applications are particularly difficult for fuel gauges: any fuel gauge requires some discharge to occur over time so that it can re-evaluate the parameters it uses to determine state of charge and state of health. The chemistry, whether conventional lithium-ion or LFP, can further complicate the fuel gauge's work.
The typical method for dealing with this is a relearn cycle at a set interval (usually every few months). While it is not necessary to fully discharge the battery to perform this relearn cycle, depending on the fuel gauge and chemistry, a discharge of as much as 30 to 60% may be required. The concern is that the battery could be in an unacceptably depleted state should a power outage occur at the host device, so this requires close interaction with the customer's team during development.
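The scheduling trade-off described above can be sketched as a guard function: a relearn discharge starts only when it is due and when the remaining reserve would still cover an outage. The interval, discharge depth, and reserve floor below are illustrative assumptions; real limits come from the gauge datasheet, the cell chemistry, and the host's runtime requirement.

```python
# Sketch of a fuel-gauge relearn scheduler. All limits are
# hypothetical illustration values, not datasheet figures.

RELEARN_INTERVAL_DAYS = 90  # assumed relearn period (a few months)
RELEARN_DEPTH = 0.40        # assumed SOC drop needed by the gauge
MIN_RESERVE_SOC = 0.60      # assumed floor that still covers an outage

def should_start_relearn(days_since_last: int, soc: float,
                         outage_risk_high: bool) -> bool:
    """Start a relearn discharge only when it is due and safe."""
    if days_since_last < RELEARN_INTERVAL_DAYS:
        return False                 # not due yet
    if outage_risk_high:
        return False                 # host reports elevated outage risk
    # Require that the post-relearn SOC still meets the reserve floor.
    return soc - RELEARN_DEPTH >= MIN_RESERVE_SOC
```

In practice the host device supplies the outage-risk signal and the reserve requirement, which is why this logic has to be developed jointly with the customer's team.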
Thermal issues, particularly in high-power server applications, are another area of concern. Thermal modeling and cooling studies are employed very early in the project to ensure proper airflow and cooling of the backup system (battery) and to reduce thermal gradients across the battery. With elevated ambient temperatures and high discharge rates, temperatures can become excessive without proper cooling. Early in the project, thermal studies are done with the customer to ensure the host device design provides adequate cooling. These studies can result in changes both electrical (e.g., relocating FETs and other heat-producing components) and mechanical (altering the battery pack to allow better airflow).
Backup systems must deal with the issue of the voltage range at which energy is stored versus the voltage range needed to back up the system. The first approach that comes to mind is often to include a switch-mode regulator to convert the battery voltage to a level that can power the device. That can mean bucking or boosting (lowering or raising) the battery voltage to the needed rail level, or, if the battery's voltage range straddles the needed input level, applying a buck-boost regulator. However, even though efficiency levels for these regulators can be relatively high, they still generate heat and waste battery power.
Where some backup systems must supply many hundreds of watts during a power failure, such losses are not insignificant. Instead, by selecting the optimum type and arrangement of cells, the need for a regulator may be eliminated: battery power can be directly “OR-ed” or switched into the power input stage when AC fails. “Ideal diode” OR-ing circuits can greatly reduce the dissipation of what would otherwise be a high-power OR-ing diode.
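The topology decision above can be expressed as a comparison between the battery stack's discharge window and the backed-up rail's tolerance band. The per-cell voltages and tolerance below are typical illustrative values for conventional lithium-ion, not figures from the article.

```python
# Sketch of selecting a backup power-path topology from the battery
# stack's voltage window versus the backed-up rail. Cell voltages and
# the rail tolerance are illustrative assumptions.

CELL_V_MIN, CELL_V_MAX = 3.0, 4.2   # assumed per-cell discharge window

def select_topology(series_cells: int, rail_v: float,
                    rail_tolerance: float = 0.05) -> str:
    """Pick buck, boost, buck-boost, or direct OR-ing for the backup path."""
    v_min = series_cells * CELL_V_MIN
    v_max = series_cells * CELL_V_MAX
    lo, hi = rail_v * (1 - rail_tolerance), rail_v * (1 + rail_tolerance)
    if lo <= v_min and v_max <= hi:
        # Whole discharge window fits inside the rail tolerance:
        # no regulator needed, just an ideal-diode OR-ing stage.
        return "direct OR-ing"
    if v_min > hi:
        return "buck"        # battery is always above the rail
    if v_max < lo:
        return "boost"       # battery is always below the rail
    return "buck-boost"      # battery window straddles the rail
```

The "direct OR-ing" branch is the regulator-free case the text describes: the cell count and chemistry are chosen so the discharge window sits inside the rail tolerance, and an ideal-diode controller handles the switchover when AC fails.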
The Role of Lithium-Ion
Lithium-ion batteries have become the preferred power source for IT backup systems due to their small size and light weight. With a higher energy density than older chemistries and a low self-discharge rate, lithium-ion battery packs have revolutionized the design of electronic devices that were previously restricted by power source size, weight, and run-time limitations.
By leveraging this technology in a total system design, engineers can meet the demands typical of data storage and server applications, which require the battery to be kept at or near full charge, at elevated temperatures, for extended periods.
As regulations and requirements become ever more stringent, OEMs and design engineers will need to innovate and create solutions that address the evolving needs of the data server industry. To do so, they will want to consider the power supply, battery pack, and charger as interrelated subsystems working together. As charging technologies continue to evolve rapidly, any other approach can lead to costly development and performance issues.
Chris Turner is the director of battery technology at ICCNexergy. His associate Lon Schneider is the director of electrical engineering.