
Embedded power management: making your battery last 3 years and beyond

Optimise IoT battery life: energy budget, sleep modes, DCDC vs LDO, battery chemistry. Practical guide by AESTECHNO Montpellier, electronic design house.

Embedded power management brings together battery chemistry, regulator choice, MCU sleep modes and low-power firmware to deliver 3 to 10 years of battery life. At AESTECHNO, an electronic design house based in Montpellier, we size the energy budget from the technical specification stage and validate at the bench (Nordic PPK2 from 200 nA to 1 A, Keithley DMM7510 7.5 digits), because an extra 2 µA of sleep current can halve real-world autonomy.

Key takeaways

  • Sleep current targets: below 1 µA for any 3-year goal, below 300 nA for 10-year industrial sensors. A Low Dropout regulator (LDO) with 50 µA quiescent current kills a 3 µA MCU budget before firmware runs.
  • Regulator choice: a Direct Current to Direct Current (DCDC) buck hits 85 to 95% efficiency, an LDO hits V_out / V_in (78% for 3.3 V out of 4.2 V in). Hybrid DCDC + LDO post-regulation is the reference topology.
  • Battery chemistry per IEC 61960 and IEC 60086: Lithium Thionyl Chloride (LiSOCl2) at 500 Wh/kg for -55 to +85 °C industrial, Lithium Iron Phosphate (LiFePO4) at 120 Wh/kg for 2000 to 5000 cycles, Li-ion Nickel Manganese Cobalt (NMC) at 250 Wh/kg for rechargeable wearables.
  • MCU floor (datasheet): Nordic nRF52832 System OFF 0.4 µA, STMicroelectronics STM32L4 Stop 2 at 280 nA, Nordic nRF54L15 System OFF at 150 nA, Espressif ESP32 deep-sleep 5 µA.
  • Validation tooling: Nordic PPK2 for 200 nA to 1 A dynamic profiles, Keithley DMM7510 for sub-nA quiescent floor. A soak test of several weeks closes the loop against the calculated budget.
  • Firmware lever: moving a BLE advertising interval from 100 ms to 1000 ms divides radio consumption by 10x. Duty cycling almost always beats MCU choice on total autonomy.


Battery life is not a guess: it is calculated, measured, and validated. A 2 µA gap in sleep mode can be the difference between a product that lasts 5 years and one you replace after 18 months. This discipline relies on public battery-chemistry standards, notably IEC 61960 (Li-ion cells) and IEC 60086 (primary cells), which define nominal capacity, temperature range, and self-discharge. This article condenses 10+ years of experience into a practical guide for any technical director designing a battery-powered embedded product.

Our project portfolio covers a wide energy spectrum: products built around batteries and BMS (Battery Management Systems), from coin-cell wearables (225 mAh) to high-capacity lithium-ion packs with active cell balancing. Our expertise spans low-power supplies (quiescent currents below 60 nA) up to high-voltage (10 kV) and high-power supplies, which gives us a cross-cutting view of conversion, isolation and safety constraints.

Why trust AESTECHNO?

  • 10+ years of expertise in low-power electronic design
  • Battery-powered IoT products in production with multi-year service life
  • Mastery of nRF, STM32 and NXP platforms in deep-sleep modes
  • Real-world bench measurement with Power Profiler and precision analysers
  • French design house based in Montpellier, hardware and firmware

Article written by Hugues Orgitello, electronic design engineer and founder of AESTECHNO.

The energy budget: the method that drives every other choice

The energy budget is the foundational calculation that determines a product’s battery life. It models the system’s full consumption profile (active current, sleep current, duration and frequency of each phase), then matches it against the battery’s real capacity, factoring in temperature margins, self-discharge and ageing.

The principle is simple:

  • Active current × active duration per cycle: the energy consumed during wake, measure, compute and transmit phases
  • Sleep current × sleep duration: the energy consumed while the system sleeps between cycles
  • Average current = weighted sum: Iavg = (Iactive × tactive + Isleep × tsleep) / (tactive + tsleep)
  • Battery life = battery capacity / average current: C (mAh) / Iavg (mA) = service life (hours)

Concrete example from a recent client project: an IoT sensor waking every 10 minutes, 200 ms active at 15 mA, 599.8 s asleep at 3 µA. Charge per cycle = 1.33 µAh; with a 600 s period, average current ≈ 8 µA. With an ER14250 (1,200 mAh): 150,000 h ≈ 17 theoretical years.

Margins to factor in: self-discharge (~1 %/year for LiSOCl2 → -10 % over 10 years), -20 °C derating (-30 to -50 % on lithium thionyl), ageing (rising internal resistance), and an overall safety factor of 0.6 to 0.7 against nominal capacity. With a 0.65 factor, our sensor lasts ~11 years. But going from 3 µA to 30 µA of sleep current (a misconfigured GPIO, a high-quiescent regulator) drops real-world life to 2.3 years, which is why every microamp matters.
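Applied to the same 1,200 mAh / 8 µA example, the safety factor can be sketched as follows (a rough model: 35 µA approximates the average current once sleep rises to 30 µA, and further cold-temperature derating would pull the second figure lower still):

```python
# Sketch: apply the overall safety factor to nominal capacity.
SAFETY_FACTOR = 0.65   # global margin against nominal capacity

def derated_life_years(capacity_mah, i_avg_ua, safety=SAFETY_FACTOR):
    usable_uah = capacity_mah * safety * 1000   # mAh → µAh
    return usable_uah / i_avg_ua / (24 * 365)   # hours → years

print(derated_life_years(1200, 8))    # ≈ 11 years with the 0.65 factor
print(derated_life_years(1200, 35))   # ~30 µA sleep: only a few years left
```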

Sleep modes: from Idle to System OFF

The microcontroller’s sleep modes determine the system’s baseline current when it does nothing, that is, 99 % of the time in a typical IoT product. Understanding the differences between Idle, Deep Sleep, System OFF and Hibernate is essential to size the energy budget correctly and pick the right wake-latency vs. consumption trade-off.

Idle / Sleep (~100 µA): CPU stopped but peripherals and HF clocks still active. Wake is near-instant (µs). Insufficient for long battery life.

Deep Sleep / System ON (~1-10 µA): RAM retained, RTC active, fast clocks off. Wake in hundreds of µs. Firmware resumes where it left off, the most common mode for IoT sensors.

System OFF / Shutdown (~0.3-1 µA): everything off except the wake source (RTC, GPIO, watchdog). RAM is lost → reset on wake. On the nRF52832, the ratio between System ON (1.5 µA) and System OFF (0.3 µA) is 5×.

Hibernate: the deepest mode on recent MCUs (nRF54, STM32U5, NXP LPC55), with current under 100 nA and only a few wake-capable pins. For products that sleep for hours or days between activations.

Datasheet figures for low-power MCUs:

  • The nRF52832, according to Nordic Semiconductor datasheet (nRF52 product line), draws 0.4 µA typ. in System OFF with no RAM retention, 1.5 µA with 16 KB RAM retention
  • The STM32L4 Stop 2 mode, per STMicroelectronics datasheet DS11453 (STM32L4 product page), draws 280 nA typ. at 25 °C with active Real-Time Clock (RTC)
  • The ESP32 deep-sleep floor, according to Espressif technical reference, is 5 µA typ. with RTC timer (high floor due to integrated Wi-Fi/BLE System on Chip (SoC))
  • The nRF54L15, according to Nordic Semiconductor product brief, reaches 150 nA in System OFF with RAM retention (2024 generation)

This dispersion explains why the MCU choice is the first lever for autonomy on a 24/7 connected product: an ESP32 at 5 µA in sleep consumes nearly 18× more than an STM32L4 at 280 nA. Over a 10-minute duty cycle, the cumulative gap maps directly onto the energy budget.

Wake sources:

  • RTC timer: scheduled periodic wake, the most common source for duty cycling
  • GPIO interrupt: wake on external event (button press, sensor signal, motion detection)
  • Watchdog: safety wake on system lock-up

The critical point: the gap between System ON and System OFF is often a 10× factor in sleep current. For a product that sleeps 99.97 % of the time (waking every 10 minutes for 200 ms), it is this sleep consumption that dominates the energy budget. Going from 3 µA to 0.3 µA of sleep current can double real-world autonomy.
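How much the sleep floor matters depends on the wake period. The sketch below assumes the same 200 ms / 15 mA wake burst as the budget example; with an hourly wake, dropping sleep from 3 µA to 0.3 µA more than triples autonomy:

```python
# Sketch: average current as a function of the sleep floor.
# Assumed wake burst: 200 ms at 15 mA (15,000 µA), as above.
def i_avg_ua(period_s, i_sleep_ua, t_active_s=0.2, i_active_ua=15000):
    charge_uc = i_active_ua * t_active_s + i_sleep_ua * (period_s - t_active_s)
    return charge_uc / period_s

print(i_avg_ua(3600, 3.0))   # hourly wake, 3 µA sleep  → ≈ 3.8 µA
print(i_avg_ua(3600, 0.3))   # hourly wake, 0.3 µA sleep → ≈ 1.1 µA
```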

Voltage regulators: DCDC vs LDO

A voltage regulator is the analogue block that turns raw battery voltage into a stable rail for the MCU and peripherals, and its topology alone can dictate 20 to 40 % of the total energy budget: a poor regulator choice wastes that share of the available energy before the microcontroller does anything. The choice between LDO and DCDC, or a combination of the two, depends on target efficiency, tolerable noise, and load profile.

LDO (Low Dropout): simple, quiet (no switching), efficiency = Vout/Vin. With a 3.3 V output from a 4.2 V LiPo, efficiency = 78 % (22 % dissipated as heat). Quiescent current typically 1 to 50 µA. Ideal for sensitive analogue and RF circuits.

DCDC buck: 85 to 95 % efficiency over a wide Vin range. Generates switching ripple (analogue/radio disturbance), needs an inductor, capacitors, and a careful layout. Mandatory when Vin-Vout is large (3.6 V cell → 1.8 V MCU). DCDC boost: needed when Vbatt drops below VMCU (end-of-life lithium cell at 1.2 V → 1.8 V MCU); extracts the full battery capacity.

Hybrid approach (our recommendation): DCDC on the main rail (maximum efficiency) + LDO post-regulation on analogue/RF blocks (the LDO filters the DCDC ripple).
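The efficiency gap translates directly into battery current. A minimal sketch, assuming a 10 mA load at 3.3 V from a 4.2 V cell, 90 % DCDC efficiency and 5 µA LDO quiescent current (illustrative figures, not a specific part):

```python
# Sketch: battery-side current for the same load through each topology.
V_IN, V_OUT, I_LOAD_MA = 4.2, 3.3, 10.0

# LDO: battery current equals load current plus quiescent current.
i_ldo_ma = I_LOAD_MA + 0.005
ldo_eff = V_OUT / V_IN                            # ≈ 0.79

# DCDC buck: battery current set by power balance P_in = P_out / eff.
i_dcdc_ma = (V_OUT * I_LOAD_MA) / (0.90 * V_IN)   # ≈ 8.7 mA

print(f"LDO {i_ldo_ma:.2f} mA vs DCDC {i_dcdc_ma:.2f} mA from the battery")
```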

PWM vs PFM: modern DCDC regulators offer two operating modes. In PWM (Pulse Width Modulation), the switching frequency is fixed, good for high loads and radio operation (predictable spectrum). In PFM (Pulse Frequency Modulation), the regulator only switches when needed, excellent efficiency at very low load, but unpredictable noise spectrum. The optimal strategy: PFM in sleep, automatic switch to PWM during radio bursts.

Ultra-low quiescent current: latest-generation regulators reach remarkable quiescent currents. The TPS62840, according to Texas Instruments datasheet SLVSDQ3 (TPS62840 product page), draws less than 60 nA at rest. The MAX38640, per Analog Devices product specifications (MAX38640 datasheet), goes below 330 nA. At these levels, the regulator itself contributes almost nothing to the sleep energy budget, the MCU and sensors dominate.

Battery chemistry: Li-ion vs LiFePO4 vs LiSOCl2

Battery chemistry is the electrochemical formulation that sets the upper bound on battery life: it determines not only the available energy capacity, but also operating voltage, pulsed current capability, temperature range and shelf life. Each chemistry comes with specific trade-offs, and a poor choice can drastically cut real-world autonomy versus the theoretical numbers.

Chemistry comparison (gravimetric energy density per IEC 61960 / IEC 60086):

| Chemistry | V nom. | Energy density | Cycles / life | Temperature range | Typical use |
|---|---|---|---|---|---|
| Li-ion (NMC / NCA) | 3.7 V | ~250 Wh/kg | 500 to 1,000 cycles | -20 to +60 °C | USB-C PD rechargeable wearables |
| LiFePO4 | 3.2 V | ~120 Wh/kg | 2,000 to 5,000 cycles | -20 to +60 °C | Long-life stationary industrial |
| LiSOCl2 (thionyl) | 3.6 V | ~500 Wh/kg | Non-rechargeable, ~1 %/yr self-discharge | -55 to +85 °C | 10+ year industrial meters and sensors |
| Alkaline (Zn-MnO2) | 1.5 V | ~150 Wh/kg | Non-rechargeable, ~2 %/yr self-discharge | -20 to +54 °C | Low cost, unstable under pulsed load |

Common formats: CR2032 (225 mAh, 3.0 V, pulse limited to 15-20 mA, 100-470 µF cap mandatory for BLE/LoRa), ER14250 half-AA (1,200 mAh, 3.6 V, -55/+85 °C, passivation effect to manage at brown-out), ER14505 AA (2,600 mAh, standard for industrial meters/sensors), AA Energizer L91 lithium iron disulfide (~3,000 mAh, 1.5 V, peaks of hundreds of mA), rechargeable LiPo (with BQ25170 or MCP73831 + over-charge/deep-discharge protection). Fuel gauges like MAX17048 or BQ27441 are now standard on any rechargeable pack to estimate state of charge to the millivolt.

Primary vs rechargeable: for an IoT product expected to last 3 to 10 years without intervention, primary cells remain the most reliable choice (no charging circuit, no cyclic degradation). Rechargeables only make sense if the product has an ambient energy source or regular user access. The IEEE 1725 and IEEE 1625 test frameworks for mobile and portable cells are a useful reference point when writing acceptance criteria for rechargeable pack validation.

Temperature derating: the most frequent trap. At -20 °C, a lithium thionyl chloride cell loses 30 to 50 % of its nominal capacity. A CR2032 can lose up to 70 % at -30 °C. If your product is deployed outdoors or in cold storage, the energy budget must be calculated at the minimum operating temperature, not at 25 °C.
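Recomputing the budget at the minimum operating temperature can be sketched as follows, taking the worst-case ends of the derating ranges quoted above as illustrative factors:

```python
# Sketch: usable capacity at cold temperature vs the 25 °C figure.
DERATING = {"LiSOCl2@-20C": 0.50, "CR2032@-30C": 0.30}  # worst-case factors

def cold_life_years(capacity_mah, i_avg_ua, factor):
    return capacity_mah * factor * 1000 / i_avg_ua / (24 * 365)

# ER14250 at -20 °C with an 8 µA average: the 17 theoretical years shrink.
print(cold_life_years(1200, 8, DERATING["LiSOCl2@-20C"]))  # ≈ 8.6 years
```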

Firmware optimisation: the techniques that matter

Firmware-level power optimisation is the set of software techniques that lower average current without touching the bill of materials: aggressive duty cycling, peripheral gating, event-driven architectures and careful clock management. Firmware is the conductor of energy consumption: even with perfectly optimised hardware, badly written firmware can multiply consumption by 10 or more. It rests on one principle: minimise time spent in active mode and maximise time spent in the deepest possible sleep mode.

Duty cycling, the fundamental cycle:

The basic IoT sensor pattern is: wake → measure → process → transmit → return to sleep. Every millisecond saved in active mode translates directly into months of extra battery life. In our practice we have found that duty-cycle optimisation is often the most effective lever, far more than the choice of MCU. In our lab we profiled the same nRF52833 firmware with two scheduler implementations: the event-driven version spent 0.08% of wall-clock time active, the polled version 1.2%, a 15x gap that no hardware change can recover.

On a recent project, we integrated the low-power STMicroelectronics IIS3DWB accelerometer as the primary wake source: the MCU stays in System OFF until the sensor detects significant motion, and the IIS3DWB itself wakes the system through a hardware interrupt. This kind of “wake-on-event” architecture is a powerful lever to hit multi-year battery life in vibration-monitoring or shock-detection applications.

Peripheral power gating: physically switch sensors off between measurements (MOSFET or enable pin). A temperature/humidity sensor left powered draws 1 to 50 µA, even in “standby”, datasheets are sometimes optimistic.

BLE optimisation: moving the advertising interval from 100 ms to 1,000 ms divides radio consumption by ~10x. Tune interval, slave latency and supervision timeout. BLE 5.0’s coded PHY extends range without raising transmit power, per the Bluetooth SIG Core Specification v5.4. See our article on embedded Bluetooth BLE.
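The ~10× figure follows from a first-order model: average advertising current is charge per advertising event times event rate. The per-event charge below is illustrative, not a radio datasheet value:

```python
# Sketch: advertising cost scales with event rate.
CHARGE_PER_EVENT_UC = 15.0   # µC per 3-channel advertising burst (assumed)

def adv_avg_current_ua(interval_ms):
    events_per_s = 1000.0 / interval_ms
    return CHARGE_PER_EVENT_UC * events_per_s   # µC/s = µA

print(adv_avg_current_ua(100))    # 150 µA at a 100 ms interval
print(adv_avg_current_ua(1000))   # 15 µA at 1,000 ms → 10× less
```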

Clock management: use the 32 kHz RC oscillator whenever possible, and shut down the high-frequency crystal (16/32 MHz) after processing. On Nordic, the gap between active HFCLK and LFCLK alone is several milliamps.

Avoid busy-waiting: handle every delay through a timer + interrupt, never an active loop. Adopt an event-driven architecture (sleep → IRQ → process → sleep). Zephyr or FreeRTOS RTOSes make this approach natural (tickless idle).

Batching: the radio is the most expensive subsystem. Buffering 6 measurements in RAM and transmitting hourly cuts radio wakes by 6× without penalising effective data freshness.
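The batching gain comes from amortising the fixed per-wake radio overhead (connection setup, synchronisation) over several samples. A sketch with illustrative charge figures, not measured values:

```python
# Sketch: hourly radio charge with and without batching.
WAKE_OVERHEAD_UC = 500.0   # µC per radio wake-up (assumed)
PER_SAMPLE_UC = 50.0       # µC per transmitted measurement (assumed)

def hourly_radio_charge_uc(samples_per_hour, batch):
    wakes = samples_per_hour / batch
    return wakes * WAKE_OVERHEAD_UC + samples_per_hour * PER_SAMPLE_UC

print(hourly_radio_charge_uc(6, 1))   # 6 wakes/hour: 3,300 µC
print(hourly_radio_charge_uc(6, 6))   # 1 wake/hour:    800 µC
```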

Measurement and validation: never trust the calculation alone

Real-world measurement is the step that separates a lab prototype from a reliable field product. Theoretical calculations give an estimate; only measurement confirms that hardware and firmware behave as expected. In our lab at AESTECHNO we have seen 3x to 10x gaps between expected and measured sleep current, usually due to a misconfigured peripheral or a forgotten pull-up. On a recent project we measured a 12 µA parasitic current caused by a single pull-up left active on a debug pin; the fix cut the average current by 40% and pushed real-world battery life from 2.1 years back to the calculated 3.8 years.

Nordic Power Profiler Kit II (PPK2):

  • Real-time current measurement, from 200 nA to 1 A, with a 100 kHz sample rate
  • Full consumption-profile visualisation: every phase visible (wake, measure, transmit, return to sleep)
  • Source mode (supply) or ammeter mode (in series)
  • Reference tool for Nordic platforms, but works with any MCU

Joulescope: power analyser with 9 decades of current dynamic range, simultaneous I/V/energy measurement, data export for analysis. Ideal for long-duration measurements and complex profiles.

Tektronix Keithley DMM7510 (7.5 digits), PPK2 companion: picoamp-level current resolution, ideal for deep-sleep currents below 200 nA (the PPK2 floor). At AESTECHNO we systematically combine the PPK2 (dynamic RF profile) with the DMM7510 (quiescent floor at the pA). This dual approach prevents autonomy estimation errors caused by a poorly characterised sleep floor.

Standard multimeter limits: a 4-5 digit multimeter is too slow for pulsed currents, a 5 ms BLE burst at 15 mA is averaged with sleep and gives a misleadingly low reading. It captures neither shape nor duration of the peaks.
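The averaging effect is easy to quantify. A sketch assuming a single 5 ms burst at 15 mA per 10-minute cycle, on a 3 µA sleep floor:

```python
# Sketch: what a slow, averaging DMM reports for a pulsed profile.
BURST_MA, BURST_S = 15.0, 0.005
SLEEP_UA, PERIOD_S = 3.0, 600.0

# The burst is smeared across the whole period: the 15 mA peak vanishes.
avg_ua = (BURST_MA * 1000 * BURST_S + SLEEP_UA * (PERIOD_S - BURST_S)) / PERIOD_S
print(avg_ua)   # ≈ 3.1 µA reading, with no trace of pulse shape or duration
```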

Oscilloscope + 1-10 Ω shunt in series with the supply: excellent for characterising pulses (shape, duration, amplitude).

Long-duration soak test: run the prototype for several weeks under realistic conditions and compare measured discharge against the calculated budget; a gap above 20 % requires diagnosis before production.

The AESTECHNO approach: from specification to field validation

Energy management structures the entire project, from battery choice to firmware. At AESTECHNO we integrate the energy budget into the technical specification and validate it through real measurements before each milestone. Our approach rests on three pillars: (1) energy budget at specification time (feasibility validation before hardware design); (2) bench validation (current characterisation in each mode, comparison against the theoretical budget, corrections before the next milestone); (3) end-to-end stack mastery at our electronic design house in Montpellier, hardware, RF circuit and firmware designed together. We master ultra-low-power regulators, deep-sleep modes on nRF/STM32/NXP, and LPWAN / cellular IoT connectivity.

Should you add energy harvesting?

Energy harvesting is the family of techniques that capture ambient energy (indoor light, outdoor solar, thermal gradients, vibration, stray radio frequency) and convert it into usable electrical power, extending or replacing the battery. It is justified when the ambient source covers the average energy budget with margin: target a 3× factor on average power consumed, to absorb cloudy days, low temperatures or intensive cycles.

  • Indoor solar (LED/fluorescent ~200 lux): ~100 µW/cm² with an amorphous cell. A 500 lux office reaches 300 µW/cm². Sufficient for a BLE sensor with a 0.1 % duty cycle.
  • Outdoor solar in full sun: ~15 mW/cm² with a monocrystalline cell. Sufficient to power a LoRaWAN node with a LiFePO4 buffer battery.
  • Thermoelectric (ΔT = 5 °C): ~10-50 µW/cm² with a miniature Peltier module. Niche use: hot-pipe monitoring.
  • Piezoelectric (vibration): typically 10-100 µW on industrial structures at 50-200 Hz. Power source for a self-powered vibration sensor.

Dedicated Power Management Integrated Circuits (PMICs) handle extraction and recharging of a buffer cell (supercap or Li-ion). The BQ25570, according to Texas Instruments datasheet SLUSBH2 (BQ25570 product page), performs Maximum Power Point Tracking (MPPT) from an input as low as 330 mV, which is what turns a poorly lit indoor panel into a viable buffer charge. The LTC3588, per Analog Devices documentation (LTC3588 datasheet), integrates the piezo rectifier, dropping the external Bill of Materials (BoM) count. The e-peas AEM10941 targets indoor solar specifically. Despite the appeal of “free” power, energy harvesting adds complexity and cost: reserve it for products where battery replacement is impossible (poured concrete, implant, sensor at height) or unacceptable (10+ years of contractual autonomy).
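The 3× sizing rule can be checked in a few lines. A sketch assuming the article's 8 µA average at 3.3 V and the ~100 µW/cm² indoor figure quoted above:

```python
# Sketch: minimum harvester area for a 3× power margin.
AVG_POWER_UW = 8 * 3.3        # ≈ 26.4 µW average consumption (8 µA at 3.3 V)
HARVEST_UW_PER_CM2 = 100.0    # indoor amorphous cell at ~200 lux (assumed)
MARGIN = 3.0

needed_cm2 = AVG_POWER_UW * MARGIN / HARVEST_UW_PER_CM2
print(needed_cm2)   # ≈ 0.8 cm² of cell covers the budget with margin
```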

In short: power management in 5 principles

Embedded power management is a discipline that balances five measurable levers against one battery budget: (1) model the energy budget before picking an MCU; (2) target sleep current below 1 µA for any 3-year-or-more goal, using System OFF / hibernate depending on the MCU; (3) prefer an 85-95 %-efficient DCDC with LDO post-regulation for sensitive blocks; (4) choose chemistry by thermal constraint, LiSOCl2 (500 Wh/kg) for -55/+85 °C industrial, Li-ion (250 Wh/kg) for rechargeable wearables; (5) validate at the bench with the Nordic PPK2 (dynamic) and the Keithley DMM7510 (sub-nA floor).

At AESTECHNO, we apply this discipline from specification onward and validate every prototype at the bench before production. Theoretical calculations give a direction; only measurement validates real autonomy, and only measurement holds up against field returns after 24 months of deployment.

Need your product to last years on a battery?

Designing a battery-powered IoT device that has to deliver 3, 5, or 10 years of autonomy? Our engineers help you with:

  • Complete energy budget from the technical specification stage
  • Optimised battery, regulator and sleep-mode selection
  • Low-power hardware and firmware design
  • Bench validation through real measurements (PPK2, Joulescope)
  • Technologies: BLE, LoRaWAN, NB-IoT, LTE-M

Discuss your project, 30-minute audit (free)

FAQ: common questions about battery power management

The questions below are the ones that recur most often on client projects where battery life is a contractual requirement.

What battery life can you expect from a CR2032 in an IoT sensor?

With a 225 mAh capacity and a 10 µA average current, a CR2032 lasts about 2.5 years in theory, less in practice due to temperature derating and pulsed-current limits. For lifespans above 3 years, we recommend an ER14250 (1,200 mAh) or ER14505 (2,600 mAh) lithium thionyl chloride cell.

What is the difference between System ON and System OFF on a Nordic MCU?

In System ON with RAM retention, the nRF52832 draws about 1.5 µA: RAM is retained and the RTC keeps running. In System OFF, consumption drops to 0.3 µA, but all RAM is lost, the system reboots completely on wake. The choice depends on your needs: if firmware must resume exactly where it left off, System ON is required.

DCDC or LDO: which one for a battery-powered product?

To maximise autonomy, a DCDC buck is almost always preferable to an LDO: its 85-95 % efficiency far exceeds the LDO’s Vout/Vin ratio. However, DCDC switching noise can disturb analogue measurements and the radio. The optimal solution is often hybrid: DCDC for the main rail, LDO post-regulation for sensitive blocks.

How do you measure a sleep current of a few microamps?

A standard multimeter is insufficient for dynamic measurements. Use a Power Profiler (Nordic PPK2) or a power analyser (Joulescope) that samples at high frequency and captures the transitions between sleep and active mode. For fast pulses, an oscilloscope with a shunt resistor (1-10 Ω) lets you characterise the shape and duration of current peaks.

Does firmware really impact battery life?

Yes, often more than the hardware choice. A GPIO configured as a high output instead of a high-impedance input can add tens of µA. A peripheral not disabled in sleep, a busy-wait instead of an interrupt, over-frequent BLE advertising: each of these errors can halve autonomy or worse. Firmware is the conductor of consumption.

Can AESTECHNO design a low-power IoT product?

Yes. AESTECHNO designs battery-powered IoT products with multi-year service life, from specification to bench validation. We master nRF, STM32 and NXP platforms, ultra-low-power regulators, and BLE, LoRaWAN, NB-IoT and LTE-M protocols. Our Montpellier design house designs hardware and firmware together to optimise the system as a whole.
