Can Argon (or Xenon) Read the Battery State?

@Rftop, I stand corrected. Still missing from the docs though.

Nope, on the Boron this is not available as it has its own PMIC, so you don't need the poor-man's SoC.
That was also the reason why only the Xenon and Argon were mentioned.

@ninjatill, 10-4, I didn't mean for that to appear as if I was correcting you.

I've been performing some preliminary testing with a Boron LTE, publishing every 5 minutes on battery power, with a 2,000 mAh Li-Po and a 2 W solar panel. The SoC drops 20% overnight and recharges in ~1 hour of good sunlight.
I'm using pmic.setChargeVoltage(4208);
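
For context, a minimal sketch of how that setting might look on a Boron (this assumes the PMIC class is exposed as on the Electron; newer Device OS versions may manage the PMIC themselves, so treat it as illustrative):

```cpp
#include "Particle.h"

PMIC pmic;   // on-board power management IC on the Boron

void setup() {
    pmic.begin();
    pmic.setChargeVoltage(4208);   // raise the charge termination voltage to ~4.208 V (default is ~4.11 V)
}

void loop() {
}
```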


ā€œfuelā€ is undefined on my Boron. Do I need to #include something or do some other setup?

Yes:
Declare FuelGauge fuel; at the top of your code.
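
For anyone else hitting the same error, a minimal sketch (assuming a device with the on-board fuel gauge, i.e. a Boron or Electron) might look like:

```cpp
#include "Particle.h"

FuelGauge fuel;   // declared globally, "at the top"

void setup() {
    Serial.begin(9600);
}

void loop() {
    // getVCell() returns the battery voltage in volts, getSoC() the state of charge in percent
    Serial.printlnf("Battery: %.2f V, %.1f %%", fuel.getVCell(), fuel.getSoC());
    delay(10000);
}
```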


OK, thanks! I'm getting a number now (0.143750) but I'm not sure what to make of it. I probably should just wait for the documentation.

The FuelGauge section is now present in the Boron documentation. The documentation is exactly the same as for the Electron; it was just an oversight that it wasn't turned on for the Boron.


Does anyone have a clever little equation that I can plug the voltage into (or better, the ADC value from P0.05) and get the SoC?
J


I'd be interested in this as well. I found some info on Adafruit but no equation.

This Discharge Profile graph might help:

In my opinion, the FuelGauge is pretty accurate.
Several assumptions are made, and you would have to make the same assumptions to calculate a SoC independently.

I think the misconception comes from the firmware's default ChargeVoltage of 4.11V, chosen for various safety reasons.
The SoC shouldn't be mapped or re-calculated just because 4.11V is the Vmax selected for charging.
The SoC does a good job, as it uses the known Li-ion characteristics to approximate the available chemical energy at any given time. But 100% SoC will always require a termination voltage near 4.21V for 3.7V-nominal Li-ion types. We have the option to set the ChargeVoltage to 4.208V.

Most Li-Pos are rated by calculating the mWh from discharging from ~4.22V down to 3.2V, or sometimes 3.0V (per the individual datasheets). The mAh rating is calculated using the defined nominal voltage (usually 3.7V) and the total mWh during the discharge test.

So take the 2,000 mAh (7.4 Wh) Li-Po that we all use, for instance: we will never realistically use all 2,000 mAh, because we're not going to discharge down to 3.0V. We can consider at least 20% as "reserve" capacity if you want.
But re-calculating a different SoC (if you stick with the default 4.11V charge) will be much less accurate than just using the FuelGauge.
For instance: if the FuelGauge reports 85% SoC, then that Li-Po has approximately 85% of the maximum chemical energy remaining (i.e., what you would get if you discharged it all the way down to 3.0V). That percentage is basically independent of the mAh rating of the battery or the ChargeVoltage used.

A simplified calculation:
85% SoC reported for our 2,000 mAh Li-Po = 1,700 mAh remaining.
But we also want to subtract the 20% "reserve" that we never want to reach.
So instead of using the reported 85% SoC, reduce the SoC by 20%:
65% SoC of 2,000 mAh = 1,300 mAh
(SoC - Reserve) * Battery mAh Rating = usable mAh remaining.
That will give you more meaningful data than calculating your own SoC from the Li-Po voltage, in my humble opinion.
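
A quick sketch of that arithmetic in code, if you want it (the 2,000 mAh capacity and 20% reserve are just the example numbers above):

```cpp
#include "Particle.h"

FuelGauge fuel;

const float BATTERY_CAPACITY_MAH = 2000.0;  // rated capacity of the example Li-Po
const float RESERVE_PERCENT      = 20.0;    // "reserve" capacity we never want to dip into

// (SoC - Reserve) * Battery mAh Rating = usable mAh remaining
float usableMahRemaining() {
    float usablePercent = fuel.getSoC() - RESERVE_PERCENT;   // e.g. 85% - 20% = 65%
    if (usablePercent < 0) usablePercent = 0;
    return (usablePercent / 100.0) * BATTERY_CAPACITY_MAH;   // e.g. 0.65 * 2000 = 1300 mAh
}
```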

Again, the FuelGauge isnā€™t inaccurate. It actually does a great job.


Will FuelGauge be coming to the Argon/Xenon?
I'd be interested in reporting a rough battery level in percent rather than volts, so any formula or class that can do this would be appreciated.

There is no fuel gauge chip on the Xenon and Argon. You can, however, read the battery voltage to determine the approximate charge level:

https://docs.particle.io/reference/device-os/firmware/xenon/#battery-voltage
Particle Battery Voltage
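
For reference, the linked docs boil down to something like this on the Argon/Xenon (the 0.0011224 scale factor accounts for the on-board voltage divider; double-check it against the docs for your Device OS version):

```cpp
#include "Particle.h"

void setup() {
}

void loop() {
    // BATT is wired to the Li-Po through a voltage divider, so scale the raw ADC reading back up to volts
    float voltage = analogRead(BATT) * 0.0011224;
    Particle.publish("battery-voltage", String::format("%.2f", voltage), PRIVATE);
    delay(60000);
}
```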


You can, however, read the battery voltage to determine the approximate charge level:

Very approximate! :slight_smile:

The battery voltage vs charge level relationship is non-linear, and dependent on your battery specifications, current draw, battery age, and even the ambient temperature. See the discharge graph above as an example. Without spending a lot of time characterising all of the above for your use case, about all you can do is detect when the battery voltage drops below a certain point and trigger a low-battery notification.

If you need better than that, then you should get a model with the fuel gauge. The fuel gauge monitors actual current flow in and out of the battery over time, so it is able to report something reasonably close to true current capacity.

Thanks, wasn't aware there was a hardware component to the fuel gauge.

Unless someone sees a flaw in this approach, I was thinking of monitoring the voltage over a few cycles of charge/discharge, graphing it in Excel, and having it generate a 'best fit' formula, which would at least be accurate for my use case. So I'd label the highest voltage as 100%, the lowest as 0%, and let it map the rest. The other benefit is I'd have time data, so I could translate 3.3V to 20% to 80 min of runtime (for my specific battery/load use case).

I think you are over-complicating a battery state-of-charge function. Using simple bands (only suggestions below; see the sketch after the list) and the battery voltage measurement outlined above (you may want to smooth this through a rolling average), you can determine the following states with reasonable certainty:

  1. Charging (V > 4.3)
  2. Full Charge (V >= 4.2)
  3. Nominal (4.2 > V >= 3.5)
  4. Low/Needs Recharge (3.5 > V >= 3.4)
  5. Critical (3.4 > V)
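
A rough sketch of those bands as a function (thresholds taken straight from the list above; adjust them for your battery and feed in a smoothed voltage as suggested):

```cpp
enum BatteryState { CHARGING, FULL_CHARGE, NOMINAL, NEEDS_RECHARGE, CRITICAL };

BatteryState classifyBattery(float v) {
    if (v > 4.3)  return CHARGING;        // external power is pulling the rail above the cell voltage
    if (v >= 4.2) return FULL_CHARGE;
    if (v >= 3.5) return NOMINAL;
    if (v >= 3.4) return NEEDS_RECHARGE;
    return CRITICAL;
}
```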

Yep, the real question I wanted to answer was how long the battery lasts so I can charge it in time, i.e. in a critical state do I have minutes or hours? My test of a Xenon reading an SHT sensor every 5 s and publishing every 60 s got me 1 week of runtime with a 2,500 mAh Li-Po. The graph is pretty linear up until the last ~12 hrs, where the voltage drops dramatically. Unfortunately, at least in my conditions, it went from 3.51V (nominal) to 3.37V (critical) in ~2 hrs. I wasn't sure if I'd harm anything by letting it run until it dies, so I stopped at that point. So if I need at least 24 hrs of warning to charge the battery, my threshold would be around 3.7V. Obviously this is only relevant for my specific use case, and once Particle adds the power-saving modes I'd expect it to last much longer.

In case anyone is curious, the formula I got for converting voltage to a charge/runtime-based fraction is 1.8699 * voltage - 6.6861 (multiply by 100 for a percentage).
So a reading of 4.0 volts means I have roughly 79% of the charge left, and you could infer it has been running for about 35 hrs.
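
As a sketch, that fitted line could be used like this (the constants come from one discharge run of a specific 2,500 mAh Li-Po on a Xenon, so treat them as example values to be re-fitted for your own battery and load):

```cpp
// Fraction of charge remaining = 1.8699 * voltage - 6.6861 (best fit from one discharge log in Excel)
float chargePercentFromVoltage(float voltage) {
    float fraction = 1.8699f * voltage - 6.6861f;
    if (fraction < 0.0f) fraction = 0.0f;
    if (fraction > 1.0f) fraction = 1.0f;
    return fraction * 100.0f;   // e.g. 4.0 V -> ~79 %
}
```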

What is the current drain from the SHT sensor? Pretty small? The runtime is very much in line with a 15 mA average current draw for the Xenon: 2500 mAh / (7 days x 24 h) = 14.88 mA. Roll on sleep modes!

Yea, an average somewhere between 1.7 uA and 47 uA. I'd have to check the library to see which mode is being used to know for sure.

I did find an answer to the question, "Can anything bad happen if you let the battery die?" Yes: on my Xenon I lost everything, and it wouldn't re-connect after power was restored, so I had to go through the full setup process to restore it. It doesn't happen every time, though; I had another that ran until the battery died and did not lose its configuration info.

Would I be right to assume that the battery-reading solutions mentioned above will not work on Xenon devices? Today, with the sensors I built, I report back the voltage so I can alert when a sensor node needs new batteries.
Once deeper sleep is properly implemented, I intend to power from 2x AA and a booster.

Above where?
There are some suggestions that actually are meant for the Xenon - that's the point of this thread :wink:

But 2x AA cells and a booster will not be covered by that.