Where does that info come from?
This is what I’d read in the datasheet
Aaah, I checked the VIN specs instead of Li+
Related question: what’s the minimum Li+ then? I see the recommendation, but would 3.3V be enough to power just the Electron (it’s okay if the modem is unpowered then). That would be great as the Electron then still could take measurements and send them later when VIN >= 3.6V
VIN < 3.6V: Take measurements only
VIN >= 3.6V: Take measurements and send them to server
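That two-mode policy could be sketched as a simple threshold check. This is only an illustration of the idea, not Particle firmware: the 3.6 V cutoff comes from the post above, while the enum and function names are my own.

```cpp
#include <cassert>

// Hypothetical two-mode policy from the post above: below the threshold the
// modem stays off and we only measure; at or above it we also send.
// The 3.6 V value is from the thread; names here are illustrative.
enum Mode { MEASURE_ONLY, MEASURE_AND_SEND };

Mode selectMode(double vinVolts) {
    const double kSendThreshold = 3.6;  // volts; modem stays unpowered below this
    return (vinVolts >= kSendThreshold) ? MEASURE_AND_SEND : MEASURE_ONLY;
}
```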
There are some potential “brownout” issues when going too low, most prominently the dim blue D7 issue.
This forum has some threads about that; one common precaution is to keep the SoC above 20%, but there are also more elaborate approaches.
Is there something I have to do programmatically? Like configuring PMIC?
Or is Electron PMIC Settings Question a good starting point?
You can configure the PMIC (and that question is a good starting point), but to start you can run some tests to see what you get out of your battery in the 20% to 85% SoC range, and then decide whether you need to do anything more than monitor that and keep the device from drawing any more power once it drops below 20%.
This post is already solved by ScruffR, but I’ll add the curve that I personally use to calculate State-Of-Charge for a single cell Li-Po (10,000 mAh) for an Electron that’s recharged by a 6 watt Voltaic Solar Panel.
The Rule-of-Thumb that I use is to Size the Battery and the Solar Panel with the goal to not cycle the Li-Po below 40% S.O.C during your “Design Day” conditions.
So your Energy Budget will assume that a Li-Po at ~3.8V is depleted.
Yes, this is conservative but prolongs the battery life and adds a safety margin.
Similar to what @ScruffR mentioned: For me, if Li-Po ever drops to 3.7V (20% S.O.C), it’s time to Sleep the Electron and wait for the Sun. However, if your Energy Budget is correct, you should never get near 20% S.O.C.
thanks for the hints, @Rftop
I think I’m only confused because the first thing you find when googling for “lipo discharge curve” is this curve by SparkFun / Prototalk:
For my LiPo they actually state a critical voltage of 2.5V which is absurd, but still, I think I could go down to 3.3V.
But yes, I’m unsure now, because your critical voltage is really “high”. I always thought even 3V would be okay, as that is what most LiPo charger ICs use as the highest available cutoff voltage (most of them use 2.5V by default).
Here is the code I and others have used to put the Electron into deepsleep once the battery SOC hits 20%.
The code wakes up every hour to check if the solar panel has charged the battery above 20% and starts running normally once this happens and keeps sleeping if the SOC is still 20% or below.
The SOC% sleep number is adjustable so you can change it if desired.
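The wake-and-check loop described above can be sketched as plain C++ (this is my own simplified model, not RWB’s actual Electron code; the 20% threshold is from the post, the charge rate is an illustrative parameter):

```cpp
#include <cassert>

// Model of the hourly wake-and-check behavior described above: the device
// stays asleep while SoC is at or below the threshold, the panel adds some
// charge each hour, and we count hourly checks until normal operation resumes.
// Threshold and function names are mine, not from the actual sleep code.
int hoursUntilResume(double socPercent, double chargePerHour,
                     double threshold = 20.0) {
    int hours = 0;
    while (socPercent <= threshold) {
        socPercent += chargePerHour;  // solar charging continues while asleep
        ++hours;
    }
    return hours;
}
```

For example, starting at 15% SoC with 2.5% of charge gained per hour, the device would resume after a few hourly checks; starting above 20% it never sleeps.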
@derchris, the SparkFun curve you posted isn’t related to the S.O.C curve I posted, as it’s discharging at 6C.
It’s not relevant to what you’re doing at all.
You shouldn’t allow your Li-Po to drop to 3.3V.
You are risking damage for no reason, reducing the expected life of the Li-Po, and it has very little available energy at that level.
I would be surprised if a 2G Electron could connect while powered only by a Li-Po @ 3.3V.
The 2G Electron’s demand while connecting could be higher than what’s chemically available from the Li-Po @ 3.3V.
Don’t get caught up with Critical Voltages listed in Li-Po technical specs. The Goal is to never operate anywhere near something named “Critical”.
IMHO, your CODE should start to reduce system demand (functionality) prior to reaching 20% SOC, and go to sleep if it hits 20% (see @RWB link/code). Or simply incorporate RWB’s code.
You know everything you need for a simplified Power Budget:
Watts = Amps * Volts
Idle Current ~57 mA (for 3G Electron) = 0.057 amps, not sure about 2G version.
24 hours in a day
Your Solar Panel Rating in Watts.
Assume 50% total efficiency for Solar Recharging (harvest Solar, get it into and back out of Li-Po, dirty panel, etc)
Assume the # of hours of Sunlight for your "Design Day"
Reserve 30% battery capacity, so your 15 Ah becomes 10.5 Ah for the Budget.
Check your Effective Solar Rating (watts) * # hours of expected Sunlight against your average system watts * 24 hours. If this isn’t acceptable, increase the Solar Panel size. For instance, you may decide you want 3 hours of sunlight to replace 24 hours of demand. Don’t forget that Effective Solar Watts = 1/2 of the Labeled Rating on the Panel.
Then check your 10.5 Ah Effective Battery Rating against your system Ah for 24 hours ( 0.057 amps * 24 hr example for Idle 3G only, use your total system’s real numbers) to determine how many cloudy days you can expect to operate without Solar Recharge. If this isn’t acceptable, increase battery size.
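Plugging the thread’s numbers into those two checks looks like this (the amps, panel rating, and 15 Ah capacity are from the posts above; the 3.7 V nominal Li-Po voltage is my assumption):

```cpp
#include <cassert>
#include <cmath>

// Daily energy demand in watt-hours: Watts = Amps * Volts, times 24 hours.
double dailyDemandWh(double amps, double volts) {
    return amps * volts * 24.0;
}

// Design-day sun hours needed to replace one day's demand,
// using Effective Solar Watts = 1/2 of the panel's labeled rating.
double sunHoursNeeded(double dailyWh, double ratedPanelW) {
    return dailyWh / (ratedPanelW * 0.5);
}

// Cloudy-day autonomy: usable capacity after a 30% reserve,
// divided by one day's amp-hours of demand.
double cloudyDayReserve(double batteryAh, double amps) {
    return (batteryAh * 0.7) / (amps * 24.0);
}
```

With the 3G idle figure (0.057 A at a nominal 3.7 V), that’s about 5.1 Wh/day; a 6 W panel (3 W effective) replaces it in under 2 sun hours, and a 15 Ah Li-Po (10.5 Ah usable) covers roughly a week of cloudy days at idle.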
There are many other factors such as Li-Po operating temperature, Li-Po self discharge, etc, but this is a good starting point. Once you select the Solar Panel and battery size, test it in the real world to confirm your Power Budget Calculations. Use @RWB’s code for an additional safety factor…that part is free.
I guess there is an extra zero in the “0.0057 amps” part
I’ll edit accordingly
BTW, the datasheet states this
That was on purpose to see who was actually reading… j/k
Those pesky little decimal places.
Good catch & thanks !
The industry-accepted average output from a solar panel in the summer is 75-77 percent of its rated output. In the colder months you can see 100 percent of the panel’s rated power output because the solar cells stay 75 degrees Fahrenheit or colder.
It’s the heat that reduces the solar cells’ power output, assuming you have proper sun alignment.
Sizing your system for 50 percent efficiency will not hurt, since it will be oversized to get the job done, but sometimes this can push you into a higher-priced component category, which may be important to your bottom line.
The thing about lipo batteries and draining them is that they are usually only rated for 300-500 cycles before hitting 80 percent of their rated capacity. The less deeply you discharge the battery the more cycles you can get from the battery before needing to replace it.
Publishing data less often is a good way to keep power consumption down, if that’s possible in your case.
I got by publishing every 20 seconds with the 3G Electron’s 2000 mAh battery and a 3 W solar panel, using the battery sleep code posted above, and in the winter the system basically ran 24/7.
@RWB, I claimed 50% total efficiency for Solar Recharging to account for the additional losses:
That’s why I mentioned to use a “nice-round-even” 50% of the Labeled Rating for quick Power Budget Calcs.
Naturally, we would perform a full Power Budget for any production device (the reason I hinted at Self-Discharge, temperature profile, etc).
You made a Great comment about the cycle ratings. People sometimes don’t understand how much longer a Li-Po will last when treated properly (don’t charge to 4.24V just because you can, and don’t discharge below 3.7V ). Those, and managing the enclosure temperature are easy ways to protect and prolong a Li-Po’s life.
wow, thanks for the many replies.
Maybe I wasn’t clear on that one, but in my second post (and in my mind) I assumed that I don’t want to power the modem below 3.6V (for example), just run measurements and power some devices with average (and max.) current of around 16mA.
So, might a possible scenario be:
Would that be considered “nice to the battery”?
Regarding the solar panel:
We’re currently having a really nice summer here in Germany and in the past days I had a few measurements for the 3.5W panel with > 400 mA. All-day average for the past 10 days is 50 mA.
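For what it’s worth, that all-day average converts to daily harvest like this (my arithmetic, using the 50 mA figure from the measurement above):

```cpp
#include <cassert>
#include <cmath>

// Daily harvest in amp-hours from an all-day average charge current.
// A 50 mA average over 24 hours works out to 1.2 Ah per day.
double dailyHarvestAh(double avgAmps) {
    return avgAmps * 24.0;
}
```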
50% & 25% looks like a great starting point…you can always modify based on your field results.
A side note to Solar Panel Sizing: Personally (just my opinion), I use 2 different design scenarios for every project.
Summer Design Day - mainly concerned about the maximum temperature inside the enclosure for Li-Po Protection. You’ll have Plenty of Sun for the Power Budget.
Winter Design Day - mainly concerned with Power Budget and minimum enclosure temperature for Li-Po Protection. Much Less Solar Energy to harvest. That’s why I size the Panel to recharge 24hours worth of Energy within 2-3 hours. Battery Capacity saves you when you get a week of cloudy winter weather. Sleep Code is executed after that.
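That winter sizing rule (replace 24 hours of demand in 2-3 sun hours) can be turned into a quick formula; the 50% total efficiency figure is the rule of thumb from earlier in the thread, and the function is my own sketch:

```cpp
#include <cassert>
#include <cmath>

// Labeled panel wattage needed to replace a full day's energy within a few
// winter sun hours, assuming ~50% total harvesting efficiency (rule of thumb
// from the thread; function name and signature are illustrative).
double requiredPanelWatts(double dailyWh, double sunHours,
                          double efficiency = 0.5) {
    return dailyWh / (sunHours * efficiency);
}
```

For example, a 5 Wh/day budget replaced in 2.5 winter sun hours would call for a panel labeled around 4 W.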
Sorry for the Rambling…
Just for comparison of “sun conditions”: where is your device located, @Rftop?
Just re-read one of your first posts where you said that idle current is around 50 mA. You’re not talking NETWORK_STANDBY, right?
I see below 10mA while sleeping with modem in standby
No Sir, I get <6 mA measured @ the Li-Po when Sleeping with NETWORK_STANDBY, depending on Battery Voltage.
I’m in the Southeastern USA
But because of the sun & ambient temps, I’ll have enclosure temps up to 130 degrees Fahrenheit (55 C) if I don’t do things to prevent it. Lessons learned the hard way.
As quoted in this post (see image)
The 50 mA applies when the device is running but the cellular modem is off and no external circuitry draws power from the device.
As ScruffR pointed out (and his clip of the datasheet shows), the 50 mA assumes you need the Electron awake normally to collect sensor data.
I “assumed” this from your large Li-Po size.
If by chance what you are measuring (sensor) can drive an interrupt, then your 24hr baseline demand could start closer to 6mA * 24hr using something like System.sleep(D2, RISING, sleepLeft, SLEEP_NETWORK_STANDBY);
Similar situation if your data collection can operate on a timed schedule.
Versus 50 mA * 24 hr to constantly collect sensor data with the Electron idle, modem off.
Again, External demands not included as ScruffR clarified.