Boron LTE w/ Solar - Trials

Without getting bogged down with the actual data, I wanted to share my excitement about the preliminary results from Boron LTE’s on Solar. Keep in mind, no Stop or Sleep Modes are available in 0.8.0-rc.25. These are just a baseline (Online 24/7, Publish every 5 minutes) for later comparison once we have low power modes.

I’m running 2 field trials:

  1. Boron LTE, 2-watt Solar Panel, 2,000 mAh Li-Po (same that ships with the Electron)
  2. Boron LTE, 1-watt Solar Panel, 5,000 mAh Li-Po

As mentioned, both Borons are constantly Online, publishing SoC and Li-Po voltage every 5 minutes. The Solar Panels feed the USB connector.

They spend ~16 Hours through the night & early morning without any Recharging from the Solar Panels.
The 2,000 mAh Li-Po “reported” SoC drops ~26% over those 16 hours.
The 5,000 mAh Li-Po “reported” SoC drops ~12% over those 16 hours.
The Li-Pos are recharged with 1-2 hours of sun.
No scheduled publishes are missed; they are very reliable at my test location.

When the various Stop & Sleep modes become available, the Boron LTE will make a great stand-alone Field Device on Solar. We can use smaller Batteries and Solar Panels than required for the Electrons.

A 1-watt Solar Panel isn’t any larger than the smallest weatherproof enclosure I’d normally use for the IoT device itself.

Question: I tried using in setup(), but that didn’t lower the measured current at all. Any ideas or suggestions there?


I wonder how the Boron LTE compares to the E-Series LTE’s power consumption.

Seems like the E-Series LTE would be the way to go if you want low power consumption and all the sleep modes now. I have not tried the E-Series LTE modules.

A 1w solar panel can be very small, so it’s good to hear a solar cell that small is recovering 16 hours of no-sleep run time with just 1 to 2 hours of sun.

By your numbers it looks like the Boron consumed roughly 500 mAh over 16 hours of run time (26% of 2,000 mAh).

That’s about 32 mA of average draw.

Are you seeing the data consumption numbers in the console for the 2 Borons you’re testing?

I haven’t tested the E-Series either.
I’ve been happy with the Electron’s performance, especially after the WakeOn RI_UC was discovered/developed by the guys on this post. OTA firmware flashes, function calls, Particle.variable() requests, subscribed events, etc., can wake the sleeping Electron at 5-6 mA. Most folks wouldn’t consider that “Low-Power”, but it’s impressive considering low power wasn’t a design goal for the Electron.

That’s one of the reasons I’m expecting the Boron LTE to shine as the system firmware evolves, since it’s built around low-power hardware. ~25mA isn’t a bad starting point for staying cloud-connected with 5-minute publishes.

As you know, there are numerous ways to perform the Boron measurements.
When supplying 5.00V CV to the USB connector, I’m seeing 20mA normally.
Short spikes occur during cellular/cloud background housekeeping.
A publish takes about 8 seconds (at my location) at ~90mA.

Cellular Data usage isn’t showing up in the Console for the 2 test Borons.
But I’m running the same code that I’ve used on Electrons with 5-minute publishes, and those never came close to reaching the monthly data included with service. I use a (NO_ACK) publish for a ThingSpeak webhook:

    snprintf(msg, sizeof(msg), "{\"1\":\"%.0f\", \"2\":\"%.2f\", \"k\":\"%s\"}", SOC, lipoVoltage, myWriteAPIKey);
    // Send the publish
    Particle.publish(eventName, msg, PRIVATE, NO_ACK);

Which produces this event in the console:

{"1":"90", "2":"4.11", "k":"XXXXXXXXXXXXXXXX"}

I’ve tested the ThingSpeak Library and also MQTT, but the method above uses the lowest cellular data by far (per my testing w/ Electrons).

I forgot to mention in the original post that I use pmic.setChargeVoltage(4208); in setup() on both Borons. But it appears the charger IC won’t use any form of CV mode in its charging profile. Charging is suspended at the hint of reaching the termination voltage (with my test conditions, anyway), and resumed when the Li-Po drops to 4.11V - 4.12V. There is room for improvement to maximize energy harvesting, so I’d love suggestions or guidance from anyone on the charging. I understand Li-ion safety and general practices.


You should pick up an E-Series LTE module to see how close the power consumption numbers are and to verify that WakeOn RI_UC is possible on the LTE modules.

If I remember correctly, this RI pin is not connected on the Boron LTE modules, but @rickkas7 can verify that easily.

The only way I can see you saving on Data consumption is by using the Particle Losant integration where you could simply send your data like this:

    snprintf(msg, sizeof(msg), "{%.0f:%.2f}", SOC, lipoVoltage);  // produces e.g. "{90:4.11}"
    // Send the publish
    Particle.publish(eventName, msg, PRIVATE, NO_ACK);

That would cut your payload down to a small fraction of what you’re sending now. You simply separate your values with “:”s between each value and describe what each value is on the back end of Losant, which then saves it as a variable value on each received event.

The charging process you’re seeing on the Boron is exactly the same as with the Electron; it’s the same charging chip. Basically, you will see it reach that 4.1 volts and then stop charging until the voltage drops back down, at which point charging starts back up to 4.1v again. I never saw a problem with this method of charging since, usually, you don’t want to hold the battery voltage at 4.2v all day long if it reaches fully charged early in the day and the sun is out all day long.

The nRF52 mesh devices don’t connect RI_UC like the STM32F2xx devices do. This isn’t as bad as it sounds, because the nRF52 can wake on serial, so there’s no need for a dedicated wake-up pin. However, since the mesh devices don’t support sleep yet, that’s sort of a theoretical thing.


I didn't mean to imply that I have any concerns with cellular data usage. I love Particle's Pricing Structure.

I know, that's the unfortunate part. It didn't bother me too much with the Electron..... but now we're moving forward with truly low-powered hardware.

The TI chip's datasheet claims it "charges the battery in three phases: pre-conditioning, constant current and constant voltage." However, the reason I'm not seeing CV may be the Solar Panel, if the panel's output is fluctuating too much (clouds, etc.). I may try lowering setTermChargeCurrent() to something very small to give the IC a chance to correctly finish charging the Li-Po at the end of the cycle (in CV mode).

There's nothing wrong with the Li-Po being at 4.21V for 8 hours in a Solar installation with a proper design.
However, I agree with Particle's reasoning on the "default" Vfinal being lower.
I myself will use a Vfinal slightly lower than 4.21V when installed at a real project (just for increased life), but we only have 2 choices in the current Boron documentation right now.

The B series SoM (system-on-a-module) does not include the PMIC on the module. You’re free to incorporate a different PMIC that’s better suited for solar when that module becomes available next year. That combined with the lower power usage of the nRF52840 should be really helpful with lowering power requirements.


Actually, there are negative effects on the battery :battery: when you hold the charge voltage at the high end longer than needed. See below.


The Boron docs show setPreChargeCurrent() & setTermChargeCurrent(); both are held in one 8-bit register, REG03.

As a test, I’d like to set both the PreCharge and Termination Currents to the minimum available, which appears to be the 128 mA offset. I’m investigating 1-watt Solar Panel performance for Low-Power Applications. This might give the IC a better chance to run in CV mode on a 1-watt Panel.

Is pmic.setTermChargeCurrent(0,0,0,0) and pmic.setPreChargeCurrent(0,0,0,0) correct?

Are you setting the minimum input voltage limit to prevent the solar panel from operating too far below its maximum power point (MPP) voltage? That gets the most current from the panel and will help harvest more power during all weather conditions.

It’s the poor man’s version of the MPPT (Maximum Power Point Tracking) that the larger, more advanced solar charge controllers use.

@RWB, I agree..... But a few hours is not "long-term storage".
There are always trade-offs, just like the 10% capacity reduction for every 70mV reduction in charge voltage from your post. After the safety aspects, it's up to each designer to pick where to land in regards to performance vs. expected life. This tends to sway when dealing with low power & energy harvesting, and per project. As I said, I'll select a lower Vfinal once that's available in the firmware (2 choices documented right now).

A few hours is one thing; 8 hours is another and will work against what you're trying to accomplish.

We're talking about holding the battery voltage during charging.

The battery voltage will naturally fall and settle when you stop charging, so it will not remain at 4.2v while sitting idle for very long.

You're not really gaining anything by holding the battery voltage higher all day, considering you're not discharging by more than 20% per 16 hours of battery run time.

You actually lose cycle life by holding the battery voltage higher for many hours per day. You don't really need the extra capacity you may gain by charging and holding the battery at a higher voltage for longer.

That's just my view on it based on the data I have.

Worst case is you do what you want and have to replace the batteries more often.

No Sir, not yet. My thought was that even the 1-watt panel gets the Li-Po back to Vfinal in an hour.
But it's not performing the CV phase (with the small Solar Panel), which is where a decent portion of the energy is transferred.

So real world numbers:
Li-Po reaches 4.19V and Solar charging stops. But as we all know, that's not the actual cell voltage; that 4.19 is under charging conditions. A few minutes later the Li-Po (no charging) is actually at 4.15V, as expected.

I'm not trying to hold the Li-Po voltage at 4.2V all day; I just want to let the TI chip do what it's made to do and perform all 3 phases of charging. That's why Texas Instruments made this a 3-mode charger, and I don't want to circumvent the TI charging profile at all.
I have 12 years of experience with Li-ion, so I understand the battery much more than I do the hardware, firmware, or coding.

I apologize if anyone thinks I'm complaining or splitting hairs.... I'm not.
The IC has plenty of functionality on board; I'm just trying to figure out how to use it to maximize performance for low-power applications....... I guess that's the Professional Engineer in me coming out, unfortunately.


I get it.

I think the default charging profile, along with setting the minimum charging input voltage (to keep the solar panel's voltage from going too far below its MPP and from being pulled down to just above the battery voltage), is the best you're going to get from the built-in PMIC.

Over- and under-temperature charging parameters are not enabled, because resistors make the PMIC see a steady, in-range 25C reading. So charging above and below the recommended temps can and will happen.

I just found this in the TI datasheet, under Charging Termination: "The bq24195L, bq24195 terminates a charge cycle when the battery voltage is above the recharge threshold, and the current is below the termination current."

With the “default” setting of TermChargeCurrent = 256mA, a 1-watt panel won’t provide enough current to the IC as soon as it enters CV mode. Pesky datasheets always tend to hide the info I’m looking for in plain view.

I’ll unbox another Boron and try not to let the magic smoke out with:

pmic.setTermChargeCurrent(0,0,0,0) and pmic.setPreChargeCurrent(0,0,0,0)

Yeah, but with the minimum termination current being 128mA, you’re probably not going to see much gain from the change.

Do you have any idea what kind of charge current the 1 watt panel is producing?

My guess is 5v x 200mA = 1w.

So with your normal 50% derating on solar, the battery is seeing around 100-150mA of charge current from the 200mA panel during perfect conditions.

Still, it will charge until it hits 128mA of current vs 256mA, which should provide some extra charge time.

I agree… the 1-watt panel looks like a stretch unless I “get over” the CV charging :grin:
The $4, 1-watt panel works, as the 1-week trial has already shown, but it’s not optimal.
Then again, it should (in theory) meet the 128 mA, once I change TermChargeCurrent().
But I’ve got a whole box of various size Solar Panels to play with if it doesn’t.

Thanks for the replies too, I’ll keep ya updated.

Here is the code I used to keep the solar panel from operating below the set voltage.

    pmic.setInputVoltageLimit(4840); // Set the lowest input voltage to 4.84 volts. This keeps my 5v solar panel from operating below 4.84 volts
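Pulled together, a minimal setup() combining the PMIC tweaks from this thread might look like this. It's only a sketch using the calls already cited here, so verify them against the current Particle docs:

```cpp
// Sketch only: PMIC settings discussed in this thread, applied at boot.
PMIC pmic;

void setup() {
    pmic.begin();
    pmic.setChargeVoltage(4208);      // terminate at ~4.21 V (more harvest)
    pmic.setInputVoltageLimit(4840);  // keep a 5-6 V panel near its MPP voltage
}

void loop() {
}
```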

Thanks!
I found some info: for my tiny little 1-watt panel, the Max Power Point is 170mA @ 5.5V.
I may have the Boron call setInputVoltageLimit() only when the Li-Po approaches the target voltage, to give the IC the best chance at the 3rd mode. I "think" that limiting the input voltage (in a low-power application) may only be beneficial as the Li-Po approaches the target voltage; otherwise I'm missing out on solar harvesting.

Heck, now I'm even more impressed by the Boron Solar Trials w/ the 1-watt and 2-watt panels this past week, having only changed pmic.setChargeVoltage(4208) so far.

We're going to be dangerous when the Boron Sleep modes get here :wink:

You're wrong about this.

Right now, because the solar panel is not capable of supplying more current than your max charge current setting, the PMIC is pulling the voltage of the solar panel down to just above the battery's voltage.

If you measure the solar panel's input voltage during charging, this will become clear.

The only time the solar panel's operating voltage will be at 5.5v or above is when the load on the panel is less than it can supply under that moment's lighting and temperature conditions.

So if the solar panel's output is 100mA at any given time and the PMIC is set to charge at 500mA or 1A, the PMIC will try to pull as much current from the panel as possible and will basically pull the solar input voltage down to just above the battery voltage, which will be in the 3.5-4v range.

When the panel is in the 3.5-4v range, it is operating away from the 5.5v MPP voltage where it produces its maximum current output. The further the panel's voltage drifts from 5.5v, the less efficiently it operates and the less current it provides, which slows down your charging.

So when you set the PMIC's minimum input voltage to 4.84v, it keeps the solar panel operating closer to its 5.5v peak, which provides more charging current compared to not using the limit, where the panel's voltage would sit in the 3.5-4v range.

Ideally, we would set the minimum input voltage to 5.5v, but 4.84v is the highest setting we have with this PMIC, so that's why I picked it.

This input voltage limit feature is meant for USB charging and the voltage drop that happens with junky or long USB cables, but it's also useful for charging from 6v solar panels.

So, without using the input voltage limit, measure the solar panel's operating voltage. Then enable the 4.84v charging input limit and you should see the solar panel operating at 4.84v, higher than before.

The large systems we build use MPPT solar charge controllers; they scan the solar panel's voltage range from 0v to the no-load VOC and find at what voltage the solar array provides the maximum charging current.

Once the Vmp is found, the charge controller locks the solar array's voltage there, because that is where the maximum charging current is harvested. The controller does this constantly all day long and provides ~30% more harvested charging power vs systems that use PWM charging as the PMIC does.

Does any of that make sense?

The bulk of your recharge energy gets dumped into the battery during the bulk charging phase, where the charger is in constant-current mode, not during the constant-voltage phase, where the charging current starts to drop off and the charge cycle comes to an end.