@RWB, @ScruffR, @BDub
I haven’t been running my Electron for a little while (lots going on, but also working with a colleague testing an Electron in Brazil). In any case, I fired mine back up… and my data usage is totally out of whack. I started with my most recent code that pushes 142 bytes (according to getDataUsage()). However, pushing every 10 minutes for 24 hours showed 0.11 MB. I expected more in the ballpark of 20-25 kB. Somehow I’m using 5 times that. I’m deep sleeping between sessions, but am running 0.6.0-rc.2, so there shouldn’t be any reconnection overhead. Edit: I was also running this with NO_ACK.
I reverted to much simpler code to check my usage. getDataUsage() indicated about 102 bytes per session. After 25 hours, console showed I had used 0.12 MB, but I expected closer to 16 kB.
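(For reference, the expectations are straight multiplication: 142 bytes × 6 publishes/hour × 24 hours ≈ 20 kB for the first test, and 102 bytes × 6 publishes/hour × 25 hours ≈ 15 kB for the simpler code, so the console figures are roughly 5-8 times what the payloads alone would account for.)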
Here’s the simpler code:
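(The listing itself didn’t carry over here, so below is only a minimal sketch of the publish-and-deep-sleep test described above; the event name, JSON payload, and exact intervals are assumptions rather than the actual code.)
SYSTEM_MODE(AUTOMATIC);

FuelGauge fuel;

void setup() {
}

void loop() {
    // roughly 100-byte payload, published once per wake
    String output = "{\"volts\": " + String(fuel.getVCell()) + ", \"soc\": " + String(fuel.getSoC()) + "}";
    Particle.publish("fuel-level1", output, PRIVATE, NO_ACK);
    delay(1000); // let the publish and serial output flush before sleeping
    System.sleep(SLEEP_MODE_DEEP, 10 * 60); // deep sleep 10 minutes between sessions
}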
I feel pretty dumb now. Any idea what I’ve forgotten that’s causing my usage to be so high?
Edit: P.S. In case it’s important: I had been using a 3rd party SIM when I started my tests again. I noticed that data usage was unusually high, which is why I reverted to my Particle SIM for the tests presented here.
I’m pushing the same data to Ubidots every 2 minutes using the code below, and I’m pretty sure the data rates are lower than yours, though I haven’t tested the exact data consumption.
Set up a free Ubidots account and give this a try, since it should give you lower data overhead.
SYSTEM_MODE(SEMI_AUTOMATIC);
SYSTEM_THREAD(ENABLED);
// This #include statement was automatically added by the Particle IDE.
#include "Ubidots/Ubidots.h"
#define TOKEN "123456789" // Put your Ubidots TOKEN here
#define DATA_SOURCE_NAME "ElectronSleepNew"
Ubidots ubidots(TOKEN); // A data source with particle name will be created in your Ubidots account
int button = D0;
int ledPin = D7; // LED connected to D7
int sleepInterval = 60;
void setup() {
//Serial.begin(115200);
pinMode(button, INPUT_PULLDOWN); // sets pin as input
pinMode(ledPin, OUTPUT); // sets pin as output
ubidots.setDatasourceName(DATA_SOURCE_NAME);
PMIC pmic;
//set charging current to 1024mA (512 + 512 offset)
pmic.setChargeCurrent(0,0,1,0,0,0);
pmic.setInputVoltageLimit(4840); // 4.84 V input voltage limit
}
void loop() {
FuelGauge fuel;
if(fuel.getSoC() > 20)
{
float value1 = fuel.getVCell();
float value2 = fuel.getSoC();
ubidots.add("Volts", value1); // Change for your variable name
ubidots.add("SOC", value2);
Cellular.connect();
waitUntil(Cellular.ready); // block until the modem is registered before sending
ubidots.sendAll();
digitalWrite(ledPin, HIGH); // sets the LED on
delay(250); // waits 250 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(250); // waits 250 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(250); // waits 250 ms
digitalWrite(ledPin, LOW); // sets the LED off
System.sleep(D0, RISING, sleepInterval * 2, SLEEP_NETWORK_STANDBY); // sleep 2 minutes with the modem kept registered
}
else
{
Cellular.on();
delay(10000); // give the modem time to come up before issuing the AT command
Cellular.command("AT+CPWROFF\r\n"); // power the modem off cleanly
delay(2000);
//FuelGauge().sleep();
//delay(2000);
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(150); // waits 150 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(150); // waits 150 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(150); // waits 150 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
System.sleep(SLEEP_MODE_DEEP, 3600); // deep sleep for an hour, then check the battery again
}
}
I’m confused because I’ve already been down the road of testing AUTOMATIC, SEMI_AUTOMATIC, and SLEEP_NETWORK_STANDBY. I had been using a 3rd party SIM, and I settled on AUTOMATIC and non-standby because data usage seemed to be about the same for me with and without standby, and SEMI_AUTOMATIC caused problems for my 3rd party SIM. I’d like to sleep the radio for ultra-low power if possible and hadn’t seen major issues with overhead when doing so.
In one test on a 3rd party SIM with more overhead, I had been pushing battery values every 20 min and was on track to use 670 KB per month at a 20 minute publish rate (based on carrier data usage numbers). In a second test, I was on track for 1.4 MB per month with a 10 minute rate.
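(That works out to roughly 670 KB / 2,160 publishes ≈ 310 bytes per session at the 20-minute rate, and 1.4 MB / 4,320 publishes ≈ 325 bytes per session at the 10-minute rate, so both tests pointed to about 300-325 bytes per session including overhead.)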
Firing it up again this week, it’s as if something is different on the carrier or cloud side that’s blown up usage on both the 3rd party and Particle SIMs. I’ll keep testing and at least try to replicate @RWB’s results. Thanks!
@bioagbob Try this customized code based on the example code you provided.
SYSTEM_MODE(SEMI_AUTOMATIC);
SYSTEM_THREAD(ENABLED);
int button = D0;
int ledPin = D7; // LED connected to D7
int sleepInterval = 60;
void setup()
{
//Serial.begin(115200);
pinMode(button, INPUT_PULLDOWN); // sets pin as input
pinMode(ledPin, OUTPUT); // sets pin as output
}
void loop()
{
FuelGauge fuel;
float value;
bool success;
if(fuel.getSoC() > 20)
{
Particle.connect();
value = fuel.getVCell();
String output = "{\"batt-value\": \"" + String(value) + "\"}";
success = Particle.publish("fuel-level1", output);
digitalWrite(ledPin, HIGH); // sets the LED on
delay(250); // waits 250 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(250); // waits 250 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(250); // waits 250 ms
digitalWrite(ledPin, LOW); // sets the LED off
System.sleep(D0, RISING, sleepInterval * 2, SLEEP_NETWORK_STANDBY); // sleep 2 minutes with the modem kept registered
}
else
{
// Power off the cellular modem until the battery is charged above 20%
Cellular.on();
delay(10000);
Cellular.command("AT+CPWROFF\r\n");
delay(2000);
//FuelGauge().sleep();
//delay(2000);
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(150); // waits 150 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(150); // waits 150 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
delay(150); // waits 150 ms
digitalWrite(ledPin, HIGH); // sets the LED on
delay(150); // waits 150 ms
digitalWrite(ledPin, LOW); // sets the LED off
System.sleep(SLEEP_MODE_DEEP, 3600); //Wake Up Every Hour to check if battery is over 20% yet. Can do this for 20 Days with no charge input before depleting battery.
}
}
Thanks for the code @RWB. Very kind of you.
I watched the Electron and console log for the last hour while running your code above exactly as written. I’m still going to run code longer and am waiting for console to update with data usage, but wanted to post observations so far. The publishes are not reliable.
There’s a clear pattern in the intervals between publish events and failures: a sequence of three events that repeats:
i) t = 0, fuel-level1, breathe cyan for 60 sec, sleep;
ii) t = 3 min, fuel-level1, breathe cyan for 3 sec, sleep;
iii) t = 5.2 min, Came Online (no fuel-level1), cyan blinks 3 sec, rapid flash 5 sec, sleep;
iv) repeat cycle 2 min later.
Is the 60 sec breathing cyan waiting for an ACK?
Why does the device ‘come online’ every third cycle instead of publishing properly?
The reason we’re running SEMI_AUTOMATIC is that we don’t want the device to automatically connect to the cloud once the battery SOC is below 20%, to save battery during the 1-hour wake-up periods.
I was using system threading when sending data to Ubidots without needing Particle cloud access, but since you’re using the Particle cloud you can remove SYSTEM_THREAD(ENABLED), which may also eliminate the problem you’re seeing.
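i.e. just drop (or comment out) the threading line at the top of the sketch:
SYSTEM_MODE(SEMI_AUTOMATIC);
// SYSTEM_THREAD(ENABLED); // not needed when publishing through the Particle cloud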
Will try now. Out of curiosity, how much data do you use sending to Ubidots compared to Particle (i.e., the difference in overhead)? In another thread you said “In my test, I was sending 5 min updates using a Particle Publish event for an almost 200+ byte data string using NO-ACK and it ended up using 1.68 MB per month or 30 days.”
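(That quote works out to about 8,640 publishes in 30 days at the 5-minute rate, so 1.68 MB ≈ 195 bytes per publish, essentially the size of the data string itself.)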
Hmm… except I’m publishing every 2 minutes. It seems that since yours worked at 1.1 sec, I should be okay at 2 min. I guess I can try a longer interval anyway. I can rule nothing out at this point.
Yes, but the exact same behaviour, with a full connect handshake after a fixed count of wakes, was seen by another member and I was able to confirm it. Back then I reported to Particle the similarity between that and the fixed-count “rapid” publishes issue.
As far as I remember, adding a delay(5000) between the publish and sleep instructions helped get around the issue.
Could you try that?
The minimal wait with an immediate sleep may be just that tad too quick for a reliable reconnect on the next wake, just as the expected 1000 ms turned out to be a bit too short.
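Something along these lines in the publish branch of your loop() (sketch only, using the variable names from the code above):
success = Particle.publish("fuel-level1", output);
delay(5000); // give the publish (and its ACK) time to complete before sleeping
System.sleep(D0, RISING, sleepInterval * 2, SLEEP_NETWORK_STANDBY);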
I’ll try adding the delay after I wait for my console to reach steady state (Electron turned off).
I ran RWB’s code (part with and part without threads) for about 3 hours over which there were about 70 fuel-level or ‘online’ connections. The console started at 0.28 MB (had not run all night before this test) and is now at 0.40 MB (about 2 hours after Electron turned off). So, 0.12 MB is about 1.7 kB per connection.
When I was doing a Particle publish and then going to sleep, I was seeing the Electron flash green or cyan for up to 60-90 seconds before it would actually go to sleep, for some reason. I don't think I ever tried the 5-second delay before calling sleep. It would be nice if it fixed the issue, though.
Testing now with the 5 sec delay before sleep. I may have done something similar way back, but can’t recall now since things had been working okay (my most recent code did have a 1 sec delay to allow the serial port to flush before sleep, so perhaps that helped - but it doesn’t explain why I started seeing problems now).
Console billing was stable at 0.40 MB at the start of the test. The updated code (with the delay(5000)) took about 6 minutes before it reached a repeatable pattern of publishes. First 6 minutes: ONLINE, pin reset, update, fuel: breathing cyan for 60 sec, fuel: [missed flashes], ONLINE. Since then it’s been publishing every 2.1 minutes, each time breathing cyan for about 7-9 seconds. So, I’d say that fixed (worked around) the re-connection bug.
Will let it run to see what happens with data usage.
Thanks @BDub. Glad to know there’s a fix. How soon is “soon”?
And while you’re here, what are the odds of getting better tools to track carrier usage charges (per session, per hour, or anything timestamped!)? See: Monthly Electron Data Usage.