Sending data every second over Cellular

Hey, I am working on a project that uses CT sensors to measure current and send the readings to the cloud. With our hardware we are monitoring single-phase and three-phase currents. Ideally we would want to capture the equipment's electrical signature, so we are looking at sending data every second, but @chipmc suggested that this is too much data for cellular. Data is being sent to @Ubidots over a webhook.

The device samples a maximum of six sensors at once and sends the data every second.
If we cannot send data every second, what else could be done? If we preprocess the data at the edge, what kind of preprocessing should it be?

The only approach I have come across so far is using an FFT on the waveform and sending only the spikes in the data.

Another idea I had in mind was storing the data in a buffer and then bursting it out to the cloud every minute, but in that case the webhook payload limit was exceeded when I sent timestamps and sensor values for six devices.


@hannanmustajab ,

Thank you for posting this and I hope that you will find someone with experience in this area willing to chime in.


I have had the pleasure of working with Hannan for the past two years on various projects in India and Rwanda. He is a quick learner and will put these devices to immediate use helping folks in Rwanda better manage their electric consumption.

To help illustrate what is needed, I found this graphic on Wikipedia which shows what may be needed to characterize - over time and with some training - the drivers for electric demand.

Any help would be appreciated.




@hannanmustajab, for clarity I want to restate your overall goal to make sure we are on the same page.

Capture the current signature of up to 6 pieces of equipment and log the data in the cloud (Ubidots).

Once the data is in the cloud, please elaborate on the goal of what you are trying to do with the data. For example, determining the top 3 types of equipment with the largest overall power usage, etc.

To capture the signal I can think of two capture methods: time based and change based.

Time Based
Time based (as you have proposed) is the more robust and simple technique, from the standpoint that it logs through brute force, grabbing data and sending it regardless of how valuable the data is for your post-processing goal. This method plays well with a WiFi-based module and ample cloud storage. I assume WiFi is out of the question, which is fine, but it means this is likely a less-than-ideal capture method as soon as you migrate to cellular because of the cost of cellular data.

Change Based (Edge Computing)
Change based will be a bit more complicated, but could be the saving grace for managing cellular usage. With change based you are using the microprocessor to do the heavy lifting of determining whether data has earned the right to be sent to the cloud. Now the microprocessor evaluates the current every second (or faster!) and compares it to a previously known value to see if the change is great enough to log. It is best if your microprocessor is mains powered so you don't have to bother with sleep or with optimizing power consumption, which greatly reduces complexity.

I recommend you consider a change based capture method. It might look something like this pseudo code:

newct_raw = abs(readCT())
if newct_raw <= oldct_low or newct_raw >= oldct_high {
    publish(newct_raw)
    oldct_low = newct_raw * 0.8
    oldct_high = newct_raw * 1.2
}

In this example each reading is compared against a lower and upper boundary (20% low and 20% high in this sample) to see if the reading has changed enough to get logged. You can adjust your boundary percentages to get enough sensitivity, but balance this against reducing cellular consumption for tiny changes that don't tell much of a story.
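To make the pseudo code concrete, here is a minimal sketch of the same deadband idea in plain C++. The struct and names are illustrative, not from any library; on a Particle device a `true` return from `shouldLog` is where the publish would happen.

```cpp
#include <cassert>
#include <cmath>

// Change-based filter: a reading is logged only when it escapes the
// +/-20% band around the last logged value; the band then re-centers
// on the new reading.
struct DeadbandFilter {
    double low = 0.0;      // lower boundary of the "ignore" band
    double high = 0.0;     // upper boundary
    double bandPct = 0.20; // 20% boundaries; tune for sensitivity

    bool shouldLog(double raw) {
        double reading = std::fabs(raw);
        if (reading <= low || reading >= high) {
            low = reading * (1.0 - bandPct);
            high = reading * (1.0 + bandPct);
            return true;   // worth publishing
        }
        return false;      // inside the band: skip this sample
    }
};
```

The first reading always logs (the initial band is zero-width), which conveniently gives you a baseline point on power-up.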

Food for thought!


+1 to @Backpacker87's Change Based method. There's no need to send duplicate data to the cloud; the graphs will be very similar anyway.
This takes two publishes for each change to draw the graphs correctly.
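A quick sketch of why two publishes are needed (plain C++, names hypothetical): on each change you emit the old value at the new timestamp and then the new value, so the chart draws a clean vertical step instead of a slow ramp between the two logged points.

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Returns the pair of (timestamp, value) points to publish when the
// reading changes at time t: the old value first, then the new one.
std::vector<std::pair<long, double>> stepPoints(long t, double oldVal,
                                                double newVal) {
    return { {t, oldVal}, {t, newVal} };
}
```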

To increase accuracy, you can add a voltage reference, locally calculate/accumulate the kWh, and publish that to the cloud on a schedule with only a tiny increase in cellular bandwidth.
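A minimal sketch of that on-device accumulation, assuming per-sample amps and volts are available (power factor is ignored here for simplicity, and all names are illustrative):

```cpp
#include <cassert>

// Integrates apparent power into kWh locally; only the running total
// needs to go over cellular, on whatever schedule suits the project.
struct EnergyAccumulator {
    double kWh = 0.0;

    // intervalSec: time since the previous sample (1.0 for 1 Hz sampling)
    void addSample(double amps, double volts, double intervalSec) {
        double watts = amps * volts;                   // no power factor
        kWh += watts * intervalSec / 3600.0 / 1000.0;  // W*s -> kWh
    }
};
```

One hour of 10 A at 230 V accumulates 2.3 kWh, but only a single number ever needs to be published.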

@hannanmustajab it may also be worth taking a look at this product. It may not be practical for your project, but if nothing else it could serve as some inspiration.

I suggest this solution: Split Single-phase Energy Meter | Crowd Supply
You can then focus on optimizing data transformation (accumulation, summarization) to save on cellular data.
I’m using this module myself, with a Photon, to upload usage and other statistics every 2 seconds to a timeseries database, but it’s in my home, connected to WiFi, so I’m not doing anything to reduce data usage.


Hey, thanks for replying. Our goal at the moment is to collect time-series data and store it somewhere in the cloud (AWS or InfluxDB) for training models in the next phase of the project. For now, the goals are to determine the downtime of each piece of equipment, detect any anomalies in the data, and calculate the operational costs of each machine.

I have been working on this project for over a year now. In the previous version, I sampled every second, and if the difference in current (A) was more than a specified threshold, the device sent data to the cloud. For example, if the current reading was 1.2 A, it wouldn't send a new value until a reading crossed 1.4 A. So this looks a lot like the change based method you suggested. This is how the data looked:

@Rftop Hey, thanks for your suggestion. I had a question in mind: if we use the change based method, would we still be able to capture the electrical signature of the devices, or would we be compromising on it?

Another method I was thinking of was to store the data locally on an SD card over SPI, save it as a CSV, and at the end of the day send that CSV file to AWS and run a Lambda function there to ingest the data. This way we work in real time with the change based approach and are also able to collect all the data required for the ML part.
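For what it's worth, the row format for that daily CSV could be as simple as the sketch below (plain C++, all names hypothetical): one line per sample with a Unix timestamp followed by up to six sensor readings, appended to the day's file on the SD card.

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Builds one CSV row: Unix timestamp followed by the sensor readings,
// ready to be appended to the day's file on the SD card.
std::string csvRow(long timestamp, const std::vector<double>& amps) {
    std::ostringstream row;
    row << timestamp;
    for (double a : amps) row << ',' << a;
    return row.str();
}
```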

This is what I think, I’d love to know your views on this.

@jaafar Hey, thanks for the response. I am also using the same method to send data every 2 seconds over WiFi, but with cellular, sending that much data could be a problem.


Sounds like you’ve gotten a lot of good advice here. As an FYI, I also had a similar need to send data every second to the Particle Cloud (just a short string value), and when I tried doing so with a Boron LTE I hit my 5 MB limit within 60-90 minutes.

Yes & No :wink:.
For example, this works fine for me when monitoring “larger” industrial loads, even with high inrush currents. Each CT/circuit may have several devices, but each device has a particular inrush current and average current, so it’s pretty easy to detect/assume which device started based on the delta amps.
I use an Array to keep a record of the running average for large pumps, which helps detect problems with the pumping system prior to failure. That works for most systems performing mechanical work.

Obviously this gets more difficult “if” your Use-Case is a residential home with dozens of small loads.
It just depends on your project & goals.

It looks like you are planning to use a machine-learning model to detect appliances turning on/off. If that is the case, I suggest developing a TensorFlow Lite model and running it directly on the Boron (a version of it has been ported in the past). During the data-collection phase, store the data on an SD card as frequently as needed.

What about logging to a file and then doing a periodic file upload, say every minute? (Of course, the server side needs to decode this.)

This opens up the capability to do things like compression, time-series averages, and min/max values for a period, while saving data and message overhead but still keeping the system simple.
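As an illustration of the summarization part (a sketch in plain C++, names my own): fold the second-by-second stream into one min/avg/max record per upload window, so a minute of data becomes one record instead of 60 samples.

```cpp
#include <algorithm>
#include <cassert>
#include <limits>

// Accumulates one min/avg/max summary record per upload window.
struct WindowStats {
    double minA = std::numeric_limits<double>::max();
    double maxA = std::numeric_limits<double>::lowest();
    double sum = 0.0;
    int n = 0;

    void add(double amps) {
        minA = std::min(minA, amps);
        maxA = std::max(maxA, amps);
        sum += amps;
        ++n;
    }
    double avg() const { return n ? sum / n : 0.0; }
};
```

After each upload you would reset the struct and start the next window.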


I agree, and hopefully our use case will be large industrial loads too. Can you please tell me more about tracking the running average? That sounds interesting and I'd like to know more if possible.

@robthepyro I was also thinking about the same thing. We've installed an SD card which writes data over SPI. Is there a way to send the CSV file to the cloud over HTTP directly from the device?
Sorry if this sounds too obvious. Thanks.

It’s not something I’ve done myself yet (but we are planning on doing so, especially to store a data queue during outages).
That said, I’m pretty sure it’s possible to do an HTTP file upload; there are a couple of libraries around, and I think a thread or two on this forum about it.

There are many ways, such as arrays, circular buffers, etc. Your project goals would determine the complexity required.

Most of the large industrial loads that I monitor have dedicated CTs for the load, which makes life simpler as it eliminates the need to predict which device started. My basic system tracks the number of motor starts/stops per 24 hours, average operating current, min/max, work performed, etc. That data is uploaded once a day and then most of it is dumped from the Particle device. The same data can be pushed in real time at the end of each pump cycle for a cloud HMI.
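In the spirit of the array-based tracking described above, here is one way the rolling average over recent pump cycles could be sketched (plain C++, entirely illustrative): a slow drift in this average can flag pump wear before an outright failure.

```cpp
#include <array>
#include <cassert>
#include <cstddef>

// Rolling average of the last N pump-cycle currents, stored in a
// fixed-size circular buffer; oldest entries are overwritten.
template <std::size_t N>
struct RollingAverage {
    std::array<double, N> buf{};
    std::size_t count = 0, next = 0;

    void add(double amps) {
        buf[next] = amps;
        next = (next + 1) % N;
        if (count < N) ++count;
    }
    double average() const {
        double s = 0.0;
        for (std::size_t i = 0; i < count; ++i) s += buf[i];
        return count ? s / count : 0.0;
    }
};
```

Comparing each new cycle's average current against this rolling baseline is one simple way to raise a maintenance alert on-device rather than in the cloud.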

If you plan on accurately tracking energy consumption, you will also need a high-voltage reference (AC:AC) to calculate power factor. But if your goal is preventive maintenance (PMs), then the CTs alone will be fine.