RS232 data to cloud with encoding or compression

I have a set of sensors that output data on an RS232 port. I currently have my Particle Electron set up to listen on the Serial1 port, read the data, then publish it to the Particle cloud. In order to save bandwidth, I would like to encode or compress the data. Does anyone have any thoughts on this? Below is my code, and also sample data from my sensor bank…

    void setup()
    {
        Serial1.begin(9600); // match the sensor bank's baud rate
    }

    void loop()
    {
        if (Serial1.available())
        {
            String value = Serial1.readStringUntil('\n');
            Spark.publish("RS232", value);
        }
    }

Here is the sample sensor data. The sensor will output the following line once per second:

Sensor Base 13,1,2016-12-16T04:19:37.3776272Z,22:19:41,38.885101,0.000000,-12.271728,4.136775,12.032262,-11.670198,2.966006,0.000000,659.422241,16.759348,-3827.473633,0.000000,0.000000,3.893688,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.004073,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,673.229980,0.000000,1328.050049,0.000000,292.260010,0.000000,1620.310059,10.000000,0.000000,2468.239990,0.000000,0.130000,7.829093,0.000000,0.000000,0.000000,0.129811,4.454684,0.000000,0.023872,4.454684,0.023872,1.000000,0.000000,91.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,20.000000,0.000000,0.000000,0.000000,0.129811,8.348372,0.023872,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000001,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,2468.239990,-15.000000,127.198952,-25.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000,0.000000

If you have control over the sensors, you could transfer binary data which you could then Base85-encode, as mentioned in this thread.

But you also seem to have loads of decimal places that just take up space and don't add value to your readings. If you can cut down on them, that'll be a start.
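As a minimal sketch of that idea: the helper below (a hypothetical `trimPlaces`, not part of any Particle API) reformats each comma-separated numeric field to a fixed number of decimal places and passes non-numeric fields (labels, timestamps) through untouched. Assumes the precision loss is acceptable for your application.

```cpp
#include <cstdio>
#include <cstdlib>
#include <sstream>
#include <string>

// Reformat every purely numeric field in a comma-separated line to at
// most `places` decimal places; leave non-numeric fields untouched.
std::string trimPlaces(const std::string &line, int places) {
    std::ostringstream out;
    std::istringstream in(line);
    std::string field;
    bool first = true;
    while (std::getline(in, field, ',')) {
        if (!first) out << ',';
        first = false;
        char *end = nullptr;
        double v = std::strtod(field.c_str(), &end);
        if (end != field.c_str() && *end == '\0') {
            char buf[32];
            std::snprintf(buf, sizeof(buf), "%.*f", places, v);
            out << buf;   // numeric field, reformatted
        } else {
            out << field; // label or timestamp, passed through
        }
    }
    return out.str();
}
```

With two decimal places, a field like `38.885101` shrinks to `38.89`; across 130 fields per line that adds up quickly.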

If you have no control over the source data, you might want to translate the incoming data on the Electron, encode that and then pass that on.
Or you just use a compression library (e.g. `<zlib.h>`) and send the result Base85-encoded again.
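For the encoding step, here is a minimal sketch of one common Base85 flavor, the ZeroMQ Z85 variant: every 4 input bytes become 5 printable characters (~25% overhead, versus ~33% for Base64). The helper name `z85Encode` is my own; Z85 requires the input length to be a multiple of 4, so pad beforehand if needed.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Z85 character set, as defined in the ZeroMQ Z85 specification.
static const char Z85_ALPHABET[] =
    "0123456789abcdefghijklmnopqrstuvwxyz"
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ.-:+=^!/*?&<>()[]{}@%$#";

// Encode binary data (length must be a multiple of 4) as Z85 text.
std::string z85Encode(const std::vector<uint8_t> &data) {
    std::string out;
    for (size_t i = 0; i + 3 < data.size(); i += 4) {
        // Pack 4 bytes into one big-endian 32-bit value...
        uint32_t value = (uint32_t(data[i]) << 24) | (uint32_t(data[i + 1]) << 16) |
                         (uint32_t(data[i + 2]) << 8) | uint32_t(data[i + 3]);
        // ...then emit its 5 base-85 digits, most significant first.
        uint32_t divisor = 85u * 85u * 85u * 85u; // 85^4
        for (int j = 0; j < 5; ++j) {
            out += Z85_ALPHABET[(value / divisor) % 85];
            divisor /= 85;
        }
    }
    return out;
}
```

The output uses only printable characters that are safe in a `Particle.publish()` payload, which is the point of the encoding step after compression.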


Try not to use ‘spark’ anymore, and do consider the publish rate limit :wink:


No, I do not have control over the sensors or the specific data stream they are outputting. I will need to make any modifications to the data on the Electron itself, and not the sensor bank. Unfortunately, all the 0’s and decimal places are required. Once the system is activated, I need precision like that.

The one thing I can do is transmit data only once every 30 seconds. The sensor bank outputs this stream once per second, but if I only publish every 30 seconds, that will be much less data to deal with…
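One caution on that approach: a blocking `delay(30000)` would let the Serial1 buffer overflow, since lines keep arriving every second. A common alternative is to keep reading (and discarding) every line, and gate the publish with a `millis()`-style elapsed-time check. A minimal sketch of that gate, with a hypothetical helper name `shouldPublish`:

```cpp
#include <cstdint>

// Returns true (and records the time) only when at least `intervalMs`
// milliseconds have elapsed since the last accepted publish.
// The unsigned subtraction stays correct even when the millisecond
// counter wraps, the usual pattern with millis() on the Electron.
bool shouldPublish(uint32_t nowMs, uint32_t &lastMs, uint32_t intervalMs) {
    if (nowMs - lastMs >= intervalMs) {
        lastMs = nowMs;
        return true;
    }
    return false;
}
```

In `loop()` you would still call `Serial1.readStringUntil('\n')` on every iteration, but only pass the line to `Particle.publish()` when `shouldPublish(millis(), lastPublish, 30000)` returns true. This also keeps you under the cloud's publish rate limit mentioned above.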

I will look into the zlib and Base85 encoding this afternoon. Thanks for the advice!

Could you clarify what you mean by not using Spark any more? I am a little confused there…

Also, I have decided to only publish once every 30 seconds…

Thanks for your input!

He means use Particle instead of Spark, as in Particle.publish. Spark is deprecated; it still seems to work, but it might not in the future.


Excellent! Thank you!