Batch Multiple Location Updates Into Single Publish

Hello. I am using a handful of Particle Tracker One devices for a data collection project. My application benefits from relatively high frequency data, collecting location samples once per second during vehicle movement.

Collecting at this frequency uses up a lot of data ops. I would like to batch a number of location samples on the device and then publish them together. For example, if I could batch 5 samples, I could reduce data op usage by 80% (one data operation instead of five).

I'm using the Tracker Edge firmware, but looking at the source code I'm a bit lost as to how to modify it to achieve this. Can anyone recommend how I might approach it?

Thank you!

While it is possible to batch multiple events into a single publish, you can only do so when sending the data to your own server and database. The loc event sent by the Tracker One can only contain a single location sample. Even at one sample every 5 seconds you will use an enormous number of data operations unless you are moving very infrequently; at that rate, an hour of continuous movement is 720 publishes.

Hi @rickkas7
Yes, even at 5 seconds it is a lot of data ops. Our project is short-term (for R&D), and we have pre-calculated the number of ops we can accommodate within our data plan.
Is your recommendation, then, that I create something from scratch rather than modify the loc event?

I think you may have to make something custom. The loc event is very large, and most of its fields will not change that frequently. I would create a new event that contains batched GNSS location data only. Unfortunately, you won't be able to rely on the Particle location database; you'll have to use a webhook to take the combined publish and forward it to your own location storage system.
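
For example, a batched GNSS-only event could just be a JSON array of small per-sample objects built with JSONBufferWriter. A minimal sketch, assuming a hypothetical gnss-batch event name, a made-up GnssSample struct, and arbitrary field keys (none of this exists in Tracker Edge):

#include "Particle.h"
#include <algorithm>

// Hypothetical compact sample; the field set is an assumption, not a Tracker Edge type
struct GnssSample {
    uint32_t epochTime;
    double lat, lon, alt, spd, hd;
};

// Build a JSON array of samples and send it as one publish (one data operation)
bool publishGnssBatch(const GnssSample samples[], size_t count) {
    char buf[1024]; // keep below the publish payload limit of your Device OS version
    JSONBufferWriter writer(buf, sizeof(buf) - 1);

    writer.beginArray();
    for (size_t i = 0; i < count; i++) {
        writer.beginObject();
        writer.name("time").value((unsigned int) samples[i].epochTime);
        writer.name("lat").value(samples[i].lat, 6);
        writer.name("lon").value(samples[i].lon, 6);
        writer.name("alt").value(samples[i].alt, 1);
        writer.name("spd").value(samples[i].spd, 2);
        writer.name("hd").value(samples[i].hd, 1);
        writer.endObject();
    }
    writer.endArray();

    // Null-terminate the JSON before handing it to publish
    writer.buffer()[std::min(writer.bufferSize(), writer.dataSize())] = 0;

    return Particle.publish("gnss-batch", buf);
}

The webhook subscribed to that event would then split the array back into individual rows in your own storage.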

Thanks for the reply. For forwarding and storage, I've got a webhook into Google Cloud. I agree, the loc event is big, and collating only locations is a good idea.

One of the attractive features of the built-in location publish is that it uses the motion detection settings from the fleet-wide config to trigger publishes.

I'm not very familiar with the MotionService API, but if I were to make a custom function, is there any way I can trigger it in the same manner as the loc event? i.e., use the MotionService to trigger a function that records the location into an array once a second, then publish the array from the main program loop once it's 'full' (5 samples).

What I would do is take advantage of the existing IMU feature to wake and trigger a publish. Increase the publish rate somewhat while in this state, but not to once per second.

You should be able to detect when you are in a recent-motion state, and then generate additional GNSS-location-only events at a higher rate, with multiple locations aggregated into a single publish.
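
In outline, the firmware side could look something like this. It's only a sketch: recentlyMoving, the 1-second cadence, and the 5-sample batch are placeholders for whatever motion signal and rates you settle on, and the batch buffer itself is left as a comment:

#include "Particle.h"
#include "tracker_config.h"
#include "tracker.h"

SYSTEM_MODE(SEMI_AUTOMATIC);
PRODUCT_VERSION(1);

STARTUP(
    Tracker::startup();
);

const int SAMPLES_PER_PUBLISH = 5;   // placeholder batch size

bool recentlyMoving = false;         // set this from a motion or location callback
int samplesCached = 0;
unsigned long lastSampleMs = 0;

void setup() {
    Tracker::instance().init();
}

void loop() {
    Tracker::instance().loop();

    // While in a recent-motion state, take one compact GNSS sample per second
    if (recentlyMoving && millis() - lastSampleMs >= 1000) {
        lastSampleMs = millis();
        // ...append time/lat/lon/alt/spd/hd to a batch buffer here...
        samplesCached++;
    }

    // Once the batch is full, send it as a single publish and start over
    if (samplesCached >= SAMPLES_PER_PUBLISH) {
        // ...publish the batch as one event, then clear the buffer...
        samplesCached = 0;
    }
}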

Also, you might want to verify that the GNSS is configured for high-rate updates. I can't remember for sure whether the default is to update the location every second or every 10 seconds.

Thanks @rickkas7. This idea works well.

My program now runs like this:

  • I have the Fleet Config set to publish location updates no faster than once every 5 seconds
  • I have modified the locationCallback to flip a 'moving' flag
  • In the program loop I add new location data to a cache every second
  • When the cache is full, I publish it as a single cloud event.

This works well for the first publish. My C++ skills are limited, though, and something about the way I have it set up results in subsequent publishes coming out blank.

I'm not sure where the issue is. I've tried using memset to zero out the cache after each publish, but that hasn't helped. My code is below.

 #include "Particle.h"
 #include "tracker_config.h"
 #include "tracker.h"
 #include "bmi160.h" // Add the IMU

 SYSTEM_MODE(SEMI_AUTOMATIC);
 
 #if TRACKER_PRODUCT_NEEDED
 PRODUCT_ID(TRACKER_PRODUCT_ID);
 #endif // TRACKER_PRODUCT_NEEDED
 PRODUCT_VERSION(1);
 
 STARTUP(
     Tracker::startup();
 );
 
 SerialLogHandler logHandler(115200, LOG_LEVEL_TRACE, {
     { "app.gps.nmea", LOG_LEVEL_INFO },
     { "app.gps.ubx",  LOG_LEVEL_INFO },
     { "ncp.at", LOG_LEVEL_INFO },
     { "net.ppp.client", LOG_LEVEL_INFO },
 });



const int CACHE_SIZE = 5;

int cacheIndex = 0;

unsigned long timer = 0; // millis() timestamp of the last cached sample

bool MOVING = FALSE;

char cache[1024];
JSONBufferWriter json(cache, sizeof(cache) -1);

CloudEvent event;

// Forward declarations
void locationCallback(JSONWriter &writer, LocationPoint &point, const void *context);
void cacheLocation(JSONBufferWriter &jsonWriter);

void setup() {

    Tracker::instance().location.regLocGenCallback(locationCallback);

    Tracker::instance().init();
}
 
void loop() {

    Tracker::instance().loop();

    // If we're moving and 1 second has elapsed
    if (MOVING && (millis() - timer >= 1000)) {

        // If we're caching for the first time
        if (cacheIndex == 0) {

            // Clear the cache
            memset(cache, 0, sizeof(cache));

            // Re-init the JSON Writer
            JSONBufferWriter json(cache, sizeof(cache) -1);

        }

        // Call the cache location function
        cacheLocation(json);

        // Increment the cache index
        cacheIndex++;

        // Set the timer to now
        timer = millis();

    }


    // If the cache is full
    if (cacheIndex >= CACHE_SIZE) {

        // Publish the cache
        event.name("loc-cache");
        event.data(cache);
        Particle.publish(event);

        // Reset the cache index
        cacheIndex = 0;

        // Reset the moving flag
        MOVING = FALSE;

    }

}
 
// New function
void locationCallback(JSONWriter &writer, LocationPoint &point, const void *context) {

    // Set the moving flag true
    MOVING = TRUE;

    // Set the timer to now
    timer = millis();

    Bmi160Accelerometer data;

    int ret = BMI160.getAccelerometer(data);

    if (ret == SYSTEM_ERROR_NONE) {
        writer.name("xAcc").value(data.x, 3);
        writer.name("yAcc").value(data.y, 3);
        writer.name("zAcc").value(data.z, 3);
    }
}

void cacheLocation(JSONBufferWriter &jsonWriter) {

    // The high frequency location data is a limited dataset to conserve data allowance
    // It will contain time, lon, lat, alt, spd, hd, xAcc, yAcc, zAcc
    // Are time and ID contained in the publish by default?

    // Gather current location information and status
    LocationPoint gpsLock;
    Tracker::instance().locationService.getLocation(gpsLock);

    // Set GPS lock flag
    bool locked = TRUE; // gpsLock.locked;

    if (cacheIndex == 0) { jsonWriter.beginObject(); }

    if (locked) {

        String name = "loc";
        name += cacheIndex;

        jsonWriter.name(name).beginObject();

        jsonWriter.name("time").value((unsigned int) gpsLock.epochTime);
        jsonWriter.name("lat").value(gpsLock.latitude, 6);
        jsonWriter.name("lon").value(gpsLock.longitude, 6);

        jsonWriter.name("alt").value(gpsLock.altitude, 3);
        jsonWriter.name("hd").value(gpsLock.heading, 2);
        jsonWriter.name("spd").value(gpsLock.speed, 2);

        Bmi160Accelerometer data;

        int ret = BMI160.getAccelerometer(data);

        if (ret == SYSTEM_ERROR_NONE) {
            jsonWriter.name("xAcc").value(data.x, 3);
            jsonWriter.name("yAcc").value(data.y, 3);
            jsonWriter.name("zAcc").value(data.z, 3);
        }

        jsonWriter.endObject();
    }

    if (cacheIndex == (CACHE_SIZE - 1)) { jsonWriter.endObject(); }
}

I think the problem is this:

// Re-init the JSON Writer
JSONBufferWriter json(cache, sizeof(cache) -1);

doesn't reinitialize the existing writer; it declares a new local variable that shadows the global json and is destroyed at the end of the block. The global writer therefore keeps its old write position, so after the memset the new samples land beyond a buffer that now starts with a null terminator, and the publish sees an empty string.

This might work instead:

json = JSONBufferWriter(cache, sizeof(cache) -1);
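
Applied to your loop(), the start-of-batch branch would then read:

    // If we're caching for the first time
    if (cacheIndex == 0) {

        // Clear the cache
        memset(cache, 0, sizeof(cache));

        // Reassign the global writer rather than declaring a new local one
        json = JSONBufferWriter(cache, sizeof(cache) -1);

    }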

This now works! Thank you for your help.