How many Google Cloud Integrations can I have?

Hello. I have a few questions about my intended strategy for setting up Google Cloud Integrations in Particle Console.

First, the main question: is there a limit to how many integrations I can have?

Further explanation of intended application - I welcome feedback or advice on what I intend to do.

I will have many possible locations, each with a single Particle device. Each location will stream operational data, and will also receive control instructions triggered by Cloud Functions. There may eventually be 1000 or more locations.

To conserve cellular data, I want to make sure control events are not broadcast to all devices. Similarly, if a given location sends sensor data, I want any acknowledgement packets sent back only to the originating device, not to all devices, which would waste cell data.

Also, I feel each location should have its own topic and associated integration, so we don't have all 1000 locations publishing to the same topic (though I'm willing to do this if it is the 'correct' approach and doesn't have downsides).

My idea is as follows:

Create two Google Cloud Pub/Sub topics for each device/location, one for incoming data (called 'pub_data' or something similar) and one for outgoing control data (called 'control_data' or similar).
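For provisioning, I picture a small script along these lines (just a rough sketch using the @google-cloud/pubsub Node.js client; the loc001-style location IDs are made up for illustration):

const {PubSub} = require('@google-cloud/pubsub');
const pubsub = new PubSub();

// Create the topic pair for one location, e.g. 'loc001'
async function provisionLocation(locationId) {
  await pubsub.createTopic(`pub_data_${locationId}`);      // device -> cloud
  await pubsub.createTopic(`control_data_${locationId}`);  // cloud -> device
}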

I would then create two new matching integrations for each device/location. This of course will eventually result in up to 2000+ integrations inside the Particle Console. This sounds like it may cause problems later, unless this is totally okay to do. (I notice the Integrations window in Particle Console shows each Integration as a large icon, so this page may become very full with over 1000 of them).

Each device/location may have slightly different firmware, as each location may be slightly custom. For this reason, I prefer to avoid adding them all as a "product" and rather track them as individual devices named by location number. This would allow us to remotely update a given location/device if a customer needs a change.

Any thoughts or advice on this approach very much appreciated.

Thanks!!

I would not use multiple integrations, but rather a clever technique to route the responses correctly, so that each device can trigger the same integration while ensuring the response will only be delivered to the originally triggering device.

The simplest way I could think of would be an integration listening for the prefix only (e.g. kkeng_), while each individual device emits a uniquely named event and subscribes to the response like this:

const char evtPrefix[] = "kkeng_";
char evtName[sizeof(evtPrefix) + 24];   // prefix plus the 24-character device ID

void evtHandler(const char *event, const char *data) {
  // handle the hook-response meant for this device
}

void setup() {
  char evt[100];
  snprintf(evtName, sizeof(evtName)     // compose a unique event name
          , "%s%s"
          , evtPrefix
          , (const char*)System.deviceID()
          );
  snprintf(evt, sizeof(evt), "hook-response/%s", evtName);
  Particle.subscribe(evt, evtHandler);  // subscribe to hook-responses for only this device
}

void loop() {
  // ...
  Particle.publish(evtName, yourData, PRIVATE);
  // ...
}

Thank you, Sir! That's a great idea.

This makes sense for the publish, but I'm not sure how to handle sending instructions back to the hardware. To be clear, I intend to have the hardware periodically send data up to the cloud. At some other random time, the cloud may push a command back down to the device, but not necessarily in "response" to the first command. This would be a separate new command generated by the cloud via a user's interaction or a schedule or something. I'm not sure how to handle that step.

I assume with your example, I would create a single Integration in Particle Cloud and name the "Event Name" as "kkeng_". I would then select the Google Cloud Pub/Sub topic, and I assume I would leave the device as "Any".

I would send my hardware to cloud data by using Particle.publish with PRIVATE as you have shown.

This should result in the message arriving in my Pub/Sub topic. I will include indicators of the source location in this data, so a Google Cloud Function can insert it in the correct place in the database.
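For reference, the ingestion function I have in mind is shaped roughly like this (only a sketch; I'm assuming the Particle integration attaches the publishing device's ID as a device_id message attribute, and insertIntoDatabase stands in for my real storage code):

// Google Cloud Function with a Pub/Sub trigger (Node.js)
exports.handleDeviceData = (message, context) => {
  // Pub/Sub delivers the event payload base64-encoded
  const eventData = Buffer.from(message.data, 'base64').toString();

  // Assumption: the integration forwards the device ID as an attribute
  const deviceId = message.attributes && message.attributes.device_id;

  // insertIntoDatabase() is a placeholder for the real storage code
  return insertIntoDatabase(deviceId, eventData);
};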

I'm not sure how I would format that back inside Pub/Sub so the Particle integration will trigger, and further, so it will only push the data back to the specific device listening to that custom event name.

Also, to clarify: if I implement this as suggested above, any response will only be sent via cellular to the intended device, correct? It will not be broadcast to all devices, wasting data, correct?

Thanks!

Correct, the cloud will only send an event to the device(s) that have subscribed to it (where the subscribed filter prefix matches).

That raises the question of how the cloud would select the desired device.

For (occasionally) sending data to a specific device, I'd rather go with Particle.function().

Thank you.

What is the best way for a Google Cloud Function to trigger a Particle.function()? It looks like I should have the Cloud Function call an HTTPS link to api.particle.io.

Ideally I'd like to hand a JSON object back to the target hardware device, where the JSON will include a few commands it should carry out.

If you write your Google Cloud Function in JavaScript (Node.js), you can just use particle-api-js to call the function rather than accessing the REST API directly, though both will work.
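For example, something along these lines would hand a JSON command object to one device (a minimal sketch; sendCommand and the setConfig function name are placeholders, and keep in mind function arguments are limited in length):

const Particle = require('particle-api-js');
const particle = new Particle();
const token = 'YOUR_ACCESS_TOKEN';  // placeholder - don't hard-code real credentials

// 'setConfig' stands in for whatever function name the firmware registers
function sendCommand(deviceId, command) {
  return particle.callFunction({
    deviceId: deviceId,
    name: 'setConfig',
    argument: JSON.stringify(command),  // note: function arguments are length-limited
    auth: token
  });
}

// e.g. sendCommand('location42', { relay: 'on', interval: 60 });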


Hey rickkas7 - can you give a quick snippet as an example? I'm not sure what you are referring to or how I would write that, but it sounds much easier.

When I write my Google Cloud Function, at the top of that function, do I just use the example line in the Particle docs? I don't know how to add particle-api-js to the Cloud Function. Also, the example doc shows using particle.login with a user name and password. Would I have to actually write my user name and password inside the Google Cloud Function? I'd rather not do that if I can use an access token or something, which may be better.

Also, in this example from the docs, can "DEVICE_ID" be the name I have given the device, or does it have to be the longer actual ID string of the device?
var fnPr = particle.callFunction({ deviceId: 'DEVICE_ID', name: 'brew', argument: 'D0:HIGH', auth: token });

Just not sure how to put together what you are suggesting into an actual working Cloud Function.

Thanks!

Example 8 in this tutorial shows how to call a Particle API function from a Google Cloud Function.

It does publish, but youā€™d just substitute the function call instead of the publish.

Hey I think I have success!!! I spent a while studying your example. I have the following working code if it is useful to anyone else.

One more question though... I noticed in the docs for the "Particle Device Cloud API", in the "API rate limits" section, it says "There is an API rate limit of approximately 10 calls per second to api.particle.io from each public IP address."

That sounds problematic. If a scheduler in the cloud were to kick off a process on 500 different target devices at 8:00am, this limit would be exceeded. Would those extra commands get dropped, or queued and sent eventually? Either way, I assume this is a problem. Also, this may theoretically combine with calls from other Particle customers if they happen to be running a cloud function from the same server.
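If it came to that, I suppose I could pace the calls on my side, something like this sketch (callDevice is a placeholder for whatever actually calls particle.callFunction, and the delay is just sized to stay under the documented ~10 calls per second):

// Call a list of devices one at a time, pausing between calls
async function callAllDevices(deviceIds, callDevice) {
  for (const id of deviceIds) {
    await callDevice(id);
    await new Promise(resolve => setTimeout(resolve, 150));  // ~6-7 calls/sec
  }
}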

I liked your suggestion to instead use "Particle API JS". I notice when testing, the API JS-based version runs faster.

Does the Particle API JS have any rate limits like the Device Cloud API? Would this cause the same problem as above?

As an alternative, I am considering making a pub/sub topic for each target device and a separate Particle integration to communicate with each target device. Then I can have each target device subscribe to its own pub/sub topic and the cloud can post outgoing commands to those topics, which I think would be totally safe, except we would eventually end up with a lot of integrations in the Particle console if we have 1000 units. Do you know of any hard limit to the integrations?

Anyway, here is the working code:

INDEX.JS


var Particle = require('particle-api-js');
var particle = new Particle();

const particleFunctionName = 'test';
const targetDevice = 'mydevicename';
const argToSend = 'It Works!';

const token = 'someapikey209xxx8xxx98xx34ba2';

/**
 * Responds to any HTTP request.
 *
 * @param {!express:Request} req HTTP request context.
 * @param {!express:Response} res HTTP response context.
 */
exports.particleFunction = (req, res) => {

  var fnPr = particle.callFunction({
    deviceId: targetDevice,
    name: particleFunctionName,
    argument: argToSend,
    auth: token
  });

  fnPr.then(
    function(data) {
      console.log('Function called successfully:', data);
    }, function(err) {
      console.log('An error occurred:', err);
    });

  let message = req.query.message || req.body.message || 'Ran Function';
  res.status(200).send(message);
};

PACKAGE.JSON

{
  "name": "sample-http",
  "version": "0.0.1",
  "dependencies": {
   "particle-api-js": "^7.2.2"
  }
}

Code on Particle device:

// Debugging Setup Code
SerialLogHandler logHandler(LOG_LEVEL_INFO);  // Log.info, Log.warn, and Log.error output will be sent over the USB serial virtual COM port

// The on-board LED
int led = D7;

int test_data = 0;
char pub_string[30];

int showString(String command){
  Log.info("%s", command.c_str());  // avoid using the argument itself as a format string
  Log.info("showString was called");
  return 1;
}

void setup() {
  pinMode(led, OUTPUT);
  Particle.function("test", showString);
  Log.info("Device Start");
}

void loop() {
  // Turn the LED on
  digitalWrite(led, HIGH);
  test_data++;
  snprintf(pub_string, sizeof(pub_string), "%d", test_data);

  // Publish an event to trigger the integration
  // "pub_data" is the event name used when configuring the integration;
  // replace the counter value with the real data you'd like to send to Google Cloud Platform
  Particle.publish("pub_data", pub_string, PRIVATE);
  Log.info("%s", pub_string);
  // Wait for 3 seconds
  delay(3000);
  // Turn the LED off
  digitalWrite(led, LOW);
  delay(3000);
}
