Publishing Events Only Works for 3 Events then Stops

Hi Folks,

I’m new to Spark and am having the following issue. I want to publish an event from Spark and listen to events indefinitely (i.e. run curl and have each published event appear for as long as I leave curl running). The problem is that curl prints only 3 events and then goes silent. Furthermore, it only prints these 3 events when I run the curl command, leave it running, and then flash using the cloud IDE. If I leave curl running and power cycle the device, it again prints 3 and only 3 events, which suggests curl itself is working fine. Any idea what’s happening here, and how to fix it? Thanks in advance!

On the Spark IDE, I have:

void setup() {
    Spark.publish("temperature", "inside_setup", 0, PRIVATE);
}

void loop() {
    Spark.publish("temperature", "inside_loop", 0, PRIVATE);
    delay(100); // publish roughly 10 times per second
}

On Terminal, I have:

curl -H "Authorization: Bearer <my_access_token>" https://api.spark.io/v1/devices/events/temperature

Curl outputs (side note: I’ve edited the output to truncate the coreid to “…687” for security; is it insecure to post my device ID publicly?):

event: temperature
data: {"data":"inside_setup","ttl":"60","published_at":"2015-03-07T23:42:42.344Z","coreid":"...687"}

event: temperature
data: {"data":"inside_loop","ttl":"60","published_at":"2015-03-07T23:42:42.348Z","coreid":"...687"}

event: temperature
data: {"data":"inside_loop","ttl":"60","published_at":"2015-03-07T23:42:42.479Z","coreid":"...687"}

… then it goes quiet and no more events are picked up unless I power cycle the device, then the same 3 events are printed…

There are limits on how fast you can publish, to avoid overloading the cloud: one event per second on average, with bursts of up to four allowed. If you change your delay to 1000 (one second), it should work fine.
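
If you’d rather not block the loop, something like this sketch should also stay under the limit (just a sketch; the lastPublish variable name is illustrative):

unsigned long lastPublish = 0;

void setup() {
    Spark.publish("temperature", "inside_setup", 0, PRIVATE);
}

void loop() {
    // Publish at most once per second to stay under the rate limit;
    // unsigned subtraction handles millis() rollover correctly.
    if (millis() - lastPublish >= 1000) {
        lastPublish = millis();
        Spark.publish("temperature", "inside_loop", 0, PRIVATE);
    }
}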


Thanks bko, but it still only publishes 3 events, even with the 1000 delay :confounded:

Update: a delay of 1500 works, but 1000 does not… :confused: I feel like I need to run gradient descent on the Spark API to find the decision boundary for the publish delay… :unamused:

void setup() {
    Spark.publish("temperature", "inside_setup", 0, PRIVATE);
}

void loop() {
    delay(1500); // 1000 is not enough; 1500 works
    Spark.publish("temperature", "inside_loop", 0, PRIVATE);
}


Hi @DaveBot,

@bko is right on the limits. I suspect you’re hitting three because your core also publishes an extra event on startup, the “spark/cc3000-patch-version” event. :slight_smile: That hidden event counts against the burst allowance of four, so only three of your own publishes get through before the throttling kicks in.
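
If you want to see that for yourself, you should be able to drop the event name from the URL you’re already using and watch the whole event stream for your devices, something like:

curl -H "Authorization: Bearer <my_access_token>" https://api.spark.io/v1/devices/events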

Thanks!
David

Doesn’t it publish two on startup, the ‘online’ and patch version?

Strictly speaking, the cloud publishes the ‘online’ message, so it isn’t counted against the rate limit :slight_smile:


Thanks guys! Is that timing limit written anywhere in the docs? If so, where can I read more? If not, why might it be missing, and what other similar gotchas should I expect? Thanks so much!

Hmm, I was expecting to see this in the Spark.publish firmware docs, but I didn’t see it, so I’ll open an issue for it.

New issue here: https://github.com/spark/docs/issues/304

Thanks,
David
