Unable to publish without consistent delay [SOLVED]

I’m trying to publish some temperature data. Reading it over the REST API works, but the publish is giving me trouble. Here’s the code:

// declarations and setup() omitted; they are posted further down the thread
void loop()
{
  Serial.println("Starting...");
  rawtemperature = analogRead(A7);

  // Convert the 12-bit ADC reading (3.3 V reference) to degrees Celsius
  temperature = (((rawtemperature * 3.3) / 4095) - 0.5) * 100;

  // Build the payload, e.g. 'temperature':'25.00'
  sprintf(temp1, "'%.2f'", temperature);
  strcpy(temp2, "'temperature':");
  strcat(temp2, temp1);

  Spark.publish("Temperature", temp2);
  delay(100);
}

If my delay is greater than 200 ms, the Core “freezes” as described in all the cyan-blink and cloud-connection-lost posts: REST GETs are sometimes blocked and events are published irregularly.
If it’s smaller than 200 ms, REST GETs work fine, but no event is published.

Ideas? Am I missing something?

I’m not sure if this is what’s causing your problems, but there’s currently a limit on how many publishes can be made in a given time frame. At the moment that is 1 per second, with bursts of up to 4 allowed. I haven’t found this in the docs, which might explain why you haven’t been able to find it. (@bko, could you perhaps confirm this/add it to the docs? It seems fairly relevant.)
Could you try an even greater delay, let’s say, 1000 ms or more?


Hi @Barabba

Spark.publish() is rate-limited to an average of once per second with a burst of up to four allowed. Try changing your delay(100); to delay(1000);
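
If you’d rather not block loop() at all, a millis()-based timer is the usual pattern. Here is a minimal sketch of that idea, reusing the variable names from the code above and the 1000 ms interval from the rate limit:

unsigned long lastPublish = 0;

void loop()
{
  // Publish at most once per second to stay under the rate limit
  if (millis() - lastPublish >= 1000) {
    lastPublish = millis();

    rawtemperature = analogRead(A7);
    temperature = (((rawtemperature * 3.3) / 4095) - 0.5) * 100;

    sprintf(temp1, "'%.2f'", temperature);
    strcpy(temp2, "'temperature':");
    strcat(temp2, temp1);

    Spark.publish("Temperature", temp2);
  }
  // loop() returns quickly, so the background cloud loop keeps running
}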


And what about delay(1000) causing the cloud connection to be lost?

It might just be that the cloud is kicking you out since you’re “spamming” it with publishes. If it’s limited to a maximum burst of four, and your highest tolerance goes over that (1/0.2 s = 5 publishes per second), then I guess it’s protecting itself. But that’s only a guess; I might be completely off. Just give it a try and see what it does. It won’t hurt you :wink:
Depending on whether or not you actually require a resolution of 5 publishes per second, there are other options available (TCP springs to mind(?)); see the sketch below.
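
For example, a bare-bones TCP sender could look something like this. This is only a sketch: the server address and port are hypothetical placeholders for whatever host you would log to, and it skips reconnect handling:

TCPClient client;
byte server[] = { 192, 0, 2, 10 };  // hypothetical logging host

void setup()
{
  client.connect(server, 5000);     // hypothetical port
}

void loop()
{
  double temperature = (((analogRead(A7) * 3.3) / 4095) - 0.5) * 100;
  if (client.connected()) {
    client.println(temperature);    // raw TCP has no cloud publish rate limit
  }
  delay(200);                       // 5 readings per second
}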

Hi @Barabba

delay() was modified a few months back to use the delay time to run the Spark cloud loop, so no worries there.
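
For anyone who wants the same behavior in a custom wait loop, the idea is roughly the following. This is just a sketch, assuming your firmware version exposes Spark.process(); it is not the firmware’s actual implementation:

void cloudFriendlyDelay(unsigned long ms)
{
  unsigned long start = millis();
  while (millis() - start < ms) {
    Spark.process();  // keep servicing the cloud connection while waiting
  }
}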


Is there a way in the CLI to see what is being published?

For example

C:\>spark list
Checking with the cloud...
Retrieving cores... (this might take a few seconds)
Spark Dev Unit A (**********************************) is online
  Variables:
    uptime (string)
    ssid (string)
    temperature (int32)
    pressure (int32)
    altitude (int32)

but no listing of published items.

delay(1000) still hangs; delay(2000) sometimes loses the connection and has problems publishing and serving REST requests at the same time (HTTP 408). I’ll try delay(15000) to leave a consistent window for using the REST API.

Update: a 15-second delay locks the Spark Core completely. Trying to access it from atomiot or a REST client gives HTTP 408.
This is the 4th time I’ve reset my device; it seems impossible to get publish and delay to work together.

@Barabba
Are you doing anything else in your app? Also, how and where did you declare temp1 and temp2?

You want subscribe:

http://docs.spark.io/cli/#running-from-source-advanced-spark-subscribe
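
For example, from the same prompt as your spark list above:

C:\>spark subscribe mine

streams events published by your own cores as they arrive. I believe you can also filter on the event name used in Spark.publish(), e.g. spark subscribe Temperature.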

The Spark Core doesn’t reply.
Note: I have a C# SSE subscriber working perfectly with public events and sometimes with private ones.

@mtnscott The missing code:

double temperature = 0;
int rawtemperature = 0;

char temp1[64];
char temp2[64];

void setup()
{
    
  Serial.begin(9600);
  Serial.println("Starting...");
    
  // Register a Spark variable here
  Spark.variable("temperature", &temperature, DOUBLE);
  Spark.variable("rawtemp", &rawtemperature, INT);

  // Connect the temperature sensor to A7 and configure it
  // to be an input
  pinMode(A7, INPUT);
}

@bko - Thanks!

So, I would like to capture this data and generate reports. Something like this post - https://community.spark.io/t/example-logging-and-graphing-data-from-your-spark-core-using-google/2929 - but using publish instead of a Spark variable. I’m not sure if Google Drive can support push instead of pull events.

@Barabba Hmm… everything looks fine. I have an app that works with Spark.publish and delay(2000). It reads a lot of the network/cloud settings, displays them on an LCD, and tracks cloud disconnects and WiFi disconnects. So far, in the past 50 minutes, I have had 11 cloud disconnects and 1 WiFi disconnect.

I have occasionally had 1-second hangs where my loop won’t execute, but I’m still trying to figure out what is going on when that happens. Which version of the Core are you using, black or white? Not sure if that makes a difference.

Oh - and it publishes the data every 15 seconds.

@mtnscott Not too many disconnects.
But what about REST calls while publishing?
My Core is the black one that comes with the starter kit (just to be sure ;))

Hi @Barabba

By REST calls you mean reading Spark.variables() via the cloud API, right? I know that I can’t read variables over the cloud faster than about once per second without having failures. Part of that is that I am about 0.135 seconds round-trip from the cloud and so there are two round-trips for every variable: PC to cloud to Spark Core back to cloud back to PC.
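
To put numbers on that: two round trips at roughly 0.135 s each is already about 0.27 s of pure network latency per read, before the Core even handles the request, so about once per second is the practical ceiling. For reference, each read is a GET against the cloud API along these lines (DEVICE_ID and ACCESS_TOKEN are placeholders):

curl https://api.spark.io/v1/devices/DEVICE_ID/temperature?access_token=ACCESS_TOKEN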

How fast are you hitting the cloud API?

Hi @bko and thanks again.

I’m checking my variable once while publishing every 2 seconds, and it returns a timeout message.
The HTTP GET runs for 30 seconds before timing out. I check again 5-6 seconds after the timeout to let everything settle back into a consistent state, but it fails again.

Now, without publish and delay, I’m able to receive a GET response every second, so I’m pretty sure something is blocking the cloud connection.

Hi @mtnscott,

Try:

spark subscribe mine

Thanks,
David

@Barabba I have not tried to read the Spark variables from the Core at the same time. I have a few variables, so I will run a test trying to read them every 15 s as well.

@Dave - yes, I have that running, so I see the data when it arrives; I’m just hoping I can leverage the Google datastore, Drive, … and build some graphs and dashboards. I’m still researching whether I need to write an intermediate app or whether I can publish to Google directly.