Is a Spark.publish event more expensive than a server push event (calling a Core's function)?

Hi guys,

I recently ran into a problem. Can you all help me analyze where the root cause is?

1: I prepared a function on the Core, named openRelay:

void openRelay(int pinNum){
    // open relay.
    return Spark.publish("message", "now the relay has been opened.");
}

2: I have a url like “http://a_cloud_url/v1/devices/123456/openRelay” for calling that function.

3: In my app, I have an SSE client connected to the local server, which also listens for the “message” event.

4: I called that URL rapidly, probably 5 times per second.

Finally, the result I found: none of the URL calls failed, but the events (named 'message') sent from the Core would sometimes fail to reach the server side (I confirmed that the Spark local server did not receive the message).

For the above scenario, my guess is: the URL call for that function (openRelay) uses the SSE mechanism, which is not expensive, but Spark.publish is sent as a normal HTTP request, which opens an HTTP socket again and again. Is that the root cause of the failures? Is my analysis reasonable?

I’m not sure if you should use the return for publishing.
Could you try this code and watch it over a serial monitor? It publishes the message over SSE and also prints over Serial, so you should be able to check whether the function has been called and/or the publish has arrived.

int i = 0;

void setup() {
    // Open serial connection
    Serial.begin(9600);

    // Register the Spark function
    Spark.function("test", test);
}

void loop() {
}

int test(String text) {
    // Print text over serial
    Serial.println("Spark function has just been called!");

    // Blink green LED 3 times
    for (i = 0; i < 3; i++) {
        // Publish event
        Spark.publish("test", "Test event has just been published.");
    }
    return 1;
}

How do you call your function remotely?

If you just put it into your browser address bar it won't be triggered.
There are lots of discussions about this misconception that you can "surf" any URI by just typing it into the address bar of a browser.
The point is that remote Spark.function calls need to be POSTed but the browser produces a GET request.
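To make sure a POST is actually sent, a quick command-line check helps. This is a sketch using the URL from the post; the access token value is a placeholder, and the `args` parameter name is an assumption based on the old Spark Cloud API, not taken from this thread:

```shell
# Explicit POST to the function endpoint from the post.
# A browser address bar would issue a GET instead, which does not
# trigger the function. Token and argument are hypothetical.
curl -X POST "http://a_cloud_url/v1/devices/123456/openRelay" \
     -d access_token=YOUR_ACCESS_TOKEN \
     -d args=relay1
```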

As @Moors7 suggests, for testing always add some functionality (e.g. blink LED) into your functions to get visual feedback if they actually get called correctly.

If you want to trigger a Spark.publish you should obey the 'max. 1 per second' limit.


He says he's using the local server, so let's go from there:

Although it's true that the publish rate is limited to 1 per second, that only holds true for the Spark Cloud. The local cloud is limited to whatever rate you set it to. Having not yet played around with that, I'm going to assume that the rate limiting is still implemented in the local cloud and has to be manually disabled. Knowing that @kennethlimcp has quite a lot of experience with the local cloud, perhaps he can shed some light on this? Let's tag @Dave as well, that never hurts :wink:

Like @ScruffR mentioned, you need to perform a POST request to trigger a function, so could you perhaps elaborate on how you do that?


What 'Apps' would that be, and how do you make it 'listen'?

I would recommend listening to the public firehose on a local :cloud:. (To be honest, the local :cloud: does not have the idea of multiple segregated accounts, etc.)

There’s a bug with SSE on the local :cloud:, so for now simply listen for messages on the public firehose: http://DOMAIN_OR_IP/v1/events

Spark functions should really be treated like ISRs: they should contain a minimal amount of code and return quickly. I would suggest that you simply set a flag in your Spark function, then check for that flag and handle it in your loop().

Hi Guys

Thank you all for the fast replies; I appreciate it and have learned a lot from you.

My issue is a little complex to test, since it depends on how fast the URL is called. Everything is fine if I call that URL slowly.

@Moors7, I just followed your method and tested it, and now I'm sure the Spark Core loses publish messages (the loop function sometimes blocks when a message gets lost). My app is an Ajax-based app which uses the standard SSE API.

Hi @ScruffR, I do use a POST request for calling the function. Could you please explain how to set the publish rate limit on the local server?

Hi @kennethlimcp, is the Spark protocol in the local server the same as in the Spark Cloud?

@harrisonhjones, could you please give me more detailed info?

I'd have to hand this question on to someone using the local server, like @kennethlimcp - I'm not the guy for these topics :wink:

1.) Spark-protocol and Spark-server are the stripped-down basic versions, but similar to the actual Spark cloud.

There’s more to consider with thousands of Cores connecting, the build farm, etc. So there are definitely some differences, but the CoAP protocol and API implementation are the same.

2.) I don’t think there is a rate limit, or should I say, there might be a setting that we can change for publishing. There’s some throttling both in firmware and in the cloud, so that’s tricky.

I will have to take a look at the code to find the setting, though.

What I mean is something like this (note, this is pseudo code):

bool sprkFnCalled = false;

void loop() {
    if (sprkFnCalled == true) {
        sprkFnCalled = false;
        // handle the call here (e.g. open the relay, publish)
    }
}

int sprkFunction(String str) {
    sprkFnCalled = true;
    return 1;
}

Oh, I understand you, @harrisonhjones, I will try it. By the way, can I simply add delay(1000) instead of your method?

A delay where? Try to keep delays out of Spark functions. They can reasonably be anywhere you want in loop().

Hi @kennethlimcp,

In your experience, how many connections can the protocol server handle?

By protocol server I mean the Spark-Protocol (TCP server), not the local server.

Hi @yuanetking,

The coap protocol / spark-protocol / local server can easily handle hundreds of connections or more. :slight_smile:


@Dave, wow, you are so cool!