Is there Function call throttling?

I have a ‘protocol’ that uses a function result of 0 as success.

But it appears that if I run tests with many function calls in quick succession, I start getting

{ id: '...',
  last_app: '',
  connected: true,
  return_value: 0 }

Even though the device is not behaving as if the function had run. I swear I added publish logging to every possible place that could return 0… maybe I missed something.

Usually I just wait a while, and then I can run the test again with full success, still using many bursty function calls.

Is it possible that I am being throttled somehow, and the return_value defaults to 0? If so, I really wish there were another parameter in the result telling me I am being throttled, or at least documentation of the throttling limits.

One thing you forgot: publishes are limited to one per second, which would be your throttling condition.

Function calls aren’t limited as far as I’m aware. Neither are variable calls. Server Sent Events, and thus webhooks do have their limitations. Like @tezza mentioned, there’s a rate limit on publishes of 1p/s, with bursts of up to 4p/s allowed, after which you get a cooldown until you’re back under your limit.
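To picture that limit: "one per second on average, with bursts of up to four" behaves like a token bucket. Here's a rough sketch of that behaviour in plain C++ (this is only a model of the quoted numbers, not Particle's actual implementation; the class and method names are made up for illustration):

```cpp
#include <algorithm>

// Token bucket modelling the quoted publish limit:
// capacity 4 (the allowed burst), refilling at 1 token per second (the average rate).
class PublishLimiter {
public:
    // now_ms: caller-supplied timestamp in milliseconds.
    // Returns true if a publish is allowed right now, false if throttled.
    bool tryPublish(unsigned long now_ms) {
        // Refill one token per elapsed second, capped at the burst size of 4.
        unsigned long elapsed = now_ms - last_ms_;
        tokens_ = std::min(4.0, tokens_ + elapsed / 1000.0);
        last_ms_ = now_ms;
        if (tokens_ >= 1.0) {
            tokens_ -= 1.0;   // spend a token on this publish
            return true;
        }
        return false;         // throttled: cooldown until tokens refill
    }

private:
    double tokens_ = 4.0;         // start with the full burst allowance
    unsigned long last_ms_ = 0;
};
```

With this model, four publishes back-to-back succeed, the fifth is rejected, and after roughly a second of waiting one more slot opens up, which matches the "cooldown until you’re back under your limit" behaviour described above.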


Didn’t think I was wrong on that point @Moors7; I came across that limit myself and worked out what it was doing.
@cdeadlock I'm not sure there are any web applications that will let you display more than a couple of changes per second. If you want it faster than that, you might think about putting up your own virtual machine in the cloud to process the data.

The publish limit is actually in the docs.
Depending on the number of requests you’re trying to make, you might be going too quickly. Keep in mind that each request may have to go halfway across the globe, depending on where you’re located. Light can only travel so fast, and it takes time to process things. For the fastest possible connection, a local cloud on your own network is perhaps the best solution, although you could do direct TCP/IP as well.

Are you seeing the return value of 0 all the time, or just once in a while?

I am aware of the publish throttling of one per second, and my debugging code specifically accounts for that by limiting itself to at most one message per second. In this case I disabled all logging except for when a 0 was about to be returned from the function call (which never happened at all). I am sure this is not related…

So I am currently down a bit of a rabbit hole trying to figure out what's going on. I changed my code to never return 0:

const int vartotal = 10;

int adminfunc(String cmd){
  int toreturn = admindisp->dispatch(cmd);
  if (toreturn == 0){
    log->log("getclaim","admin 0");
    return (1 << vartotal);
  }
  return toreturn;
}

void setup(){
  Particle.function("admin", adminfunc);
}

I was still getting zero result codes, so something wasn’t recompiling properly… I renamed my main file from .ino to .cpp,
and then it complained because I had my adminfunc() below the setup() function. Which shouldn’t have compiled for the last month or more??? I actually have no idea what an .ino file is or why it allows out-of-order function definitions (or why code wasn’t recompiling when I changed files; I am using the Atom editor on Ubuntu Linux).

Also I am dealing with a bad WiFi connection at the moment and waiting for a better antenna for my router… so I can not continue my tests for a couple of days (which may explain why the device is now breathing magenta (red/blue?)).

Sorry to be confusing. To sum it up: I will soon remove all chances of a 0 return code to try to prove what's going on here.

For a while I could run a series of tests with 10 or more function calls in as quick succession as node/http/cloud allows, and everything worked as expected…

Then if I ran the same thing again, three calls (before my first assertion) would come back with return_value: 0, with no indication that my function was actually run on the device.

Then I would just wait a while… assuming throttling… and then it would work again.

Just give me a couple days to remove all chances of a 0 return code and I can test this further.

That’s easy to answer :wink:
For .INO files the Particle preprocessor takes care of some things users new to C might not think of.
So it’ll add #include "application.h" and function prototypes as well as some other things.

All this is not done for .H and .CPP files, so you’d need to provide function prototypes and add #include "application.h" yourself.

If you add #pragma SPARK_NO_PREPROCESSOR as the first line of your .INO, you’ll see the same behaviour as with .CPP.
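To illustrate what the generated prototypes do: in a .CPP file the compiler must have seen a declaration of a function before the first call site, which is exactly what broke when adminfunc() sat below setup(). A minimal, generic C++ sketch of the rule (the names echo the code above but are just for illustration, and this is plain C++, not Particle firmware):

```cpp
#include <string>

// In a .CPP file this forward declaration is mandatory, because
// runCommand() below calls adminfunc() before it is defined; in an
// .INO file the Particle preprocessor generates this line for you.
int adminfunc(std::string cmd);

// Stands in for setup()/loop() sitting above the function definition.
int runCommand(const std::string& cmd) {
    // Without the prototype above, this call would fail to compile.
    return adminfunc(cmd);
}

// Defined after its first use, just like adminfunc() below setup().
int adminfunc(std::string cmd) {
    return (int)cmd.length();
}
```

Delete the prototype line and a .CPP build fails with an "adminfunc was not declared" style error, which is the complaint seen after renaming the file.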

Alright I have absolutely confirmed this behaviour. (Now with a different return code than 0)

I think there is some kind of time-limited caching going on?

I have a test script in node.js using the javascript client.

The test entails calling many functions in quick succession; I have debug code that outputs a single publish when a function is run on the device.

So I can run the test and see the debug outputs:

{ id: '...',
  last_app: '',
  connected: true,
  return_value: 1024 }

{ id: '...',
  last_app: '',
  connected: true,
  return_value: -2 }

{ id: '...',
  last_app: '',
  connected: true,
  return_value: -2 }

...a bunch of other calls; the last result code is 1024 (success)

and in another window, the output of subscribe mine:
{"name":"/log/debug/getconfirm","data":"getclaim ran","ttl":"60","published_at":"2015-09-30T21:41:23.578Z","coreid":"..."}
{"name":"/log/debug/getconfirm","data":"getclaim ran","ttl":"60","published_at":"2015-09-30T21:41:24.435Z","coreid":"..."}
{"name":"/log/debug/getconfirm","data":"getclaim ran","ttl":"60","published_at":"2015-09-30T21:41:26.339Z","coreid":"..."}

Now if I run the same test again immediately I will get

{ id: '...',
  last_app: '',
  connected: true,
  return_value: 1024 }
{ id: '...',
  last_app: '',
  connected: true,
  return_value: 1024 }
{ id: '...',
  last_app: '',
  connected: true,
  return_value: 1024 }

Which is impossible; there should be two -2 results after the first 1024.
And there are no debug outputs, so I know my function has not run on the device, yet the cloud is giving me some kind of “cached” return_value.

I can reset the device or wait a while, and it will work properly again.

At first this feels like an HTTP caching problem to me, but I am not entirely sure why resetting the device would fix it, or what triggers the “cache”-only responses in the first place.

Are you running into the publish throttling limit of one per second on average with a burst of four allowed?

I don’t think there is a limit on function calls other than the time it takes to go from you to cloud to device and back.

As @bko has just pointed out, and as others did a while back:

Particle.publish() is not a good way to check timing :wink:

Try a Serial.print() or LED blinking - that’ll better reflect what’s actually going on.

Yes, you’ve confirmed the well known publishing limit :wink:


This isn't anything to do with the publishing limit. As you can see in my example, my debug messages are more than one second apart. They just prove that a function has run on the device.

Which isn’t happening, yet I still get a return_value.

Some more info: I am using the promises API of the javascript client

Definitely some weirdness going on. Now I am seeing:

{ id: '...',
  last_app: '',
  connected: false,
  return_value: 11520 }

Connected says false, but it is still returning valid return_values.

I would assume this is related to Two cores breathe cyan, but "offline" [RESOLVED]

I am seeing this more and more: after programming, sometimes the device will breathe cyan even though there was no “online” event, and it appears to stay disconnected until a reset.

I guess your observations must be due to response time rather than an actual indication of any request limiting.
Just for a quick test I flashed this to my Photon

int i;

void setup() {
  RGB.control(true);   // take manual control of the RGB LED so RGB.color() works
  Particle.function("fn", fn);
}

void loop() {
  RGB.color(i & 0x01 ? 255 : 0
           ,i & 0x02 ? 255 : 0
           ,i & 0x04 ? 255 : 0);
}

int fn(String cmd) {
  i = cmd.toInt();  // reconstructed: pick the colour index from the command argument (this line was lost in the original post)
  i &= 0x07;
  return millis();
}
And then I tested with this web page how quickly I could change the RGB LED by clicking the respective function button as fast as possible, and the reaction was definitely way faster than one or two per second.

Here are some timings (limited by the speed of my finger rather than the cloud ;-))


And this is from Central Europe to San Francisco and back via a HSDPA hotspot :wink:


All right, I have confirmed this was all my own screwup: I had a cache that fills up and clears on a timer, and I was testing too fast.
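For anyone hitting something similar, the pattern looks roughly like this (a generic C++ sketch, not my actual code; the names and the TTL value are invented for illustration): a lookup within the time-to-live hands back the stale stored value instead of recomputing, which looks exactly like the cloud "caching" a return_value.

```cpp
#include <map>
#include <string>

// A value cached with an expiry timestamp: lookups within the TTL
// return the previously stored value instead of recomputing it.
class TtlCache {
public:
    explicit TtlCache(unsigned long ttl_ms) : ttl_ms_(ttl_ms) {}

    // Store a value; it stays valid for ttl_ms from now.
    void put(const std::string& key, int value, unsigned long now_ms) {
        entries_[key] = Entry{value, now_ms + ttl_ms_};
    }

    // Returns true and fills *out while the entry is still fresh.
    bool get(const std::string& key, unsigned long now_ms, int* out) {
        auto it = entries_.find(key);
        if (it == entries_.end() || now_ms >= it->second.expires_ms)
            return false;              // missing or expired
        *out = it->second.value;
        return true;
    }

private:
    struct Entry { int value; unsigned long expires_ms; };
    std::map<std::string, Entry> entries_;
    unsigned long ttl_ms_;
};
```

Run the test suite twice within the TTL and the second run only ever sees the cached value; wait past the TTL (or reset, which throws the cache away) and everything computes fresh again, which is exactly the "wait a while and it works" symptom from earlier in the thread.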

Any other problems I had with bad return values were, I think, because my Atom IDE would “compile in the cloud” against the latest firmware version while the firmware on the device was not the latest. Regardless, my latest code changes behaved correctly once I made sure to update the device firmware as well.

Or correct me if I am wrong and the Atom IDE or cloud compile knows what firmware version to compile against based on the selected hardware?

No, it doesn’t. For the time being it builds against the latest firmware, but there are some efforts with an alpha version of a local-build extension for Dev, where you can select the target version manually.