Burst of publishing data on local Cloud

I am running a local version of the cloud with two Cores connected. Each Core has a sensor whose data I would like to publish to the server at a really high frequency (every 100 ms, for example).

I have seen in some posts that Spark.publish() has a limit of 4 events per second.
Since I am running my own server, is it possible to remove that limitation so that I can sample the sensor at the frequency I want?

Hi @alexstyl, it should be possible if you look at the source code. I’ve been looking into this same question on and off but never had the time to fully investigate it.

The pipeEvents function and, more generally, EventViews001.js could be a good starting point:

Let me know if you find it!

Hi @alexstyl,

Good question! The event rate limiting exists in two places, one is on the cloud, and the other is in the firmware. The firmware’s rate limiting is slightly more aggressive, but if you’re working locally you can certainly disable it.

If you’re using Cores, then you’ll want to look at 0.3.4, the last firmware released for the Core. You can see some of the rate-limiting code in the supporting core-communication-lib library here:

  if (now - recent_event_ticks[evt_tick_idx] < 1000)
    // exceeded allowable burst of 4 events per second
    return false;



Hi @Dave, thanks for providing the exact place where the burst logic is done in firmware.

Do changes made to spark_protocol.cpp get compiled within user firmware, or is it part of system firmware?

Also, where is the event limiting applied on the cloud side? I’m still looking at EventViews001.js and api_v1.js but couldn’t find where the server-side throttling is happening.

p.s. I found a related thread discussing this as well and it was helpful in explaining how the firmware throttling is done:

It looks like modifying spark_protocol.cpp did the trick to adjust the burst logic on local clouds.

I had to do a one-time re-flash of the system firmware on the Photon for this to work – rebuilding and reflashing user firmware alone was not enough.


Hi @chuank,

Totally, the location on the photon is slightly different, since it’d be using the newer HAL setup, but it sounds like you found it. It’s been a little while since I was in the local server codebase, but if I remember correctly, I left out the rate limiting on the local server for exactly this scenario :slight_smile:



Just to clarify for posterity, spark_protocol.cpp can be found . . .

for Core, in /core-communication-lib/src

for Photon, in communication/src

or, if on a Mac, just type spark_protocol.cpp into Spotlight!

Thanks for keeping the thread alive as we transition to Photon.



Hi @chuank,

I am currently trying to do this but I can’t. Can you explain what you changed to make it work, please? I’m a beginner, so I don’t understand exactly what has to change here: the cloud, or the firmware of the Photon? How did you do the one-time re-flash of the system firmware?

Best regards,

Jordan Assayah

Hi @JordanAssayah

First thing is this: you need to run your own local cloud before updating your Photon’s system firmware to disable rate limiting – Particle’s cloud implements its own rate limiting on the server side, plus it’s not too nice to attempt that on their cloud anyway :smile:

Local cloud instructions can be found here:

To disable rate limiting on your local cloud, you’ll need to download the firmware repository locally and modify the source code. Look at spark_protocol.cpp and comment out the rate-limiting code:
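For reference, the change amounts to short-circuiting that burst check. Here’s a hedged sketch of what the edit looks like (the function name and surrounding code are invented for illustration; the actual code in spark_protocol.cpp differs between Core and Photon firmware, so find the equivalent check in your version):

```c
#include <stdbool.h>
#include <stdint.h>

/* Sketch of the firmware's event-rate check with the limit disabled.
   The commented-out lines mirror the 0.3.4 snippet quoted earlier in
   the thread; everything else here is illustrative scaffolding. */
bool event_rate_ok(uint32_t now)
{
    /* Rate-limit check commented out for a local cloud:

       if (now - recent_event_ticks[evt_tick_idx] < 1000)
         // exceeded allowable burst of 4 events per second
         return false;
    */
    (void)now;    /* timestamp no longer consulted */
    return true;  /* every publish is now allowed */
}
```

After rebuilding with this change, remember (as noted above) that it lives in system firmware, so user-firmware-only flashes won’t pick it up.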


Finally, system firmware update instructions are outlined here:

Hello @chuank !

Sorry for my late response! Thank you very much for explaining how to do the trick :smile:

It helped me a lot!

Best regards,

Jordan Assayah
