API limits for Spark Cloud

I'm wondering if there are any limits on the number of API calls per hour to the Spark Cloud.

Let's say I want to log data from a set of sensors connected to Cores. Can I poll 20 Cores every 5 seconds and log the output data without hitting any API limits?
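
For concreteness, here's roughly the kind of polling loop I have in mind, using the variable endpoint from the docs. Just a sketch; the device IDs, variable name, and access token are placeholders:

```python
import json
import time
import urllib.request

# Placeholders -- substitute real device IDs, variable name, and access token.
ACCESS_TOKEN = "your-access-token"
DEVICE_IDS = ["core-id-1", "core-id-2"]  # ... up to 20 Cores
VARIABLE = "sensorValue"                 # exposed via Spark.variable() in firmware

def read_variable(device_id):
    """GET one exposed variable from one Core through the Spark Cloud."""
    url = (f"https://api.spark.io/v1/devices/{device_id}/{VARIABLE}"
           f"?access_token={ACCESS_TOKEN}")
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["result"]  # the variable's current value

while True:
    for device_id in DEVICE_IDS:
        print(device_id, read_variable(device_id))
    time.sleep(5)  # one sweep of all Cores every 5 seconds
```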

Also, I see that the documentation mentions Server-Sent Events as an upcoming feature. Any time estimate on this?

Thanks for the great question @sjunnesson!

Right now API rate limiting is not at the top of the priority list, but we’ll add it soon. We haven’t yet determined what the limits will be. In determining them, we will look for a balance between a few factors: the overall load on the Cloud as more people get their Cores, our shifting estimates of ongoing bandwidth costs, and our commitment to keeping the user experience absolutely as high quality as we can.

As you alluded to, the right way to get sensor data from a Core is with events: either by opening an SSE stream or by registering callbacks to your servers as events are generated (both features are in our backlog). Events are high on our priority list, but they didn't make it into the current sprint, which runs through December 20th. I expect there's a good chance they'll at least be started in the sprint after that (12/23–1/3) and probably fleshed out and debugged in the following one (1/6–1/17).

So my estimate: In early January you’ll have some kind of access to events generated by your Cores, and by mid-January they’ll be more stable and full-featured.
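
To make the SSE option concrete: consuming the stream should look something like the minimal sketch below. Since the feature hasn't shipped yet, the `/v1/events` path and parameters are assumptions, not a final API:

```python
import urllib.request

ACCESS_TOKEN = "your-access-token"  # placeholder
# Assumed endpoint -- the real path may differ once the feature ships.
URL = f"https://api.spark.io/v1/events?access_token={ACCESS_TOKEN}"

def sse_events(url):
    """Yield (event_name, data) pairs from a Server-Sent Events stream."""
    event, data = None, []
    with urllib.request.urlopen(url) as stream:
        for raw in stream:                    # HTTPResponse iterates by line
            line = raw.decode("utf-8").rstrip("\r\n")
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data.append(line[len("data:"):].strip())
            elif line == "":                  # blank line terminates one event
                if data:
                    yield event, "\n".join(data)
                event, data = None, []

for name, payload in sse_events(URL):
    print(name, payload)
```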

Thoughts?

I think that sounds like a sensible plan, and I completely understand that there are tons of things for you to do and you have to prioritize.

Hammering the API with requests is obviously not a desirable long-term path for my application, but knowing that I can do that for now, which it sounds like I can, and then switch over to a more sensible event-based approach in the near future sounds great.
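
For what it's worth, I'm planning to keep the transport behind a small seam so the switch is painless later. A rough sketch (`poll_readings`, `log_readings`, and the stubbed `read_variable` are hypothetical names of my own, not part of the Spark API):

```python
import time

def read_variable(device_id):
    """Stand-in for the Spark Cloud GET from the sketch above."""
    return 42

def poll_readings(device_ids, interval=5):
    """Polling-based source: yields (device_id, value) pairs forever."""
    while True:
        for device_id in device_ids:
            yield device_id, read_variable(device_id)
        time.sleep(interval)

def log_readings(source):
    """The logger never touches the transport, so an event-based generator
    producing the same (device_id, value) pairs can replace polling later."""
    for device_id, value in source:
        print(device_id, value)

log_readings(poll_readings(["core-id-1", "core-id-2"]))
```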
