Curious how many subscriptions/cores people are running simultaneously. For background: in the short term we’re looking at up to a couple hundred cores publishing events frequently (the actual interval is TBD). Some quick stress testing suggests there’s an upper limit on the cloud’s ability to handle published events and the corresponding subscriptions, so I was wondering whether anyone has run into this and how they’ve handled it.
Just to be sure: are you aware of the publishing limit of 1 publish per second, with bursts of up to 4 allowed? This might be what’s causing the issues.
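If it helps, here’s a minimal token-bucket throttle sketch for firmware, assuming the Spark Core Wiring API (`Spark.publish`); the event name and payload are hypothetical placeholders. It refills one publish "token" per second up to a burst of 4, which matches the documented limit:

```cpp
// Hypothetical throttled-publish sketch: 1 publish/s average, bursts of 4.
const int MAX_BURST = 4;
int tokens = MAX_BURST;           // start with a full burst available
unsigned long lastRefill = 0;

void setup() {
}

void loop() {
    // Refill one token per second, capped at the burst size.
    if (millis() - lastRefill >= 1000) {
        lastRefill = millis();
        if (tokens < MAX_BURST) tokens++;
    }
    // Only publish when a token is available, so we never exceed the limit.
    if (tokens > 0) {
        tokens--;
        Spark.publish("sensor-reading", "42");  // placeholder event name/data
    }
    delay(100);
}
```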
You can watch the public events streaming at: https://api.spark.io/v1/events?access_token=XXXXXXX
There’s a huge amount of data posted there every second and I believe it’s performing great.
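For anyone who wants to watch that firehose programmatically rather than in a browser, here’s a small sketch of a host-side client using libcurl (an assumption on my part; any HTTP client that can stream a long-lived response will do). Replace `XXXXXXX` with a real access token:

```cpp
// Stream the public server-sent events feed and print chunks as they arrive.
#include <curl/curl.h>
#include <cstdio>

// libcurl write callback: dump each received chunk straight to stdout.
static size_t on_data(char *ptr, size_t size, size_t nmemb, void *) {
    fwrite(ptr, size, nmemb, stdout);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    curl_easy_setopt(curl, CURLOPT_URL,
        "https://api.spark.io/v1/events?access_token=XXXXXXX");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, on_data);

    CURLcode res = curl_easy_perform(curl);  // blocks, streaming events
    if (res != CURLE_OK)
        fprintf(stderr, "curl error: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```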
No, I wasn’t aware–thanks!
Well, now you are.
Part of this can be overcome by using the Local Cloud, as it enables you to customize or disable the publish limit. The subscribe function, however, has unfortunately not yet been implemented there. So for the time being you’ll have to either live with the rate limiting or roll a solution of your own.
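For reference, subscribing firmware-side against the (hosted) Spark Cloud looks like the sketch below, assuming the Spark Core Wiring API (`Spark.subscribe`); the event prefix and handler are hypothetical. This is the piece that the Local Cloud doesn’t support yet:

```cpp
// Handler invoked for each matching event received from the cloud.
void myHandler(const char *event, const char *data) {
    // React to the incoming event here.
}

void setup() {
    // Listen for public events whose name begins with "temperature"
    // (subscribe matches on event-name prefixes).
    Spark.subscribe("temperature", myHandler);
}

void loop() {
}
```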