I have two Spark Cores subscribed to each other's events via Spark.subscribe(), and each periodically (every 2 min) publishes an event to the other via Spark.publish(). They are on different networks.
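For reference, the firmware on each Core looks roughly like this (a minimal sketch; the event names, the handler name, and the payload are placeholders for my actual code):

```cpp
// Minimal sketch of the setup on each Spark Core.
// "remote-event", "local-event", and myHandler are placeholders.
unsigned long lastPublish = 0;

void myHandler(const char *event, const char *data) {
    // process the event published by the other Core
}

void setup() {
    Spark.subscribe("remote-event", myHandler);
}

void loop() {
    if (millis() - lastPublish > 120000UL) {  // every 2 minutes
        lastPublish = millis();
        Spark.publish("local-event", "payload");
    }
}
```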
The setup works great for many hours, but after half a day or so one Spark no longer processes the published events from the other. Both Sparks are still connected (I think a reconnect happens in between, because the Internet upstream reconnects each night).
If I subscribe to the same event from elsewhere (a web page), I CAN see that both Sparks are still publishing their events successfully, even in the broken state after some hours of uptime.
I narrowed it down further by publishing a debug event each time my event handler is called; in the broken state, that debug event never appears.
So I know that both Sparks are still connected to the cloud and both are still publishing, but one has somehow lost its Spark.subscribe() subscription, and my event handler is no longer called.
Is this a known problem?
Is there anything necessary or available to periodically “refresh” a Spark.subscribe() subscription?
Could it be useful to periodically call Spark.subscribe() again with the same event?
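If re-subscribing is safe, something like this is what I have in mind (again just a sketch; I don't know whether calling Spark.subscribe() a second time with the same event name registers a duplicate handler or is otherwise harmful):

```cpp
// Periodically re-register the subscription as a workaround attempt.
// Assumes repeated Spark.subscribe() calls with the same event are harmless,
// which I have not verified. Names are placeholders from the sketch above.
unsigned long lastResubscribe = 0;

void loop() {
    if (millis() - lastResubscribe > 3600000UL) {  // once per hour
        lastResubscribe = millis();
        Spark.subscribe("remote-event", myHandler);
    }
}
```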