I’ve been running into mangled data when publishing events from within an event handler. (My original goal was a sort of printf-style debugging/error reporting over events instead of the Serial interface, at least for now.) As far as I can tell, this only happens when the name of the event being published is longer than the name of the event that was received. For example, with the following program:
// Echoes the payload under a longer event name than the one subscribed to — this one gets mangled
void signal1(const char *event, const char *data) {
    Spark.publish("received_signal1", data, 60, PRIVATE);
}

// Echoes the payload under a shorter event name — this one works fine
void signal2(const char *event, const char *data) {
    Spark.publish("rec_s2", data, 60, PRIVATE);
}

void setup() {
    Spark.subscribe("signal1", signal1, MY_DEVICES);
    Spark.subscribe("signal2", signal2, MY_DEVICES);
}

void loop() {
}
Here’s an example interaction, where I’ve sent the “signal2” event first, and then the “signal1” event:
{"name":"signal2","data":"foo","ttl":"60","published_at":"2015-08-14T15:55:59.562Z","coreid":"001"}
{"name":"rec_s2","data":"foo","ttl":"60","published_at":"2015-08-14T15:55:59.579Z","coreid":"53ff..."}
{"name":"signal1","data":"foo","ttl":"60","published_at":"2015-08-14T15:56:03.115Z","coreid":"001"}
{"name":"received_signal1","data":"d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_signal1�d_sig","ttl":"60","published_at":"2015-08-14T15:56:03.138Z","coreid":"53ff..."}
It looks like publishing a longer event name overruns some internal buffer: the bad output repeats "d_signal1", i.e. "received_signal1" with its first seven characters ("receive") cut off, and seven is exactly the length of the received event name "signal1". I’ve tried a factory reset and reproduced the same behavior on a second Core, just in case. Any help would be greatly appreciated.