Subscribing to an event within a Spark Core does not work

Hello,

I have 3 Spark Cores doing different things and they all seem to work just fine. On some of my Cores I’m publishing events like this:

Spark.publish("motion-detected/1", NULL, 60, PRIVATE);

On another Core I want to subscribe to this event like this:

Spark.subscribe("motion-detected", motionDetected, MY_DEVICES);

However it is not working - I never receive these events on any core.

I’ve also tried different event-names to no avail …

The events themselves are triggered correctly as I can see on

https://api.spark.io/v1/devices/events/?access_token=XXXXXXXXXXXXXXXXXXX

But regardless of which core I choose, the event never arrives … They are connected to the cloud (at least I can call functions and get variables from them) …

Any ideas?

The event you publish must be exactly the event you are subscribing to, but in your code this is not the case.
The extra /1 has to be part of the subscription event, too.

If you want to know which Core has published the event, you either put this info into the event payload or you pull the device ID out of the event's standard info.

I’ve already tried this - but according to the documentation it should work anyway:

http://docs.spark.io/firmware/#spark-subscribe

A subscription works like a prefix filter. If you subscribe to “foo”, you will receive any event whose name begins with “foo”, including “foo”, “fool”, “foobar”, and “food/indian/sweet-curry-beans”.

I see, I must have missed that :blush:

Just out of curiosity. Have you tried a public event instead and a “public” subscribe?

Can you show some more of your code? Especially the event handler.

And just to be sure, all your cores are registered to the same Spark account?

@mdma: Yes, they are registered and flashed from the same account …

@ScruffR:
I’ve already tried this and it did not work …

@McTristan, you have not tried all of it, yet :wink:

@McTristan, have you tried without the /1 in the event name so both event name and subscribe name are exactly the same?

@peekay123:

yes I already have …

@ScruffR:

void motionDetected(const char *event, const char *data)
{
    int randomNumber = rand() % 3;

    // the following line just plays a sound - this method works in a loop without a problem ...
    wtv020sd16p.asyncPlayVoice(randomNumber + 1);
}

Strange - with a public event it seems to work now, though it wasn’t working before … Neither MY_DEVICES nor PRIVATE seems to work, however …

Sometimes the Universe gangs up against you, but once you call for help, it doesn’t dare to anymore :wink:

Bear this in mind and retry things from time to time.

Glad things are working to a certain extent at least :+1:

But I guess a motion detector is nothing for a public cloud :confused:

Agreed, but now you could start narrowing things down, and maybe file an issue, once you have nailed down the culprit.


Just to be on the safe side, always try to incorporate some means of checking the action that doesn’t involve “black box code” which might not play well with some other “black box code” (e.g. subscribe handler invocation vs. wtv020sd16p).
I’d either do a D7 LED blink or a Serial.print().

Well I’ve written the library myself so I know what it does but in general I agree of course :wink:

Assuming everything flashed okay, I’ve got it working with PRIVATE.
First Core:

Spark.publish("color_values", pubstring, PRIVATE);

Second Core:

void setup() {
    Spark.subscribe("color_values", myHandler);
}

void myHandler(const char *event, const char *data)
{
    Spark.publish("pubdata", pubstring, PRIVATE);
}

I’m watching my events with a Node.js script. Once I fire a function on the first Core, two events are returned, indicating the subscribe works.

Well, I do believe it has to work for quite a lot of other people, otherwise the community would be full of complaints … for me, my setup, my account or whatever it is - it only works for public events :confused:

“spark subscribe mine” lists all my private events :confused:

Have you performed the deep update/CC3000 patch, if applicable? A factory reset might also help.

Factory reset - yes - multiple times on different cores …

I’m not sure about the patch you mentioned - is there a how-to somewhere?

Yep, over here: http://docs.spark.io/troubleshooting/#deep-update

I’ve done it now on one of my cores - however it does not receive private events. Public events do work.