Subscribing to an event within a Spark Core does not work

Sorry if this seems like going in circles, but have you tried the different combinations of “private publish”, “public publish”, “subscribe all” and “subscribe MY_DEVICES”?
Which ones work, and which don’t?
Have you stripped all (forgive the expression) “obscure” code from the event handler down to the absolute minimum (e.g. just set a variable that gets published independently in loop())?
Can you set up your Cores with some minimal publish/subscribe-only firmware, to find out whether it’s your project code or your Cores/account/…?
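
For instance, a minimal test firmware like this (just a sketch – “testEvent” and the 10-second interval are arbitrary; swap in PRIVATE/MY_DEVICES to test the other combinations):

    #include "application.h"

    // Minimal publish/subscribe test: light the D7 LED when the event arrives
    void testHandler(const char *event, const char *data)
    {
        digitalWrite(D7, HIGH);
    }

    void setup()
    {
        pinMode(D7, OUTPUT);
        Spark.subscribe("testEvent", testHandler);  // public subscribe
    }

    void loop()
    {
        Spark.publish("testEvent", "ping");         // public publish
        delay(10000);                               // one event every 10 seconds
    }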

I’ve tried:

- public publish / public subscribe -> works
- public publish / private subscribe -> does not work
- private publish / private subscribe -> does not work

I’ve also tried a very simple sketch that does nothing but light up the LED on D7 – but it does not work. On all three Cores, only public / public seems to work.

The command “spark list” lists all devices correctly …

Sorry for my impertinence, but could you also try private publish -> subscribe all? Just to have it all tested :wink:

And could you also check whether your Spark.subscribe() returns true, to indicate it hooked up successfully?
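
E.g. something like this (just a sketch, with placeholder names):

    bool hooked = Spark.subscribe("myEvent", myHandler, MY_DEVICES);
    Serial.println(hooked ? "subscribe OK" : "subscribe FAILED");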

Nothing to be sorry about :wink:

Spark.subscribe() returns true in all cases - private publish / subscribe without filter doesn’t work either … :frowning:

I’m not sure if this applies to your problem too, but I have filed a publish/subscribe issue.

I stumbled over the same issue when trying the different scenarios.

Needless to say, I never got private publish / private subscribe to work on any Core – but subscribing to an event on the same Core that publishes it does not seem to work either.

In general, I realized that a lot of published events never trigger any action on the subscribing Core (public publish / public subscribe) – roughly 75% of them work, and the rest are simply dropped.

I can see them on the api.spark.io website, but the handler functions on the other Cores won’t be called – I tried it with different Cores, different events, different payloads – it’s just not very reliable :confused:
I’m not publishing many events either (during my test I was sending about 3 events a minute), nor are my Cores busy with other things.

I might just leave it there – I was kind of hoping to build a publish/subscribe mesh of some kind between my Cores, and I really like the cloud idea, but it has to work every time :confused:

@mdma, @Dave, any comments on these publish/subscribe dropouts? Could the dropouts be related to the fixed 60 s TTL? And are published events sent to subscribers asynchronously as they are received, or are they queued and sent at fixed intervals? :smile:

Are you sure your code is correct? I don’t think the overloads are right.

I think here:

Spark.publish("color_values",pubstring,PRIVATE);

You are actually doing

    Spark.publish(const char *eventName, const char *data, int ttl);

Passing in PRIVATE as the TTL (which doesn’t do anything, because TTL is not implemented according to the docs).
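
In other words, the overload set looks roughly like this (a simplified model based on the docs – the exact enum type name is my guess, but PRIVATE is an enum constant that converts to int):

    // Simplified model of the Spark.publish() overloads:
    bool publish(const char *eventName, const char *data);
    bool publish(const char *eventName, const char *data, int ttl);
    bool publish(const char *eventName, const char *data, int ttl, Spark_Event_TypeDef eventType);

    // PRIVATE converts to int, so this call silently picks the ttl overload,
    // i.e. ttl = PRIVATE = 1 (one second) and the event stays public:
    Spark.publish("color_values", pubstring, PRIVATE);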

I just tested your code and it works. But as soon as you do:

    Spark.publish("color_values", pubstring, 60, PRIVATE);

The events don’t get published to another Core. The Spark-CLI receives the events fine, though.
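
(I was watching the stream with the CLI – if memory serves, the command was something like:

    spark subscribe mine

assuming the spark-cli of the time.)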

I might be incorrect here.


@Carsten4207, GOOD CATCH!

I have read that bit over and over, but it never caught my eye.

@McTristan, maybe this helps. Since PRIVATE as TTL would cause the event to vanish after one second, this might have contributed to your problems.
There might have been a compiler warning about type safety, but we never looked :wink:

Try, as @Carsten4207 suggested

   Spark.publish("pubdata", pubstring, 60, PRIVATE);

instead

Edit: Sorry – as I went back to the top of this thread, I saw that you've already used the correct overload anyway.

Yeah, what I was pointing out was that even if you pass in PRIVATE correctly, it doesn’t work.


Hey All,

This has been bugging me too! So I pushed a fix for it today – sorry about the delay! I think if you try private subscribe, it should now work as expected. :slight_smile:

Thanks,
David


Hey Dave,

seems to work now - great!!!

Thanks,

Sandro

P.S.: I’ve also contributed some libraries to the Web IDE – the wtv020sd16p and the Keypad library.


Awesome, thanks! :slight_smile:

After testing it for a long while, it seems the Cores that are subscribed to an event lose the subscription/connection after a while (usually hours, sometimes about half an hour). So it is unfortunately still not very reliable – it does, however, at least no longer matter whether the event is published privately or publicly :confused:

I know you guys can fix it – heck, I’ve even (pre)ordered a couple of Photons because I believe in you :wink:

It could be because the cloud connection was momentarily dropped and reconnected, since the subscription is lost if the cloud connection goes down. There’s a pending issue to fix this.
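
Until that’s fixed, a possible workaround is to re-subscribe whenever the connection comes back – an untested sketch, with myEvent/myHandler as placeholders. Note that every Spark.subscribe() call counts against the Core’s limit of 4 registered handlers, so this is more the idea than a production fix:

    bool wasConnected = false;

    void loop()
    {
        bool isConnected = Spark.connected();
        if (isConnected && !wasConnected) {
            // Connection (re)established - subscriptions were lost, hook up again
            Spark.subscribe("myEvent", myHandler, MY_DEVICES);
        }
        wasConnected = isConnected;
    }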

That would explain it, yes. It would be great if there were a fix… I’m full of ideas, and with 6 Cores soon the fun really begins :wink:

Hey, I am also having this issue. It used to work but now it doesn’t. Is it because of the note: “A Core can register up to 4 event handlers. This means you can call Spark.subscribe() a maximum of 4 times; after that it will return false.”?
Any help is appreciated, thanks in advance.
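
Should I be checking the return value? E.g. something like this (just a sketch):

    bool ok = Spark.subscribe("motionDetected2", ledTwoToggle, MY_DEVICES);
    if (!ok) {
        Serial.println("subscribe failed - handler limit reached?");
    }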

Here is my code for the first Core:

  #include "application.h"
      #define zPin A2
  #define Dpin D1 // Input pin for the led to blink when its moved
    
    //Minium and Maxium data for the Z axes- the accerelometer 
    int zAccMin = 1722;
    int zAccMax = 2400;
    int sendLedPin = D1; // blinks when accelerometer is touched
    int zAccPin = zPin; // Value for reading the data for the Z axe
    int receiveLedPin = D2; //The pin that blinks when it recives.(Core-2 dependent LED)
    int sendLedVal = 0; //variable for keeping the state 
           
    void ledTwoToggle(const char *event, const char *onOff);
    
    void setup()
    {
        Serial.begin(9600);      // sets the serial port to 9600
        pinMode(zAccPin, INPUT);
        pinMode(receiveLedPin, OUTPUT);
        //digitalWrite(receiveLedPin, HIGH);//Ensure the local LED is set to off to begin with
        pinMode(sendLedPin, OUTPUT);   
        Spark.publish("motionDetected1", "State", 0, PRIVATE);
        Spark.subscribe("motionDetected2", ledTwoToggle, MY_DEVICES); // Set up Spark.subscribe();
        
    }
    
    void loop()
    {
        zAccPin = analogRead(zPin); // read the value of the Z axe sensor
        if(zAccPin == (zAccMin) || zAccPin < (zAccMax)){
             sendLedVal = !sendLedVal;
            digitalWrite(sendLedPin, sendLedVal ? HIGH : LOW); //write the appropriate HIGH/LOW to the sendLed pin to turn it ON/OFF
            //Serial.println("RED IS ON"); //Debugging 
            Spark.publish("motionDetected1", sendLedVal ? "ON" : "OFF"); //Publish the state to the Spark Cloud as ON/OFF
            delay(250); // Primitive  debouncing
           
        }
      
    }    
void ledTwoToggle(const char *event, const char *onOff){ //Handler function for Spark.subscribe ()
        Serial.println("onOffValue:");
        Serial.println(onOff);
        if (strcmp(onOff, "ON") ==0)  //if sendLed on Core2 is ON
           { 
             digitalWrite(receiveLedPin, HIGH); 
            } else if (strcmp(onOff, "OFF") ==0) 
            digitalWrite(receiveLedPin, LOW);
             }
                
            }

The code for Core 2 looks like this:

#include "application.h"
      #define zPin A2
  #define Dpin D1 // Input pin for the led to blink when its moved
    
    //Minium and Maxium data for the Z axes- the accerelometer 
    int zAccMin = 1722;
    int zAccMax = 2400;
    int sendLedPin = D1; // blinks when accelerometer is touched
    int zAccPin = zPin; // Value for reading the data for the Z axe
    int receiveLedPin = D2; //The pin that blinks when it recives.(Core-1 dependent LED)
    int sendLedVal = 0; //variable for keeping the state 
           
    void ledTwoToggle(const char *event, const char *onOff);
    
    void setup()
    {
        Serial.begin(9600);      // sets the serial port to 9600
        pinMode(zAccPin, INPUT);
        pinMode(receiveLedPin, OUTPUT);
        //digitalWrite(receiveLedPin, HIGH);//Ensure the local LED is set to off to begin with
        pinMode(sendLedPin, OUTPUT);   
        Spark.publish("motionDetected2", "State", 0, PRIVATE);
        Spark.subscribe("motionDetected1", ledTwoToggle, MY_DEVICES); // Set up Spark.subscribe();
        
    }
    
    void loop()
    {
        zAccPin = analogRead(zPin); // read the value of the Z axe sensor
        if(zAccPin == (zAccMin) || zAccPin < (zAccMax)){
             sendLedVal = !sendLedVal;
            digitalWrite(sendLedPin, sendLedVal ? HIGH : LOW); //write the appropriate HIGH/LOW to the sendLed pin to turn it ON/OFF
            //Serial.println(WHITE IS ON"); //Debugging 
            Spark.publish("motionDetected2", sendLedVal ? "ON" : "OFF"); //Publish the state to the Spark Cloud as ON/OFF
            delay(250); // Primitive  debouncing
           
        }
      
    }    
void ledTwoToggle(const char *event, const char *onOff){ //Handler function for Spark.subscribe ()
        Serial.println("onOffValue:");
        Serial.println(onOff);
        if (strcmp(onOff, "ON") ==0)  //if sendLed on Core1 is ON
           { 
             digitalWrite(receiveLedPin, HIGH); 
            } else if (strcmp(onOff, "OFF") ==0) 
            digitalWrite(receiveLedPin, LOW);
             }
                
            }

I know this has been a while. But has the dropout issue been resolved?

This weekend I made one Spark Core listen to some events from another Core, and after a while (I’m not sure of the duration) the listening Core doesn’t receive the events anymore.

Any updates?

@Dave – any updates to this? Surely this is a really huge bug to squash? Or am I missing something? Doesn’t this bug’s existence cripple the whole Spark subscribe/publish system? I am trying to have a mini network of four Particle Cores talking to each other on a WiFi network, and it’s just not going to work if a bit of network disruption causes them to ignore each other. Or is there another way to do it?

It took me a while to find the issue on GitHub, sorry. It looks like this has been fixed in the HAL branch that is on the Photon and is coming soon to the Core:
