Maintaining Publish/Subscribe connections

I have one core (the Controller) publishing light intensity levels every 5 seconds to another (the Receiver), which then uses that data to modulate light output levels. THIS WORKS until I power cycle one of the cores, after which the connection does not appear to re-establish until (and I am not sure here) I reflash the Receiver; waiting, even for hours, does not help.

Is there a sequencing or protocol requirement I need to take care of?

I can supply my code but I am not asking for debugging services. I just want to be reassured that it should be possible to power cycle any party to the exchange.

As the Receiver is mounted high up I cannot make a USB connection to it, so I have to use Spark.variable to diagnose. I can plug USB into the Controller, and using Serial.print I can see that it is sending. I can also see that on the view-source page:

event: brightness
data: {"data":"0","ttl":"60","published_at":"2015-03-18T20:11:53.991Z","coreid":"54ff71......"}

event: brightness
data: {"data":"43","ttl":"60","published_at":"2015-03-18T20:11:58.999Z","coreid":"54ff71......."}

etc

My thanks

Have you tried manually subscribing to the event stream (using the Spark CLI or http://docs.spark.io/api/#reading-data-from-a-core-events) to see if the events are being sent? Perhaps something is amiss there.

But yes, you should be able to power cycle either end; perhaps you have discovered a bug which we can fix.


I can see the Publish is being sent, as I showed earlier with the view-source data.

view-source:https://api.spark.io/v1/devices/54f...../events/?access_token=......

The problem appears to be that it stops being received. I need to set up an alternative receiver I can get at, and your API reference might be the answer, but I am afraid I have no clue how to use the guidance.

Do I use curl through the terminal … or my browser … or the Spark CLI with a GET … or what?

Sorry.

I’d recommend looking into using the Spark CLI (particularly this function: https://github.com/spark/spark-cli#spark-subscribe)
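For reference, the CLI call looks something like this (the event name below is yours; DEVICE_ID and ACCESS_TOKEN are placeholders), and the same stream can be read with curl against the endpoint you already opened in view-source:

spark subscribe brightness

curl https://api.spark.io/v1/devices/DEVICE_ID/events/?access_token=ACCESS_TOKEN

Either should print each brightness event as it arrives, which gives you a receiver you can watch without touching the core that is mounted out of reach.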

Yes, that confirms again that the Publish is getting out. You can see it using the key “brightness”.

I will have to investigate my receiver code again tomorrow. Maybe it is in the string handling, though it obviously works most of the time.
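If it does turn out to be string handling, here is a defensive handler pattern to compare against (just a sketch; the handler name and buffer size are illustrative, not from my actual code):

void brightnessHandler(const char *event, const char *data) {
    char buf[16] = "";                              // local, bounded copy of the payload
    if (data) strncpy(buf, data, sizeof(buf) - 1);  // cannot overrun; buf stays NUL-terminated
    int level = atoi(buf);                          // 0 if the payload is empty or non-numeric
    // ... use level ...
}

Copying into a bounded buffer and checking for NULL means an empty or oversized event cannot crash the handler.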

Hi @goingalong

There have been problems with subscriptions when the subscribing core sleeps: due to a bug, it currently needs to resubscribe when it wakes up.

Are you sleeping the core that subscribes?
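If you are, the workaround for now is simply to subscribe again after the core wakes or reconnects. A minimal sketch of one way to detect that (my assumption, not official guidance; note the firmware keeps only a small fixed table of handlers, so you don't want to resubscribe endlessly):

bool cloudWasUp = true;

void loop() {
    if (!Spark.connected()) {
        cloudWasUp = false;                               // remember the cloud went away
    }
    else if (!cloudWasUp) {
        Spark.subscribe("brightness", brightnessHandler); // re-register once, on reconnect
        cloudWasUp = true;
    }
}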

Is this behaviour possibly related to this thread?

Interesting! I am not knowingly sleeping any core, but I am turning it off (removing power) as a test because my product needs to be able to survive that.

Also I have not yet left it running for a long period, so cannot say if it drops out of its own accord.

To get a better handle on what was happening I wrote two test programmes, one each for Publisher and Reader. The Publisher sends an incrementing value every few seconds. The Reader is limited to flashing its onboard LED whenever it receives a new published value.

This setup has been running all day and recovers from power cycles at either end. In case there was a hardware problem with one of the Sparks, I also reversed the data flow.

Consequently I conclude that Publish/Subscribe is working fine. My problem seems to lie elsewhere, so I can concentrate on that.

Thanks for your responses.


Turns out that I still have a problem with this. My test setup consistently locks up overnight (in the UK). I think the lockup is on the Publisher side, but further testing is needed to confirm this and to see whether it happens at any fixed (absolute or elapsed) time. My test code has two sides, a Publisher and a Reader.

Spark1:

//Publisher - sends an incrementing value, as a string, every few seconds.
bool ready;                // not used
unsigned long last = 0;    // millis() timestamp of the last publish
int pubcount = 0;          // the value to publish; also exposed as a cloud variable
char publishString[12];    // large enough for any 32-bit int plus terminator

void setup() {
    Serial.begin(9600);

    Spark.variable("pubcount", &pubcount, INT);
}

void loop() {
    if (millis() - last > 5000) {    // publish every ~5 seconds
        sprintf(publishString, "%d", pubcount);
        Spark.publish("pubcount", publishString);
        pubcount++;
        last = millis();
    }
}

Spark2:

//Reader - receives published incrementing value and indicates status on Spark LED:
//  Blue = stuck inside handler. Red/Green toggle on each new message.
//  Red/Green also flicker to indicate the main loop is still running.
unsigned long lasttime;
int inval = 0;             // last value received; exposed as a cloud variable
int lastval;
int ledstate = 0;
int subscribed = 0;        // not used
const int flashtime = 500;

void pubcountHandler(const char *event, const char *data) {
    RGB.color(0, 0, 255);      // flag blue to indicate the handler has been entered
    inval = atoi(data);
    if (inval != lastval) {    // toggle LED colour when a new value is received
        if (ledstate == HIGH) ledstate = LOW;
        else ledstate = HIGH;
    }
    lastval = inval;
} //end handler

void setup() {
    //Serial.begin(9600);
    RGB.control(true);

    Spark.variable("inval", &inval, INT); // make received value available for inspection

    subscribed = Spark.subscribe("pubcount", pubcountHandler, "54ff71066672524839501267");
} //end setup

void loop() {
    // Flash the LED to indicate the main loop is still running:
    // bright for one pass every flashtime ms, dim otherwise.
    if (ledstate == HIGH) {
        if (millis() - lasttime > flashtime) {
            lasttime = millis();
            RGB.color(255, 0, 0);
        }
        else RGB.color(64, 0, 0);
    }
    else {
        if (millis() - lasttime > flashtime) {
            lasttime = millis();
            RGB.color(0, 255, 0);
        }
        else RGB.color(0, 64, 0);
    }
}

I would like to diagnose further by logging data through my terminal overnight, but a problem with the registration of my cores is preventing that for now.
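In case it helps with remote diagnosis, one thing I may try (a sketch building on the Spark.variable calls already in the code above; the "lastrx" name is mine): expose the time of the last received event, so polling the variable shows roughly when reception stalls.

int lastRxSeconds = 0;   // seconds since boot when the last event arrived

void pubcountHandler(const char *event, const char *data) {
    lastRxSeconds = (int)(millis() / 1000);   // stamp every successful receive
    // ... existing handling as above ...
}

void setup() {
    Spark.variable("lastrx", &lastRxSeconds, INT);  // if this stops advancing, reception has stalled
}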

Meanwhile I would appreciate any comments.