[Solved] Event name not correct in Spark.subscribe() handler function

I’m sending some events back and forth between two Cores. Publishing and subscribing to events works, and the subscribe handler function is called, but the const char* event pointer does not seem to point at the event name.

To test this, I wrote a handler function for Spark.subscribe() like this:

void eventHandler(const char* event, const char* data){
   Spark.publish(event, data);
   
   //Other stuff, which is executed without problem
}

It publishes the event name and the data back to the cloud and then does other stuff that isn’t important to this problem.
This function runs on one of the Cores. The other sends an event which triggers this function, and the result is this:

I was expecting to get the exact same event name and data back, but clearly this is not the case. I have also tried comparing the event name in code and lighting an LED if they match, to make sure this is not an error from the publish() method.

Does anyone know what the issue might be? Is there something wrong with my code or is something broken?

Could you share your complete code for both devices, that’d make it easier to find where the problem might be?

Sure! It’s quite long, but I’m sure you will find the important parts.

The application is for monitoring that a machine is still running and then calling for assistance. The base monitors the machine and the node acts as a sort of pager.

Code for the node:


#include "messages.h"

const char* watching = "HandlerOne";

const int ACKButtonPin = D0;
const int alarmPin = D1;

volatile bool hasRecievedACK;
volatile bool hasRecievedHelpSignal;

typedef void (*state_function)();
state_function next_state;

void setup(){
  
  Spark.subscribe(m_helpNeeded, recieveHelpNeededSignal);
  Spark.subscribe(m_helpNeededACK, recieveHelpNeededACK);
  Spark.subscribe(m_helpNeededLocalACK, recieveHelpNeededLocalACK);

  pinMode(ACKButtonPin, INPUT);
  pinMode(alarmPin, OUTPUT);

  next_state = normal_state;
}

void loop(){

  next_state();
}

void normal_state(){
  hasRecievedHelpSignal = false;
  while ( true ){
    if ( hasRecievedHelpSignal ){
      next_state = alarm_state;
      return;
    }
    Spark.process();
  }
}

void alarm_state(){
  hasRecievedACK = false;
  digitalWrite(alarmPin, HIGH);
  while ( true ){
    if ( hasRecievedACK ){
      digitalWrite(alarmPin, LOW);
      next_state = normal_state;
      return;
    }
    if ( digitalRead(ACKButtonPin) == HIGH ){
      digitalWrite(alarmPin, LOW);
      Spark.publish(m_helpNeededACK, watching);
      next_state = normal_state;
      return;
    }
    Spark.process();
  }
}

void recieveHelpNeededSignal(const char* event, const char* data){
  if ( strcmp(watching, data) == 0 ){
      hasRecievedHelpSignal = true;
  }    
}

void recieveHelpNeededACK(const char* event, const char* data){
  const char* e = event;
  const char* d = data;
  Spark.publish(e, d);
  //if ( strcmp(watching, data) == 0 ){
    hasRecievedACK = true;
  //}
}

void recieveHelpNeededLocalACK(const char* event, const char* data){
  Spark.publish(event, data);
  //if ( strcmp(watching, data) == 0 ){
    hasRecievedACK = true;
  //}
}

Code for base:


#include "messages.h"

const char* WATCHDOG_ID = "HandlerOne";

/*** Time constants ***/
const unsigned long secondsBetweenHelpSignals = 10;
const int resendHelpTimeout = 30;

const int maxTimeAxisStationary = 10;

/*** Pin numbers ***/
const int redLightPin = D0;
const int orangeLightPin = D1;
const int axisSensorPin = D2;
const int localACKPin = D3;
//const int enableHelpSignalsPin = D3;

/*** Program logic variables ***/
volatile bool handlerStopped;

volatile bool trafficLightON;
volatile bool axisNotMoving;

volatile int axisLastSeenMoving;

volatile int lastHelpSignalTimestamp;

volatile bool recievedHelpNeededACK;
volatile int helpNeededACKTimestamp;

/*** Misc ***/
typedef void (*state_function)();
state_function next_state;

bool sentStateMessage = false;

void setup(){
  Spark.subscribe(m_helpNeededACK, handleHelpNeededACK);

  pinMode(redLightPin, INPUT_PULLUP);
  pinMode(orangeLightPin, INPUT_PULLUP);
  pinMode(axisSensorPin, INPUT);
  pinMode(localACKPin, INPUT);

  attachInterrupt(axisSensorPin, axisInterrupt, FALLING);

  pinMode(D7, OUTPUT);

  trafficLightON = false;
  axisNotMoving = false;

  axisLastSeenMoving = Time.now();

  lastHelpSignalTimestamp = 0;

  handlerStopped = false;
  recievedHelpNeededACK = false;

  delay(10*1000);

  next_state = normal_state; // Start in normal state

}

/*** Main loop ***/
// Only dispatches between different states

void loop(){
  next_state();
}

/*** Subscription handlers ***/
void handleHelpNeededACK(const char* event, const char* data){
  if(strcmp(data, WATCHDOG_ID) == 0){
    recievedHelpNeededACK = true;
    helpNeededACKTimestamp = Time.now();
  }
}

/*** Helper functions ***/
void checkTrafficLights(){
  if( (digitalRead(redLightPin) == LOW) || (digitalRead(orangeLightPin) == LOW) ){
    trafficLightON = true;
  }
  else{
    trafficLightON = false;
  }
}

void checkAxisMoving(){
  if ( (Time.now() - axisLastSeenMoving) > maxTimeAxisStationary){
    axisNotMoving = true;
  }
  else {
    axisNotMoving = false;
  }
}

void axisInterrupt(){
  axisLastSeenMoving = Time.now();
}

void isHandlerRunning(){
  checkTrafficLights();
  checkAxisMoving();
  handlerStopped = trafficLightON || axisNotMoving;
}

/*** State functions ***/
void normal_state(){
  while ( true ){
    isHandlerRunning();
    if ( handlerStopped ){
      next_state = alarm_state;
      return;
    }
    Spark.process();
  }
}

void alarm_state(){
  recievedHelpNeededACK = false;

  while ( true ){

      if ( recievedHelpNeededACK ){
        next_state = ack_stop_state;
        return;
      }

      if ( digitalRead(localACKPin) == HIGH ){
          Spark.publish(m_helpNeededLocalACK, WATCHDOG_ID);
          helpNeededACKTimestamp = Time.now();
          next_state = ack_stop_state;
          return;
      }

      isHandlerRunning();
      if ( !handlerStopped ){
        next_state = normal_state;
        return;
      }

      if ( Time.now() - lastHelpSignalTimestamp >= secondsBetweenHelpSignals ){
        Spark.publish(m_helpNeeded, WATCHDOG_ID);
        lastHelpSignalTimestamp = Time.now();
      }
      Spark.process();
  }
}

void ack_stop_state(){
  while ( true ){
    if ( !sentStateMessage ){
      //Spark.publish("ack_stop_state");
      sentStateMessage = true;
    }
    isHandlerRunning();
    if ( !handlerStopped ){
      next_state = normal_state;
      return;
    }

    if ( Time.now() - helpNeededACKTimestamp > resendHelpTimeout ){
      next_state = alarm_state;
      return;
    }
    Spark.process();
  }
}

messages.h:

const char* m_helpNeeded = "SGA-helpNeeded";
const char* m_helpNeededACK = "SGA-ACK";
const char* m_helpNeededLocalACK = "SGA-localACK";

Thanks!

So I realized that my previous entry contained a lot of code not relevant to the problem, and therefore I made a minimal example with the same error. The setup is sort of the same: one Core (Base) will publish an event named “AnEventName” and the other Core (Node) will subscribe to that event and publish the event name and data again.

Code for the node:

/*** EventTestNode.ino ***/
void setup() {
    Spark.publish("Node online!");
    Spark.subscribe("AnEventName", handler);
}

void loop() {
    Spark.process();
}

void handler(const char* event, const char* data){
    Spark.publish(event, data);
}

And for the base:

/*** EventTestBase.ino ***/

void setup() {
    Spark.publish("Base online!");
}

void loop() {
    Spark.publish("AnEventName", "Information");
    while (true){
        Spark.process();
    }
}

This yields the same result and error.

I find this problem extremely strange. Since Spark.publish() is called with the const char* event and const char* data in exactly the same way, and the data part gets published correctly, I can’t think of why this should not work.

Any ideas, or might this be a firmware bug? Since the handler function is indeed called, the event name must have gotten to the Core properly, but maybe the part of the firmware that calls this function fails to supply the correct pointer as an argument?

I had to change the name of the event in the handler (to prevent a recursive publish event), but try casting event and data into a String.


/*** EventTestNode.ino ***/
void setup() {
    Spark.publish("Node online!");
    Spark.subscribe("AnEventName", handler);
}

void loop() {
    Spark.process();
}

void handler(const char* event, const char* data){
    Spark.publish((String)event, (String)data);
}


Casting to a String solved the problem! I still find it strange that the data argument got passed correctly and not the event argument, but I can live with that fact.

The code you provided indeed caused an infinite loop due to recursion, but that was expected I guess.

Thanks for your help!


If you’re stuck on using the same event name, you could always add a timer/flag in the handler in order to prevent the recursion.
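A minimal sketch of that idea, for example (the republishing flag is just illustrative, not something from the code above):

/*** EventTestNode.ino ***/
volatile bool republishing = false; // guard flag to break the publish/subscribe loop

void setup() {
    Spark.subscribe("AnEventName", handler);
}

void loop() {
    Spark.process();
}

void handler(const char* event, const char* data){
    if (republishing){
        // This is our own republished event coming back; ignore it
        republishing = false;
        return;
    }
    republishing = true;
    Spark.publish((String)event, (String)data);
}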

I’m glad to see that you got it working the way you want.

The recursion is not really an issue. I’m later going to compare the event and data fields with locally saved strings and just wanted to make sure that it gets through correctly, so I won’t be republishing like that in the finished version.
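Roughly what I have in mind, for reference (just a sketch using the names from the node code above):

void recieveHelpNeededACK(const char* event, const char* data){
    // Compare against locally saved strings instead of republishing
    if ( strcmp(event, m_helpNeededACK) == 0 && strcmp(data, watching) == 0 ){
        hasRecievedACK = true;
    }
}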