Spark.publish() for two Spark Cores

Hi,

I have two Spark Cores: core1 and core2.
When the input D0 on core1 is closed, I want to close a relay connected to core2. I'm thinking of using the Spark.publish() function as the communication between the two Cores (am I right, or is there another function for that?). But after reading the Spark tutorials, I don't know how to do this with two Spark Cores.

Can somebody help me?!? :smile:

Thank you very much!

Check out Spark.publish() and Spark.subscribe() at docs.spark.io :wink:

@kennethlimcp is correct.

Your code would be something like this (pseudocode):
Core 1:

int lastValue = -1;

void setup()
{
    pinMode(D0, INPUT); // INPUT_PULLUP or INPUT_PULLDOWN perhaps?
}

void loop()
{
    int newValue = digitalRead(D0);   // also needs some kind of debounce; see the sketch below
    if(newValue != lastValue)    // The value has changed
    {
        if(newValue == HIGH)
        {
            Spark.publish("D0","HIGH");
        }
        else
        {
            Spark.publish("D0","LOW");
        }
        lastValue = newValue;
    }
}
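
For the debounce mentioned in the comment, a minimal time-based sketch could be a drop-in replacement for the loop() above (my own addition, untested; the 50 ms window is arbitrary). As a bonus, waiting for a stable reading also helps you stay under the Spark.publish() rate limit, which if I remember the docs right is roughly one event per second:

int lastValue = -1;
int lastReading = -1;
unsigned long lastChangeTime = 0;

void loop()
{
    int reading = digitalRead(D0);
    if (reading != lastReading)        // the raw input just changed
    {
        lastChangeTime = millis();     // restart the stability timer
        lastReading = reading;
    }
    else if (millis() - lastChangeTime > 50 && reading != lastValue)
    {
        // stable for 50 ms, so treat it as a real change and publish it
        Spark.publish("D0", reading == HIGH ? "HIGH" : "LOW");
        lastValue = reading;
    }
}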

Core 2:

void myHandler(const char *event, const char *data)
{
    if(strcmp(data, "HIGH") == 0)   // compare the string contents, not the pointers
    {
        digitalWrite(D1,HIGH);
    }
    else
    {
        digitalWrite(D1,LOW);
    }
}

void setup()
{
    pinMode(D1, OUTPUT);
    Spark.subscribe("D0", myHandler, MY_DEVICES);
}
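
Two things to watch out for: data is a C string, so compare it with strcmp() rather than == (which only compares pointer addresses), and Spark.subscribe() matches event names by prefix, so subscribing to "D0" will also catch any event whose name merely starts with "D0". Also, if I recall the docs correctly, a MY_DEVICES subscription only receives events published as PRIVATE, so core1 may need Spark.publish("D0", "HIGH", 60, PRIVATE);.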

I believe this does pretty much exactly what you're trying to achieve. It's also commented nicely. Definitely check it out!
