Can't publish() and subscribe()

I am trying to publish and subscribe to an event, but `spark subscribe publishStreamTest mine` just sits there saying it is listening.

#### I followed these:

  1. spark-publish/
  2. tutorial-getting-started-with-spark-publish/3422
  3. cant-get-my-core-to-publish/3436

#### Read this:

But none of them helped :frowning:

Here’s my code:

```cpp
unsigned long lastTime = 0UL;
char publishString[64];

int sec = 10000;
int min = 5000;
int hours = 1000000;

void setup() {
}

void loop() {
    unsigned long now = millis();

    if (now - lastTime > 15000UL) {
        lastTime = now;

        sprintf(publishString, "Pin: %d - State: %d - Delay: %d", hours++, min++, sec++);
        Spark.publish("publishStreamTest", publishString);
    }
}
```
The code compiles and flashes without any problems.

Thanks for the help!

Hi @sunilobj

I looked for your event in the public stream of events and did not see it.

So I just flashed your code onto my core and it worked fine. Here is some output from curl:

```
event: publishStreamTest
data: {"data":"Pin: 1000000 - State: 5000 - Delay: 10000","ttl":"60","published_at":"2014-04-16T02:55:04.218Z","coreid":"50ff73065067545640270287"}

event: publishStreamTest
data: {"data":"Pin: 1000001 - State: 5001 - Delay: 10001","ttl":"60","published_at":"2014-04-16T02:55:19.221Z","coreid":"50ff73065067545640270287"}
```

Could you have a router problem?

Can you use other Spark features like variables and functions successfully?

If you run your code on your core, we should be able to see your events go by, but as I said above, I did not see them.

Glad that code is working fine.
Thanks for the suggestion :smile:

I checked whether variables are working:


```cpp
int analogvalue = 0;
double tempC = 0;
char *message = "my name is spark";

void setup() {
    // variable name max length is 12 characters long
    Spark.variable("analogvalue", &analogvalue, INT);
    Spark.variable("temp", &tempC, DOUBLE);
    Spark.variable("mess", message, STRING);
    pinMode(A0, INPUT);
}

void loop() {
    // Read the analog value of the sensor (TMP36)
    analogvalue = analogRead(A0);
    // Convert the reading into degrees Celsius
    tempC = (((analogvalue * 3.3) / 4095) - 0.5) * 100;
}
```

I flashed it to the core and polled for the variable mess.
Output from curl:

```
{
  "ok": false,
  "error": "Variable not found"
}
```

I also ran `spark variable list`, which said "My core has 0 variables or is offline" - but the core is breathing cyan and the flashed code declares variables.

But when I run `spark cloud list`, my core shows as online.

I guess I have a router problem.

You might want to check the status of port 5683 on your router. From the doc page:

> The only change you may need to make to your router is to open up outgoing port 5683, the default CoAP port the Spark Core uses to connect to the Spark Cloud. If your core flashes cyan and occasionally flashes red, router issues are likely the culprit.

From my experience, if outbound TCP 5683 is blocked, you won’t even get to the breathing cyan part. However, different routers could work in mysteriously different ways. Double-NAT, IDP, etc. We do a lot of VOIP here at work, so we’ve seen some very interesting issues with customer equipment and ISPs.

I don’t suppose you have a different Core to test or can try a different network with a different router/firewall? Have you tried testing with just a single Spark.variable() or Spark.publish() with simple, static values?