Spark Dev not showing publish or functions

I was having trouble with my own program’s publish and subscribe statements not showing up in Spark Dev’s “Show Cloud Functions” window. I added a Spark.variable, and still the functions didn’t show up, nor did the variable appear when I chose “Show Cloud Variables”.

I did compile and download each of these.

So I took the “Control LEDs over the net” example ( http://docs.spark.io/examples/ ) and copied it into Spark Dev, and the function there doesn’t show up in the “Show Cloud Functions” window. I added:
Spark.variable("temperature", &temperature, DOUBLE);
to this file. And “Show Cloud Variables” doesn’t show anything.

Again, I did successfully compile and download this, which raises another question: I added that last line, but I never defined a temperature variable. Why didn’t “&temperature” cause a compile error?

What am I not doing? I assume others are seeing Spark variables and Spark functions in the two Spark Dev windows?

Are you sure they are actually ending up on your Spark Core?

Try this. In the bottom of your setup() put the following:

RGB.control(true);
RGB.color(255,0,127);
delay(1000);
RGB.control(false);

If your core shows a pink/magenta light after startup (after the flashing cyan), you know the firmware flashed. Obviously you’ll need to change the color with each new flash, or else you won’t be able to tell the difference, but it’s an easy way to quickly sanity-check your firmware uploads.

Thank you! I’ve been cautious about changing the main LED, but as you point out, this only changes it after the Core has shown that it reset and came back online.

But, I don’t see the pink/magenta. It goes to a breathing cyan and stays there. :frowning:

So it doesn’t look like it is flashing the new code. This is very similar to the earlier problem where I thought I had to do a factory reset before I could reprogram the part. It looks like I’ve got that problem again, or it never left. But I thought I had downloaded a series of programs that showed changes on the Core.

This is one of my Cores with a green solder mask.

Are you set up to program your core over USB via the Spark CLI?

What happens when you go into the Spark Dev IDE and flash to the core? Does it ever turn magenta to indicate an over-the-air firmware update is occurring?

Okay, I did this again to verify. I do see it cycle through the red/blue of downloading and then the breathing cyan/green/breathing cyan of linking up again. But that pink/magenta doesn’t appear.

And I do see a “flashing via cloud” and an “Update Started” message in Spark Dev. Was there supposed to be an “Update Finished” message that I don’t see?

I tried to install the Spark CLI again, in part so I could check whether I could see the Spark variables there to verify things. But I’m still getting errors no matter what I try.

Since I thought I had Spark Dev working, I went back to using it.

I get some flashing where both the red and blue are on at (half?) power at the same time. Is that magenta? Now I’m beginning to wonder whether I’ve ever seen magenta before or not. I thought I had been doing good downloads with Spark Dev.

Maybe I need to try another virgin Core, although I’ve only got a few more of those.

Or do a factory reset on the current culprit and watch for magenta.

Okay, I did a factory reset and proceeded to download the code that I mention above with the function call. I pulled out the variable because I was indeed getting compile errors.

I see the download at least start on the screen and the LED does a bunch of flashing of the red and blue segments together. Is that magenta?

However, when it starts up, I see the green LED on D7 flash and that is code that I had previously in this particular part, but the current program I’m trying to download doesn’t have that.

Would you mind trying the web IDE at https://spark.io/build ?

Copy and paste your code into a new project there and try flashing it using that interface. Perhaps your Spark Dev setup isn’t quite right, or perhaps something crazy is going on with Spark’s OTA (over-the-air update) function right now.

Okay, I did another factory reset and verified that I could then turn the LED off and on. I’m assuming that verifies that the Core was reset and the program space cleared and then loaded with the Tinker program. I then tried to program it with the ButtonTest2.ino program that has a Spark.variable and the Spark.publish statements. This is not the LED program I described above, but the one I was originally trying to program.

Now I do see my Button variable in the Spark Dev window, and it is correctly a string. But its value shows up as a diamond, a “w”, another diamond symbol, and the numeral 3.

I’m not sure how it gets that out of “OFF” or “ON”. The first is what it is originally defined as.

It looks like my problem of not being able to program a Core without doing a factory reset is back, or never left. :frowning:

The web IDE was where I first saw that I was unable to program a part without doing a factory reset. So I switched to Spark Dev and it looked like that problem disappeared. But if it did, it has reappeared.

Since I do now see a Cloud variable, I can at least try to change that to see if I’ve had a good download and program, except I’m not sure why the value it shows isn’t what I set the variable to.

Mind posting some code :slight_smile: ? Perhaps I can help with the mysterious Spark variable.

Here it is. The important line is the original definition of the string variable Button. It was added just to test showing up as a variable and is not even operated on. Sorry for the size, but right now I’m not sure what is significant. :frowning:

/* This Spark program simply tests the pushbutton installed on pin D2,
by lighting the onboard LED tied to D7 */

int LED_PIN = D7;
int PBUTTON = D2;

// Global Variables
int buttonState = 0;             // Variable for reading button
int oldState = 0;
char *Button = "OFF";

void setup() {
    pinMode(LED_PIN, OUTPUT);
    pinMode(PBUTTON, INPUT);

    digitalWrite(LED_PIN, HIGH);
    delay(1000);
    digitalWrite(LED_PIN, LOW);
    delay(1000);

//	Serial.begin(9600);
//delay(20000); //Give me a chance to do the "sudo cat /dev/ttyACM0" thingy

//	tfButtont.begin();

  Spark.variable("Button", &Button, STRING);
//Set up Spark.publish() so that the state of the local switch
//  is published to the Spark Cloud PRIVATELY
   Spark.publish("Joes_Bar", "State", 0, PRIVATE);

//Set up Spark.subscribe() so that the state of Core1's Led
//  is recorded and handled by LedToggle
   Spark.subscribe("Joes_Bar", ledToggle, MY_DEVICES);

}

void loop(void) {


//    digitalWrite(LED, HIGH);
//    delay(1000);
//    digitalWrite(LED, LOW);
//    delay(1000);

// Read the state of the pushbutton
  buttonState = digitalRead(PBUTTON);

// If a touch is detected, turn on the LED

  if (buttonState == LOW) {
    digitalWrite(LED_PIN, HIGH);
  } else {
    digitalWrite(LED_PIN, LOW);
  }
// if button has just been pushed, turn on LED and send out new state
   if (buttonState == HIGH && oldState == LOW) {
      //write the appropriate HIGH/LOW to the LED pin to turn it ON/OFF
      digitalWrite(LED_PIN, HIGH);
      //publish the state to the Spark Cloud as ON/OFF
      Spark.publish("Joes_Bar", "ON");
      Button = "ON";
      oldState = buttonState;
  }
// If button has just been released, turn off LED and send out new state
   if (buttonState == LOW && oldState == HIGH) {
      digitalWrite(LED_PIN, LOW); // Turn off LED
//publish the state to the Spark Cloud as ON/OFF
      Spark.publish("Joes_Bar", "OFF");
      Button = "OFF";
      oldState = buttonState;
  }

//	  if(digitalRead(PBUTTON) == LOW) digitalWrite(LED, HIGH);
//    if(digitalRead(PBUTTON) == HIGH) digitalWrite(LED, LOW);
    delay(1000);

}

//handler function for Spark.subscribe()
void ledToggle(const char *toggle, const char *onOff){
    if (strcmp(onOff, "ON") == 0){ //if sendLed on Core1 is ON according to Spark.publish()
        digitalWrite(LED_PIN, HIGH); //then turn on Led
    } else if (strcmp(onOff, "OFF") == 0){ //if sendLed on Core1 is OFF according to Spark.publish()
        digitalWrite(LED_PIN, LOW); //then turn off Led
    }
    digitalWrite(LED_PIN, HIGH);
}

I’ve edited your post to properly format the code. Please check out this post, so you know how to do this yourself in the future. Thanks in advance! ~Jordy

Some things spring to mind immediately - sorry @harrisonhjones I don't mean to interfere here, but to give you some slack ;-).
@MarkSHarrisTX, the most severe one is that you should not use the ampersand & in this line

Spark.variable("Button", &Button, STRING);

since Button already is an address.

And you shan't do this Button = "ON"; or Button = "OFF"; since this does not update your Spark.variable.
You'd need to do something like strcpy(Button, "ON"); instead, but for this I'd rather declare char Button[5] = "OFF";.

I'm not sure if the TTL parameter of publish() is already in effect, but having a time to live of zero seconds for your event seems a bit short.
To confirm whether your program works, you might want to try PUBLIC events/subscriptions first.
Furthermore you should maybe consider hooking up your subscription before your first publish to avoid race conditions.

And in your event handler you might never see the LED off, since you immediately turn it back on after you switched it off. And your loop() tampers with your LED state, too.
Just comment out the last digitalWrite() in the event handler and maybe use Harrison's suggestion to employ the RGB LED for "debugging".

As for your button you should either use INPUT_PULLUP or INPUT_PULLDOWN (depending how you wired the button) instead of INPUT to avoid floating readings while the button is not pressed.


Some coding tips:
You can simplify this

  if (buttonState == LOW) {
    digitalWrite(LED_PIN, HIGH);
  } else {
    digitalWrite(LED_PIN, LOW);
  }

into this

  digitalWrite(LED_PIN, !buttonState);  // write inverse buttonState to LED
  // or if this doesn't work due to HIGH being 0x0001 rather than 0xFFFF
  digitalWrite(LED_PIN, buttonState ? LOW : HIGH);

or this

  if (buttonState == HIGH && oldState == LOW) {
    //write the appropriate HIGH/LOW to the LED pin to turn it ON/OFF
    digitalWrite(LED_PIN, HIGH);
    //publish the state to the Spark Cloud as ON/OFF
    Spark.publish("Joes_Bar", "ON");
    Button = "ON";
    oldState = buttonState;
  }
  // If button has just been released, turn off LED and send out new state
  if (buttonState == LOW && oldState == HIGH) {
    digitalWrite(LED_PIN, LOW); // Turn off LED
    //publish the state to the Spark Cloud as ON/OFF
    Spark.publish("Joes_Bar", "OFF");
    Button = "OFF";
    oldState = buttonState;
  }

into

  if (buttonState != oldState) {
    digitalWrite(LED_PIN, buttonState);
    strcpy(Button, buttonState ? "ON" : "OFF");
    Spark.publish("Joes_Bar", Button);
    oldState = buttonState;
  }

As for your color questions further up:

When you see the red/blue combo on the RGB LED, this is magenta.

I'm not aware of Cores featuring a green LED on D7, this usually is blue - unless there was a change in production :wink:


For formatting your code, could you please check out this post


Thank you! But ouch. A number of things to change. Got to go right now. Back this afternoon.

I come from an EE background. Obviously my coding could use some work.

I do have an external 4.7K resistor on that LED pin, in part because I wasn’t sure how to turn on the internal one and wasn’t sure it would be sufficient. It’s wonderful to be able to eliminate a component.

Hi @MarkSHarrisTX

The thing that trips up a lot of folks is that when you declare a pointer to something like char *Button = "OFF", you are not really allocating RAM for a string. You are creating a pointer to something already allocated elsewhere, and in this case the array “OFF” lives in read-only flash, not RAM. As @ScruffR said, char Button[5] = "OFF" and strcpy is the way to go.


Thank you. I was wondering if that might be the problem when I saw that strange reading of the string, but I thought I’d seen that statement in one of the Spark examples.

I’d love to see more examples of working Spark publish, subscribe and variable statements.

Hi @MarkSHarrisTX

In the Spark example, they don’t change the string so it is OK.

If you look in the forum under the Tutorials section, you will find a lot of information on publish, function, and variables. I have written several tutorials there and I know there are many more.

http://docs.spark.io/firmware/ says "ttl (time to live, 0–16777215 seconds, default 60) !! NOTE: The user-specified ttl value is not yet implemented, so changing this property will not currently have any impact.".

So I was assuming a default of 60 seconds, which seems like plenty of time. Or has that been superseded somewhere?

Okay, I can do that. I was trying not to bother anyone else. Eventually I'll have to move to 'private' for the actual product.

Okay, that makes sense if the system can tolerate being subscribed to something that doesn't exist yet. Seems like a chicken and egg situation. I thought I'd seen an example that put the publish before the subscribe, but I may be mis-remembering. I'll change this.

Wow. That eliminates a lot of statements and is fairly clear. It does take care of both the switch closing and opening, which I had to check.

Oops. I was mis-remembering and writing without checking that point.

Again, Thank you!


Still, if it is a bad thing to do, it shouldn't be in an example. If it is just missing something because this was just a code snippet, then more understandable. But that is why I've been looking for examples.

Thanks, I'll look again. Maybe I just missed them the first time.

When arguing about TTL I was referring to Spark.publish("Joes_Bar", "State", 0, PRIVATE);

AFAIK there is no need for an event to have been published before you can subscribe to it.
One nice thing to know about subscriptions is that you don't actually subscribe to one particular "topic" but to any "topic" starting with your "filter". In your case you'd get any event starting with "Joes_Bar". Choosing a most likely unique prefix is also a way to avoid bothering others.


It's not a "bad thing" per se, only if you use it in the wrong way. But the erroneous characters you saw came not from the use of a read-only string (which can be useful in some circumstances), but from your use of the ampersand & in Spark.variable("Button", &Button, STRING);.
These were the character representation of the address (which is like an unsigned long int) of the string's location in memory.
