Device ID as a Spark variable

So I tried doing the following:

Spark.variable("id",Spark.deviceID(),STRING);

and it didn’t work, so I’m guessing I’m not handling the pointer/address stuff correctly…

Any clue? :smiley:

Isn’t it that you’d have to pass in the address of an actual variable?

I’m not aware that Spark.variable() would accept a function's return value, since that value is only available temporarily, while it’s present on the stack.
To keep it available for later use, you’d have to store it somewhere permanent.

Spark.deviceID() returns an Arduino String object, not the char array that Spark.variable() wants. Try this:

String idStr = Spark.deviceID();
Spark.variable("id",idStr.c_str(),STRING);

That's true, but it's not the reason; otherwise

Spark.variable("id", (Spark.deviceID()).c_str(), STRING); 

should work too, but I guess it won't.

Neither worked, and I’m wondering why :smiley:

The compiler has gotten very picky all of a sudden about the const'ness of things. This compiles but is slightly wasteful; a const cast might be better:

    String idStr = Spark.deviceID();
    char  idChar[48];
    strcpy(idChar,idStr.c_str());
    Spark.variable("id",idChar,STRING);

Ah nice! Shall try it out later.

It’s weird how something that seems rather simple becomes kinda cumbersome, no? :wink:

Thanks @bko for the help as usual!

On a side note, I know the device ID is available in all the messages sent to the cloud, but it would be nice to have it as a variable to use in a web app.

Edit: Sorry, I overlooked the "not" in this sentence. But if you try a (void*) cast on the c_str() return value (see code below), it compiles, but then you'll run into this problem:

It might seem that 'Spark.variable("id", (void*)Spark.deviceID().c_str(), STRING);' works, but in fact this is a very dangerous misconception.
When you test this in an isolated environment, it will look fine, but once you transfer it into a real-life environment you will start getting odd behaviour and you won't know why.
The problem lies in the combination of stack-based (automatic) and heap-based (String::buffer) variables.

As long as the stack and heap are not disturbed by other functions or garbage collection, everything works fine, since the place Spark.variable() retrieves the data from is still intact. But once you start creating and "destroying" Strings - or other heap-based variables - you might corrupt your precious 'Spark.variable()'.

I've brewed up a quick test sketch that shows that the second version is very susceptible to this problem. The version with idStr seems to do better, but it might still have the same issue in long-running environments with a lot of nested function calls, if 'idStr' is not a global variable.

int blinkLED(String c)
{
    String cmd;
    
    cmd = c.c_str();   // just to create and "destroy" several String instances
    cmd += c;
    
    digitalWrite(D7, !digitalRead(D7));
    return digitalRead(D7);
}

void setup() {
    String idStr = Spark.deviceID();
    
    Spark.variable("id1", (void*)idStr.c_str(), STRING);
    Spark.variable("id2", (void*)Spark.deviceID().c_str(), STRING);

    Spark.function("test", blinkLED);
    
    pinMode(D7, OUTPUT);
}

void loop() {
}

If you run this, both variables will give you the expected result at first, but after a few calls to Spark.function("test") you'll see some problems with id2. For id1 I couldn't provoke the same failure this way, but my feeling is that it suffers from the same issue.

I do not see how this is expected to work.

In the test case, the second arguments to Spark.variable() have limited scope.

By the time setup() exits, the storage referenced is no longer valid, and it will be deceptive because it may work until something else uses the heap and/or stack.

Not sure what the best route to fix is.

I would prefer that Spark.deviceID() be called, and its value read and returned, each time the Spark.variable() is queried.

But the device ID doesn’t change at all, so saving and re-using it shouldn’t pose much of an issue.

I’m just thinking if we can improve on Spark.variable() way of taking in values.

@AndyW, these are exactly the points I wanted to make with this (not meant to work) example code :wink:

Have you read all the blurb I posted alongside the code? :eyes:

I think a good practice is to make the pointer-type arguments global in scope and statically allocated, so strcpy() into the char array rather than messing with the pointers.

@kennethlimcp you can always get the core ids from here:

curl https://api.spark.io/v1/devices?access_token=<<token>>


@ScruffR has highlighted a much larger problem, I’ll spin it out into a different topic.

Edit: Here’s the new topic
