Is it possible to use a constant with Spark.variable()? I’m trying to output a version number for a url endpoint. Maybe there’s a better way to do this? I want to avoid using a variable if possible. This is what I’ve tried but I get an error:
#define VERSION 1.0
Spark.variable("appVer", VERSION, DOUBLE);
This is not currently available, but would perhaps be nice as a feature request;
When a Spark device comes online, the cloud broadcasts a Server-Sent Event with some info about said device (see code below). This info contains the CC3000 patch version. Perhaps it would be nice if it were possible to add a user-configurable value to that, which gets broadcast upon connectivity. One could put his/her version number in it, and it would then always be available.
Since @Dave doesn’t seem to have enough work already, let’s ask him what he thinks about this ;)? (All others are free to comment as well, I just like pinging @Dave :p)
I guess the actual location should not make a lot of difference unless you try changing the value of your const variable.
As long as you are sure not to change the value, you could just cast the const away:
const double version = 1.0;
Spark.variable("appVer", (void*)&version, DOUBLE);
This compiles fine and should work, too - I haven't tested it tho'
@ScruffR Thanks, that did the trick perfectly. As a side note, I thought you cannot change the value of a const. Wouldn’t that create an error while compiling?
Exactly - this was the reason for my warning not to even try.
But no, if you fool the compiler into believing it is changeable it won’t warn you.
On the other hand, even if you cast the const away, the memory location will still be in flash and hence unchangeable, and any attempt to write to this “variable” will crash the program.
But if you had a RAM variable referenced once as a const type and once as a changeable one (e.g. via a union or a pointer), the compiler would just treat the memory location according to the access restrictions imposed on each reference when it was declared.