Reading Spark Variables with Your Own HTML File

If you have seen my Spark.publish() tutorials, you know that I like to use private web pages (if the page were public, your access token would be exposed) that read and even graph data from my Spark core. Well, you can do similar things with Spark.variable(), and here's how.

Declare Yourself

Let’s say you have a Spark.variable declared and you want to read it. In my case, it is a temperature variable from a DS18B20 temperature sensor and I have converted the value to a double-precision floating-point value. My declaration in setup() looks like this:

  Spark.variable("temperature", &sparkTempF, DOUBLE);

Be Safe!

OK, so how do we read this variable on a web page? Easy. But first, a word of warning! We are going to be putting an access token in this file, so it cannot be public. You can keep it on your local computer and access it via a file:///path/to/file/name.html URL, or you can do what I do and put the file on a service like Dropbox so you can access it from any computer, phone, or iPad you use Dropbox on. If you need a public way to do this, I would look at using a PHP proxy or other intermediary to hide your token.

Web Magic

We need one little magic ingredient, and that is a dash of AJAX and jQuery! AJAX is a way to use JavaScript to do things asynchronously on a web page. If you have ever watched a "live blog" for an event like a product announcement, where someone updates the page in real time and you see the updates almost immediately, that is AJAX. There are lots and lots of ways to use it, but we need the simplest one: an HTTP GET request.

Normally your browser does not allow a web page to fetch data from another web source and show it to you. What would happen if those awful spammers could send you a link to their bogus web page for your bank, but it could connect to the real web page for your bank and listen in as it showed you their site? You would lose a lot of money, that's what! So to prevent that, your browser does not allow a page on web site A to fetch URLs from web site B, but that is exactly what we need to do in this case! The solution is AJAX and a safe way to fetch only certain data in JSON format. Luckily for us, the Spark team understood this, and all Spark variables return their data in JSON format. The Spark cloud is also set up for this type of access, using tokens (instead of cookies) to identify users.

Here's the first example. Set your device ID, access token, and variable name (if it is not "temperature"), load this up in a browser, and click the "Read Temp" button. Here is what that looks like (I have covered over part of the URL here; just fill in from there):


Every time you click the "Read Temp" button, you will get a new variable value from your core and a new timestamp. It is interactive and updates on your command.

The HTML Code

Here is the code for that. Don't forget to replace the deviceID and accessToken values with yours, and keep this private.

<!DOCTYPE HTML>
<html>
  <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js" type="text/javascript" charset="utf-8"></script>
<body>
    <span id="temp"></span><br>
    <span id="tstamp"></span><br>

    <button id="connectbutton" onclick="start()">Read Temp</button>
 
    <script type="text/javascript">

    function start() {

        document.getElementById("temp").innerHTML = "Waiting for data...";
        document.getElementById("tstamp").innerHTML ="";
        var deviceID = "<< device id >>";
        var accessToken = "<< access token >>";
        var varName = "temperature";

        requestURL = "https://api.spark.io/v1/devices/" + deviceID + "/" + varName + "/?access_token=" + accessToken;
        $.getJSON(requestURL, function(json) {
                 document.getElementById("temp").innerHTML = json.result + "&deg; F";
                 document.getElementById("temp").style.fontSize = "28px";
                 document.getElementById("tstamp").innerHTML = json.coreInfo.last_heard;
                 });
    }
    </script>
</body>
</html>

OK, so the magic is using the jQuery AJAX API, specifically the $.getJSON() function, to do the work. Since the Spark variable data is already in JSON format, in the callback function we can just use the function argument like a struct to access the different fields. We access the result field, which is the variable's value, and we also use the last_heard field inside the coreInfo field, which is a GMT timestamp for the variable read, supplied by the Spark cloud.
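
For reference, the JSON that comes back from the variable request looks roughly like this, trimmed down to just the two fields we use here (the real response contains additional fields, and your values will differ):

{
  "result": 72.4,
  "coreInfo": {
    "last_heard": "<< GMT timestamp here >>"
  }
}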

Blurring the Lines

OK, so you are still not satisfied, are you? Remember that live-blog example above, where AJAX can be used to update a web page you are viewing in near real-time? We can do that too!

So we can make a web page that automatically reloads the variable and we can set the number of seconds between updates. Now we have a polling loop that is somewhat like publish, but you want to be careful here. We need to be good neighbors on the Spark cloud and not poll too often. Your temperature sensor really can’t change quickly and polling every 10 seconds is plenty fast. In fact, polling every 15 minutes would probably be better for the cloud and not really any different for you, but I want you to see the time stamp update and know it is working.

Here is a screenshot:

Now here is the code:

<!DOCTYPE HTML>
<html>
  <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js" type="text/javascript" charset="utf-8"></script>
<body>
    <span id="temp">Waiting for data...</span><br>
    <span id="tstamp"></span><br>
 
    <script type="text/javascript">

      window.setInterval(function() {

        var deviceID = "<< device id >>";
        var accessToken = "<< access token >>";
        var varName = "temperature";

        requestURL = "https://api.spark.io/v1/devices/" + deviceID + "/" + varName + "/?access_token=" + accessToken;
        $.getJSON(requestURL, function(json) {
                 document.getElementById("temp").innerHTML = json.result + "&deg; F";
                 document.getElementById("temp").style.fontSize = "28px";
                 document.getElementById("tstamp").innerHTML = json.coreInfo.last_heard;
                 });
    }, 10000);
    </script>
</body>
</html>

We got rid of the button, so the JavaScript now runs on its own: we have wrapped the getJSON code from the example above in a window.setInterval() call that runs it every 10,000 ms, or every 10 seconds.
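
As a quick aside, and just as arithmetic on that interval argument, the gentler 15-minute rate mentioned earlier would look like this:

window.setInterval(function() {
    // same $.getJSON body as in the example above
}, 15 * 60 * 1000);  // 15 minutes = 900,000 ms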

Final Thoughts

OK, so the Spark core can push data using publish, and the web can pull data using AJAX and jQuery. Which is better? I think if you want to receive periodic data from the core, push is better. If you want more interactivity, then polling to pull data might be the way to go. But just polling on a fixed schedule, as in the second example here, is probably not the best way.

Just remember to be a good Spark cloud neighbor and not go crazy trying to pull data too fast. I know, for instance, that my house is about 140 ms away from the Spark cloud, up and back, so trying to poll for data faster than every second or two just does not make sense. Plus the sensor cannot change meaningfully in that short a time.

So go have fun and read your variables wherever you want now! Just remember to stay safe with that access token!


Thanks for another great tutorial!
The window.setInterval just automated my AJAX requests :slight_smile:


I am supposed to do yard work today, and this new project appears. I hope it rains.

Thanks for the example, AND for the detail in the logic and reasoning. People who are micro-skilled but not web-skilled get valuable front-end training from this, making us capable of reading and understanding training books!


Again, @bko, these are just fantastic, easy to understand tutorials!!! Thank you so much for the time you put into them!

I'm now wondering if all variables on a core can be displayed on the same page, since it seems like that would mean multiple $.getJSON statements on the same page - not sure if that's gonna work. Also, all variables on a core can be enumerated in a JSON response if the URL doesn't have a variable specified, so I wonder if those variables can be grabbed and then put into a for-each statement to display all variables. Glad it's raining this evening!


Hi @brianr

Yes, you can easily display all the variables from a core on the same page, but each variable needs its own $.getJSON since there is no cloud query that returns the values of all of them in one JSON. You can discover all the variables and functions on a core using a GET request on https://api.spark.io/v1/devices/<<device id>>/?access_token=<<access token>>

The JSON that comes back looks like this:

{
  "id": "<<device id herer>>",
  "name": "core_name_here",
  "connected": true,
  "variables": {
    "aSparkVariableHere": "int32"
  },
  "functions": [
    "somefunctionhere"
  ]
}

You would need to parse the JSON that gets returned, pull out the variables field (which is itself a JSON object mapping variable names to types), and then loop over those names.
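
As a rough sketch of what that could look like, reusing the same jQuery setup and << >> placeholders as the examples above, and just dumping each value into the page body:

<script type="text/javascript">
    var deviceID = "<< device id >>";
    var accessToken = "<< access token >>";

    // One $.getJSON per variable, just like the single-variable examples above
    function readOne(varName) {
        var varURL = "https://api.spark.io/v1/devices/" + deviceID + "/" + varName + "/?access_token=" + accessToken;
        $.getJSON(varURL, function(json) {
            document.body.innerHTML += varName + ": " + json.result + "<br>";
        });
    }

    // Ask the cloud which variables this core has, then read each one
    var infoURL = "https://api.spark.io/v1/devices/" + deviceID + "/?access_token=" + accessToken;
    $.getJSON(infoURL, function(info) {
        // info.variables maps each variable name to its type, e.g. {"aSparkVariableHere": "int32"}
        for (var varName in info.variables) {
            readOne(varName);
        }
    });
</script>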


I haven't gotten to parsing all variables yet, but I've gotten multiple variables working wonderfully! I'll post code in the hope that it helps someone else. I am interested in these two variables on the core:

Spark.variable("volts1", &volts1, STRING);
Spark.variable("count", &count, INT);

And the HTML. Notice that I broke the vars for device ID and accessToken out of the function so that they can be reused. Please feel free to comment on streamlining this code; I'm posting merely because it works (for me, anyway!). Also, please remember to be a good neighbor with AJAX refreshing, as every variable shown in this way is its own call to the cloud and could really cause a lot of traffic.

<!DOCTYPE HTML>
<html>
  <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js" type="text/javascript" charset="utf-8"></script>
<body>
    <span id="volts1spn">Waiting for volts...</span><br>
    <span id="countspn">Waiting for count...</span><br>
    <span id="tstamp"></span><br>

    <script type="text/javascript">
	var deviceID = "<<coreid>>;
	var accessToken = "<<access token>>";
		window.setInterval(function() {
        var varName = "volts1";

        requestURL = "https://api.spark.io/v1/devices/" + deviceID + "/" + varName + "/?access_token=" + accessToken;
        $.getJSON(requestURL, function(json) {
                 document.getElementById("volts1spn").innerHTML = json.result + " volts";
                 document.getElementById("volts1spn").style.fontSize = "28px";
                 document.getElementById("tstamp").innerHTML = json.coreInfo.last_heard;
                 });
		}, 10000);
		window.setInterval(function() {
        var varName = "count";

        requestURL = "https://api.spark.io/v1/devices/" + deviceID + "/" + varName + "/?access_token=" + accessToken;
        $.getJSON(requestURL, function(json) {
                 document.getElementById("countspn").innerHTML = "Loop has run " + json.result + " times.";
                 document.getElementById("countspn").style.fontSize = "28px";
                 });
		}, 10000);
    </script>
</body>
</html>

Can you supply the complete code on the Spark side?
All I see is "Spark.variable("temperature", &sparkTempF, DOUBLE);"

Here is the code from the Spark documentation page (it is very simple):

// -----------------
// Read temperature
// -----------------

// Create a variable that will store the temperature value
int temperature = 0;

void setup()
{
  // Register a Spark variable here
  Spark.variable("temperature", &temperature, INT);

  // Connect the temperature sensor to A7 and configure it
  // to be an input
  pinMode(A7, INPUT);
}

void loop()
{
  // Keep reading the temperature so when we make an API
  // call to read its value, we have the latest one
  temperature = analogRead(A7);
}

This now works, except the temperature conversion is wrong.
My attempt doesn't work.
It might also be because some examples use a TMP36 and others a DS18B20.
I have a TMP36.

// -----------------
// Read temperature
// -----------------

// Create a variable that will store the temperature value
int temperatureraw = 0;
float voltage=0;
float temperature=0;
void setup()
{
  // Register a Spark variable here
  Spark.variable("temperature", &temperature, INT);

  // Connect the temperature sensor to A7 and configure it
  // to be an input
  pinMode(A7, INPUT);
}

void loop()
{
  // Keep reading the temperature so when we make an API
  // call to read its value, we have the latest one
  temperatureraw = analogRead(A7);
 voltage = (analogRead(A7)* 3.3)/4095;
 temperature  = (voltage - 0.5) * 100;
}

I have DS18B20s in service right now, but I have a TMP36 somewhere in the junk box.

If it is reading warmer than expected, note that the Spark core does dissipate some heat, and another user found that heat was conducted on the breadboard to the sensor, raising its temperature a few degrees C. If you can isolate the TMP36 from the heat of the core, you may get a more accurate reading.

As to the code, you have changed the variable temperature to a float, but you have not changed the Spark.variable() call to tell the cloud it is a float, so the cloud thinks it is an integer. Right now, as the doc shows, the only supported types are INT, DOUBLE, and STRING. So you should change these lines:

...
double temperature = 0;
...

  Spark.variable("temperature", &temperature, DOUBLE);

You don't have to read the analog pin A7 twice; you can just use the temperatureraw value times 3.3. It would be good practice to tell the compiler you want the type of temperatureraw converted with a cast to double, so:

temperatureraw = analogRead(A7);
voltage = ( (double)temperatureraw * 3.3)/4095;

If you wrap the code in your posts as shown in this picture from BDub, it will be much easier to understand!


This code is now fixed, thanks.

// Read temperature for TMP36 IC. Connect to 0V, 3.3V, and A0
// Result in Celsius
// -----------------

// Create a variable that will store the temperature value
int temperatureraw = 0;
double voltage=0;
double temperature=0;
void setup()
{
  // Register a Spark variable here
// Spark.variable("temperature", &temperature, double);
  Spark.variable("temperature", &temperature, DOUBLE);

  // Connect the temperature sensor to A7 and configure it
  // to be an input
  pinMode(A7, INPUT);
}

void loop()
{
  // Keep reading the temperature so when we make an API
  // call to read its value, we have the latest one
  temperatureraw = analogRead(A7);
  voltage = ((double)temperatureraw * 3.3) / 4095;
  temperature = (voltage - 0.5) * 100;
}

Hi everybody!
I'm pretty new to the Spark core community, coming from France.
@bko, thank you very much for sharing all your projects; this is helpful for me!
Is there a way to limit the number of decimals, either directly in the variable (Spark core code) or in the HTML file (with the JSON format)?

There are a bunch of ways to do that! You could have a STRING Spark.variable() and use sprintf to control how many digits before and after the decimal you want, for instance. Or you could do some rounding in JavaScript.

One other way that has worked well for me is to use a "scaled integer", that is, an integer type where the LSB is not +/-1 but some fraction. Here's some code:

    double tempC =   ((double)intTemp);
    double tempF = (( tempC*9.0)/5.0+32.0)*10.0;  //times 10.0 for scaled int below
    scaledTemp =  (int32_t)tempF; // 724 means 72.4
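
On the web side you would then divide by 10 when displaying, for example with something like this in the getJSON callback from the examples above:

    document.getElementById("temp").innerHTML = (json.result / 10).toFixed(1) + "&deg; F";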

Thank you, I have found this, which is very simple:

document.getElementById("temp").innerHTML = json.result.toFixed(2) + "&deg;C";

:slight_smile:


I have been using the Spark.variable method to get data but with a node.js script instead of a web page. It’s nice that the example script in the documentation shows you how to simply log in and get a token so there’s no security risk.

How would you connect the node.js script to a web page? I’m a noob when it comes to web apps and node.js.

Also, while I have no trouble parsing the result from the JSON string, when I try to parse the variable name I get an 'unexpected token e' error.

Here’s the code.

var varCb = function(err, data) {
    if (err) {
      console.log('An error occurred while getting core attrs:', err);
    } else {
      console.log('Core attr retrieved successfully:', data);
      var varName = JSON.parse(data.name);
      var result = JSON.parse(data.result);
      adc2temp(Number(result), varName);
    }
};

It's modified from edgarsilva's examples on GitHub.

Hi @gin

Maybe @Dave could help out with node.js questions; I would just use JavaScript on a web page.

For your JSON question, I think you just need data.name and not JSON.parse(data.name) to get the name. The name itself is not JSON; it is a field in an object that has already been parsed.
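
In other words, a sketch of your callback with just that change (and reading data.result directly too, the same way the web examples use json.result):

var varCb = function(err, data) {
    if (err) {
      console.log('An error occurred while getting core attrs:', err);
    } else {
      console.log('Core attr retrieved successfully:', data);
      // data is already a parsed object, so just read the fields directly
      var varName = data.name;
      var result = data.result;
      adc2temp(Number(result), varName);
    }
};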

@bko That fixed it. Thanks.


Good question! You can host whole web applications directly from Node.js, and there are a bunch of great modules to help with this. One of the most popular is http://expressjs.com/. They have great documentation and getting-started tutorials on their site.

I hope that helps! :slight_smile:

Thanks,
David


Thanks for all the info you have provided in the forum. I found most of the information I needed in your posts. However, the field that worked for me was json.coreInfo.last_heard. Can you point me to more information on coreInfo? I am looking to use the current time in a thermostat project. Again, thanks for all the information you contribute to this forum. I have learned a lot from you.

Hi @Stickman

If I understand your question right, you want to get the last_heard field from the cloud. That does not come with every API endpoint return value. You can use the "devices" endpoint to find it for all cores registered to your ID, like this:

curl https://api.spark.io/v1/devices/?access_token=<<hex token here>>

which returns an array of JSON structures, one per core:

[ {<<core1 data>>},
  {<<core2 data>>},
...
  {<<coreN data>>}]

where each core data JSON has a last_heard field in these forms:

 {
    "id": "<<hex id number1>>",
    "name": "<<core name1>>",
    "last_app": null,
    "last_heard": null,
    "connected": false
  },
 {
    "id": "<<hex id number2>>",
    "name": "<<core name2>>",
    "last_app": null,
    "last_heard": "2014-11-19T04:03:10.990Z",
    "connected": true
  },
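
To pull that into a web page, a minimal sketch along the lines of the earlier examples could look like this (it assumes the same jQuery include and a <span id="tstamp"> element, and the << >> placeholders are yours to fill in):

<script type="text/javascript">
    var accessToken = "<< access token >>";
    var coreName = "<< core name >>";

    // Fetch the array of core-data JSON structures shown above
    var requestURL = "https://api.spark.io/v1/devices/?access_token=" + accessToken;
    $.getJSON(requestURL, function(devices) {
        for (var i = 0; i < devices.length; i++) {
            // Show the last_heard timestamp for the core we care about
            if (devices[i].name === coreName) {
                document.getElementById("tstamp").innerHTML = devices[i].last_heard;
            }
        }
    });
</script>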