Multiple Cores Publishing to One Web Page

This is a mini-tutorial!

Let’s say you have a sensor farm of Spark cores, all publishing their data to the Spark Cloud. You need a way to see all that data in near real-time, all on one dashboard. So how can you do it? There are lots of ways, but here’s a simple one that you can edit and adapt to meet your needs.

First, a couple of in-action screenshots, starting with the event named “Uptime”:

And here’s another shot for the event named “Temp”:

As you can see, you enter your event name (or you can hard-code it if you like), and the web page registers all the cores that are broadcasting that event, building the table dynamically as the events come in.

So you start off with just the header row in the table, and one by one, as events come in, rows for the unique core IDs are added. When new data comes in for a core ID already in the table, only the data and timestamp fields are updated. When you get an event from a new core not in the table, a row is added at the bottom of the table.

No data is stored here permanently–you just get the current values all on one dashboard.
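Stripped of the DOM, the update-or-append logic can be sketched as a plain function over an array of rows (a simplified model for illustration; `coreid`, `data`, and `published_at` are the fields a Spark event’s JSON payload carries):

```javascript
// Update the row for a core if present; otherwise append a new row.
// `rows` is a plain array standing in for the table body.
function updateOrAppend(rows, evt) {
  for (var i = 0; i < rows.length; i++) {
    if (rows[i].coreid === evt.coreid) {
      rows[i].data = evt.data;                 // refresh the data field...
      rows[i].published_at = evt.published_at; // ...and the timestamp
      return rows;
    }
  }
  // New core: add a row at the bottom
  rows.push({ coreid: evt.coreid, data: evt.data, published_at: evt.published_at });
  return rows;
}

var rows = [];
updateOrAppend(rows, { coreid: "A", data: "1", published_at: "t1" });
updateOrAppend(rows, { coreid: "B", data: "2", published_at: "t2" });
updateOrAppend(rows, { coreid: "A", data: "3", published_at: "t3" });
// rows now has two entries; core "A" holds its latest data, "3"
```

The real page below does the same thing, just with `dt.rows` and `innerHTML` instead of an array.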

Here’s the code–don’t forget:

  • To put your own access token in this file
  • To keep this private and not put this on the internet!
<!DOCTYPE HTML>
<html>
<body>
    <p>Event name: <input type="text" name="eventNameBox" id="evText"></p>
    <br><br>
    <table id="dataTable" width="500px" border="2">
      <tr>
        <td> Core ID </td>
        <td> Data </td>
        <td> Timestamp </td>
      </tr>
    </table>
    <br><br>
    <button id="connectbutton" onclick="start()">Connect</button>

    <script type="text/javascript">

    function start() {
        document.getElementById("connectbutton").innerHTML = "Running";
        var eventName = document.getElementById('evText').value;
        var accessToken = "<< access token >>";
        var requestURL = "https://api.spark.io/v1/events/?access_token=" + accessToken;

        // Open a Server-Sent Events stream to the Spark Cloud
        var eventSource = new EventSource(requestURL);

        eventSource.addEventListener('open', function(e) {
            console.log("Opened!"); }, false);

        eventSource.addEventListener('error', function(e) {
            console.log("Errored!"); }, false);

        eventSource.addEventListener(eventName, function(e) {
            var parsedData = JSON.parse(e.data);
            var dt = document.getElementById("dataTable");
            var rows = dt.rows.length;
            var foundIt = false;
            // If this core already has a row, update its data and timestamp
            for (var i = 0; i < rows; i++) {
                var rowN = dt.rows[i];
                if (!foundIt && rowN.cells[0].innerHTML == parsedData.coreid) {
                    foundIt = true;
                    rowN.cells[1].innerHTML = parsedData.data;
                    rowN.cells[2].innerHTML = parsedData.published_at;
                }
            }
            // Otherwise add a new row at the bottom of the table
            if (!foundIt) {
                var newRow = dt.insertRow(rows);
                var cell1 = newRow.insertCell(0);
                var cell2 = newRow.insertCell(1);
                var cell3 = newRow.insertCell(2);
                cell1.innerHTML = parsedData.coreid;
                cell2.innerHTML = parsedData.data;
                cell3.innerHTML = parsedData.published_at;
            }
        }, false);
    }
    </script>
</body>
</html>

One of the big differences between Spark.variable() and Spark.publish() is that with publishing, the browser needs only one network connection back to the Spark Cloud per event stream, no matter how many cores are publishing.
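That single connection works because the cloud sends Server-Sent Events: one open stream carries the interleaved events of every core. Here’s a rough sketch of how one stream’s text multiplexes multiple cores (a simplified parser for illustration only; real browsers do this inside `EventSource`, and the core IDs below are made up):

```javascript
// Simplified parse of a Server-Sent Events stream: blocks are separated
// by a blank line, and each block has "event:" and "data:" fields.
function parseSSE(streamText) {
  return streamText.split("\n\n").filter(Boolean).map(function (block) {
    var evt = {};
    block.split("\n").forEach(function (line) {
      var idx = line.indexOf(": ");
      var field = line.slice(0, idx);
      var value = line.slice(idx + 2);
      if (field === "event") evt.name = value;
      if (field === "data") evt.payload = JSON.parse(value);
    });
    return evt;
  });
}

// Two different cores publishing the same event name over ONE stream
var stream =
  'event: Temp\n' +
  'data: {"data":"72","coreid":"53ff6f06...","published_at":"2014-03-01T12:00:00.000Z"}\n\n' +
  'event: Temp\n' +
  'data: {"data":"68","coreid":"48ff6a06...","published_at":"2014-03-01T12:00:05.000Z"}\n\n';

var events = parseSSE(stream);
console.log(events.length);            // both events arrive on the single connection
console.log(events[1].payload.coreid); // each one carries its own coreid
```

Reading a Spark.variable(), by contrast, costs one HTTPS request per core per read.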


bko, you are so good at these tutorials. Thanks for your fantastic contributions!


I second that opinion. I'm actually a little jealous that you are cranking out this content so quickly! The web is my specialty, and you're beating me at it! That's fine, though. Competition means more good stuff for users. However, the volume of tutorials is snowballing. I need to get back on the wiki-ish research so we can store this stuff without it getting stale.


@bko Fabulous! Thank you. This is most helpful, but could you add a few lines to echo the data back to the cores? With that, I think I can insert my database access stuff and really be set. Thanks again for the responsiveness.


@bko, I apologize, but I can’t seem to locate the spark-core code that corresponds to this HTML example. I would appreciate the link again. Thanks.

Hi @captnkrunch

This tutorial is all about the web side, so you can use it with just the public event stream, looking at other folks’ public events. If you want to look at temperature, this is a good start:

http://docs.spark.io/#/examples/measuring-the-temperature

You will need to do something more like this:

// -----------------
// Read temperature
// -----------------

// Create a variable that will store the temperature value
int temperature = 0;
unsigned long lastTime = 0;
char printStr[30];

void setup()
{
  // Register a Spark variable here
  Spark.variable("temperature", &temperature, INT);

  // Connect the temperature sensor to A7 and configure it
  // to be an input
  pinMode(A7, INPUT);
}

void loop()
{
  // Keep reading the temperature so when we make an API
  // call to read its value, we have the latest one
  temperature = analogRead(A7);

  unsigned long currTime = millis();
  if ( currTime-lastTime > 5000) {  //every 5 seconds
    lastTime = currTime;
    sprintf(printStr, "%d", temperature);
    Spark.publish("Temperature",printStr);
  }
}

Once again I am very grateful. Thank you. I really do understand the general purpose of the previous html program but you should be sensitive to us simple-minded arduino programmers who actually believed the purpose of the spark-core was to communicate. Thus, I highly recommend, until the documentation is actually usable, that almost all examples carry both sides of the equation. That will make it easier for us non-linux, non-webhead customers to deduce the actual meaning (and in many cases, the existence) of the many mysterious constructs that are apparently central to spark-core use. By having both sides, it also increases the confidence required to experiment.


Hi,
Thanks! By following along I am able to publish data, but how would I publish a state, e.g. “ON” or “OFF”?

I believe that with `String stringone = "ON"` I am on the right track, but how do I set it up so it passes through snprintf?
Thanks!

If you’re passing a simple string such as “ON” or “OFF”, you won’t need to pass it through the sprintf(...) call in the code above. You can simply do something like:

String myState = "ON";
Spark.publish("MyState", myState);

Add your own `if ... else` logic to determine what the state really is.

@wgbartley
Thanks!
I will give it a try!!


@bko, again a great tutorial!!! Many thanks for that… I understand it is possible to use DS1860 temperature sensors on the Core. Is there a library just like the one in the Arduino world? If so, where can I find it?

Thanks again for your great help, André

Hi @pa0akv

I use the DS18B20 sensors all the time–they are great parts. I don’t know the DS1860, is it similar?

The Spark team is working on including the one-wire library just like other built-in libraries, but it is not there yet. In the meantime @Dave has some good code on github in a gist:


Hi @bko,
May I know where to locate the HTML file?

Hi @JAlonso

The HTML file is in the very first post at the top of this thread–just keep scrolling up. You need to save it and edit in your access token.

Sorry, I’m new here. I did see the code, but what is the access token? Does the code go together with the TCP client code? Is it stored in the .html file? I didn’t know where to place the HTML code.

You first have to download the HTML code, which you save in a new file with the .html extension.
Your access token can be found in the settings of the Web IDE. You need to copy it and paste it into the HTML file where it says “<< access token >>”.
After that, you should be able to just double click your newly created HTML file, which should then open in a browser.

Let us know if you need any more help!


thank you @Moors7 and @bko


Hi @bko, it’s me again. The Core ID shown in the HTML file is not mine, and I am not sure whose data it is. :smile: Why is that? Can I control my core (like turning a pin on or off) through an HTML file on my PC? BTW, I did https://community.spark.io/t/tiny-webserver-code/3297/ and it works well. Thanks a lot :smile:

Hi @JAlonso

You are looking at the public event stream with the URL I used in this tutorial, but by changing the URL to include a devices part (and possibly a core ID), you can look at only your devices, or only a specific device.

Here’s the doc section:

http://docs.spark.io/api/#reading-data-from-a-core-events
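A small sketch of how the stream URL changes (the endpoint paths follow the doc section linked above; the device ID here is a made-up example):

```javascript
// Build an event-stream URL: all of *your* devices, or one specific device.
// The tutorial's page instead uses the firehose of all *public* events:
//   https://api.spark.io/v1/events/?access_token=TOKEN
function eventURL(accessToken, deviceId) {
  var base = "https://api.spark.io/v1";
  if (deviceId) {
    // Events from one specific core only
    return base + "/devices/" + deviceId + "/events/?access_token=" + accessToken;
  }
  // Events from all cores that belong to this access token
  return base + "/devices/events/?access_token=" + accessToken;
}

console.log(eventURL("TOKEN"));
console.log(eventURL("TOKEN", "0123456789abcdef"));
```

To use this in the tutorial page, you would swap the constructed URL in for `requestURL` before creating the `EventSource`.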


Hi @bko, can I just ask how I can show multiple cores on a web page with Google Maps? Like, if multiple cores are active, it will display multiple markers on the map? Thanks!