Example: Logging and graphing data from your Spark Core using Google

Like many other people, I find one of the more fun things to get your :spark: Core to do is to interact with the real world using sensors.

The beauty of the Core, perhaps over the Arduino, is the ability to access this sensor data from anywhere :smile: without the need for additional shields, punching holes in NAT routers, etc.

A few people have asked how they can log data, and draw graphs from it - so this weekend I spent some time finding out how to do this with the spreadsheets available in Google Drive.

What you’ll need:

  • Working :spark: Core - with sensor(s) - that you expose using a Spark.variable (basic example below).
  • A Google account, so that you can access Google Drive
  • A bit of Google Apps Script (aka Javascript) magic (again, basic example below).

I used JSON to format the data from my :spark: Core, so that it was already in a format that’s easily parse-able by the Google Apps Script.

char resultstr[64];

void setup() {
    pinMode(A0, INPUT); // set up A0 as an analog input
    pinMode(A1, INPUT); // set up A1 as an analog input
    // expose your char buffer to the Cloud API
    Spark.variable("result", resultstr, STRING);
}

void loop() {
    int data1 = analogRead(A0); // read some data
    int data2 = analogRead(A1); // read some other data
    // format your data as JSON, don't forget to escape the double quotes
    sprintf(resultstr, "{\"data1\":%d,\"data2\":%d}", data1, data2);
    delay(1000); // wait for a second
}

Test that this works with Postman (or your favourite RESTful client app). A GET request to https://api.spark.io/v1/devices/YOUR-DEVICE-ID/result?access_token=YOUR-ACCESS-TOKEN should return something like:

    "cmd": "VarReturn",
    "name": "result",
    "result": "{\"data1\":23,\"data2\":26,}",
    "coreInfo": {
        "last_app": "",
        "last_heard": "2014-02-17T20:37:53.300Z",
        "connected": true,
        "deviceID": "YOUR-DEVICE-ID-HERE"

In Google Drive:

  • Create -> Spreadsheet
  • Tools -> Script Editor
  • Add the code below, amended accordingly
  • Resources -> Current Project’s Triggers
  • Run: collectData - Time-driven - Minutes timer - Every 15 minutes

Google Apps Code for Script Editor:

function collectData() {
  var sheet = SpreadsheetApp.getActiveSheet();

  var response = UrlFetchApp.fetch("https://api.spark.io/v1/devices/YOUR-DEVICE-ID/result?access_token=YOUR-ACCESS-TOKEN");

  try {
    var parsed = JSON.parse(response.getContentText()); // parse the JSON the Core API created
    var result = unescape(parsed.result); // you'll need to unescape before you parse as JSON
    try {
      var p = JSON.parse(result); // parse the JSON you created
      var d = new Date(); // time stamps are always good when taking readings
      sheet.appendRow([d, p.data1, p.data2]); // append the date, data1, data2 to the sheet
    } catch(e) {
      Logger.log("Unable to do second parse");
    }
  } catch(e) {
    Logger.log("Unable to parse returned JSON");
  }
}

This then connects to your :spark: Core every 15 minutes, parses the data returned, and appends it to the spreadsheet auto-magically. You don’t even have to keep the spreadsheet open :smiley:

TIP: Don’t make the frequency toooo frequent, otherwise the spreadsheet fills up pretty fast!!

As with other Google Drive spreadsheets, it’s then dead easy to go Insert -> Chart, to create a graph that automatically updates as your data comes in.

The code can probably be improved somehow, but I hope that’s of some use to someone :slight_smile:


Sounds good. I have a high-accuracy temp & humidity sensor arriving tomorrow and I’m looking for a good way to log and graph my data. Xively does an excellent job of accepting and logging my data, but so far I haven’t found an easy way to graph that data the way I want it.

There is Plotly.com but the free accounts limit you to 20,000 data points.

I bet there are plenty of graphing functions built into Excel that should be fairly easy to use.

Do you have any examples of graphs that you have created with this process?

I’ll give it a try soon as I get my temp/humidity sensor setup.

I did it with low accuracy temp and humidity sensors, i.e. the good old DHT11 and a lm335z. It took the readings from inside my airing cupboard all night (every 15 mins).
I’ll try to post the graph it produced.

Great work… It works for me… I am using a DHT11, reading temp and humidity, and I also created a graph.



Post a screen shot of your graph if you can @Yasin

Great stuff! I am gathering data from an 18B20 temperature sensor and I will let run for a day and plot the data. I also want to setup another core with a DHT22 temp/humidity sensor to do the same thing but it’s getting late. :koala:

@RWB Please look below image. This sheet takes data automatically every hour with DHT11.


And here is mine from an 18B20 temperature sensor running overnight, sampled at 15min intervals.


Cool. The graph looks nice.

I’ll post some once i get this up and running.

And mine:

This is sweet! Totally going to use this. :slight_smile:


@binaryfrost That looks very nice! Can you share a little about how you created this style of graph? Did you do this in Google or with Excel on your laptop?

@RWB This was done from Google Drive, from within the same Spreadsheet which was populated by the Apps Script / :spark: Core.

It’s called a ‘Smooth Line Chart’. Google provides a number of nice capabilities for smoothing, trending, pie, line, bar, etc.

I don’t use Excel, as I’m a GNU/Linux user.

Thanks @Dave - glad you found it useful.


This is really cool! Another cool thing to do with this is to put it on a live webpage so anyone can see it. I’d recommend http://misoproject.com/dataset/tutorials/googlespreadsheets for getting the data out of the Google spreadsheet into a page, and then something like D3 or NVD3 to visualize that data. If you publish the page (File > Publish to the web) and give the key to me, I’ll try to build something with it :smile:

One thing I’m not sure of is whether publishing it will expose the Google Apps Script code, along with your access token and device ID. I should hope not, but to check: publish it, save the publish link, then log out of your Google account (or ask someone you trust) and try to reach that code via the link. I’m not familiar enough with the Google Apps setup, and don’t have a good data source, to try it myself.


@BDub @binaryfrost @peekay123

Below is the result I get back when I do a GET request from the code above. It’s not pulling the correct temp_f or humidity data. I’m not sure what I have wrong, but I have tried a few things with no luck.

Here is my code, see anything wrong? :

unsigned long wait = millis();              // delay timer to create a non-delay() x-second sampling timer
const unsigned long waittime = 30*1000L;

// Specify data and clock connections and instantiate SHT1x object
#define dataPin  D0
#define clockPin D1
SHT1x sht1x(dataPin, clockPin);
int led = D7; // this is our onboard LED pin
char resultstr[64]; // for JSON

void setup() {
   Serial.begin(9600); // open serial connection to report values to host
   Serial.println("Starting up");
   pinMode(led, OUTPUT);
   Spark.variable("result", resultstr, STRING);  // expose your char buffer to the Cloud API
}

void loop() {
  float temp_f;
  float humidity;

  if (millis() > wait) {
    wait = millis() + waittime;

    // Read values from the sensor
    temp_f = sht1x.readTemperatureF();
    humidity = sht1x.readHumidity();
    // Print the values to the serial port
    Serial.print("Temperature: ");
    Serial.print(temp_f, 1);
    Serial.print("F. Humidity: ");
    Serial.println(humidity, 1);
    // format your data as JSON, don't forget to escape the double quotes
    sprintf(resultstr, "{\"temp_f\":%d,\"humidity\":%d}", temp_f, humidity);
    delay(1000); // wait for a second
    digitalWrite(led, HIGH);
    digitalWrite(led, LOW);
    digitalWrite(led, HIGH);
    digitalWrite(led, LOW);
  }
}

try %f for floats, %d is for ints

reference here: http://www.cdf.toronto.edu/~ajr/209/notes/printf.html

@Dave Thanks! That fixed it up!

I have everything up and running now! AWESOME.

I learned quite a few things by following this easy tutorial when it comes to getting data available via the Spark Cloud, and then how to do a GET request with Postman.

I’ll post some graphs later on.


I found a few cool features that the Google Drive Spreadsheet has when it comes to graphing your Spark Core imported data.

You can create gauges if you want, see below:

Google will actually generate code for you in the chart setup box that lets you embed an image or an actual interactive live graph on any webpage, viewable by anybody. This is really cool and something that was really hard to do with Xively.

You can find the code to post your graph online by clicking the little box at the top right of the graph; once the drop-down appears, choose Publish Chart and it will give you a choice between an interactive version or a simple PNG image of the graph. Very cool.

You can take that code and give it a try to see what it will look like if you posted the HTML to a webpage by pasting it here: http://www.draac.com/htmltester.html

I’m very excited to see how I and others put this to use. It’s by far the easiest way to get data out of the Spark Core and onto a webpage in a nice-looking format. Very exciting! :smiley:

And another graph I just found that is pretty cool.


Glad you’re finding it useful @RWB.
I’ve never used Xively, but the flexibility Google Drive gives certainly makes life easy!

@binaryfrost Yeah, I have no need for Xively for now, thanks to your tutorial.

Another graph showing the different environments I put the sensor in.