One of the more fun things to get your Core to do is interact with the real world through sensors.
The beauty of the Core, perhaps over the Arduino, is that you can access this sensor data from anywhere, with no extra shields and no punching holes in NAT routers.
A few people have asked how they can log data and draw graphs from it, so this weekend I spent some time working out how to do that with the spreadsheets available in Google Drive.
What you’ll need:
- A working Core with sensor(s), exposed using a Spark.variable (basic example below).
- A Google account, so that you can access Google Drive
- A bit of Google Apps Script (aka JavaScript) magic (again, basic example below).
I used JSON to format the data from my Core, so that it's already in a format the Google Apps Script can easily parse.
char resultstr[64];

void setup()
{
    pinMode(A0, INPUT); // set up A0 as an analog input
    pinMode(A1, INPUT); // set up A1 as an analog input
    // expose your char buffer to the Cloud API
    Spark.variable("result", &resultstr, STRING);
}

void loop()
{
    int data1 = analogRead(A0); // read some data
    int data2 = analogRead(A1); // read some other data
    // format your data as JSON; don't forget to escape the double quotes
    sprintf(resultstr, "{\"data1\":%d,\"data2\":%d}", data1, data2);
    delay(1000); // wait for a second
}
Test that this works with Postman (or your favourite RESTful client app). You should see a request like this…
https://api.spark.io/v1/devices/YOUR-DEVICE-ID/result?access_token=YOUR-ACCESS-TOKEN
…return something like:
{
  "cmd": "VarReturn",
  "name": "result",
  "result": "{\"data1\":23,\"data2\":26}",
  "coreInfo": {
    "last_app": "",
    "last_heard": "2014-02-17T20:37:53.300Z",
    "connected": true,
    "deviceID": "YOUR-DEVICE-ID-HERE"
  }
}
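If you prefer the command line over Postman, the same check works with curl (substitute your own device ID and access token):

curl "https://api.spark.io/v1/devices/YOUR-DEVICE-ID/result?access_token=YOUR-ACCESS-TOKEN"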
In Google Drive:
- Create -> Spreadsheet
- Tools -> Script Editor
- Add the code below, amended accordingly
- Resources -> Current Project’s Triggers
- Fill in: collectData - Time-driven - Minutes timer - Every 15 minutes (or create the trigger from code; see the snippet after the script below)
Google Apps Script code for the Script Editor:
function collectData() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var response = UrlFetchApp.fetch("https://api.spark.io/v1/devices/YOUR-DEVICE-ID/result?access_token=YOUR-ACCESS-TOKEN");
  try {
    var json = JSON.parse(response.getContentText()); // parse the JSON the Core API created
    var result = unescape(json.result); // you'll need to unescape before you parse it as JSON
    try {
      var p = JSON.parse(result); // parse the JSON you created
      var d = new Date(); // time stamps are always good when taking readings
      sheet.appendRow([d, p.data1, p.data2]); // append the date, data1 and data2 to the sheet
    } catch (e) {
      Logger.log("Unable to do second parse");
    }
  } catch (e) {
    Logger.log("Unable to parse returned JSON");
  }
}
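By the way, if you'd rather not click through the triggers dialog, you can create the same time-driven trigger from code using the standard ScriptApp API. Run something like this once from the Script Editor (a sketch; the function name matches the collectData above):

function createTrigger() {
  // run this once: fires collectData() every 15 minutes
  ScriptApp.newTrigger("collectData")
    .timeBased()
    .everyMinutes(15)
    .create();
}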
With the trigger set, this connects to your Core every 15 minutes, parses the data returned, and appends it to the spreadsheet auto-magically. You don't even have to keep the spreadsheet open.
TIP: Don't make the frequency too high, otherwise the spreadsheet fills up pretty fast!
As with other Google Drive spreadsheets, it's then dead easy to go Insert -> Chart to create a graph that automatically updates as your data comes in.
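If you want to script the chart too, an EmbeddedChartBuilder sketch like the one below should do it; the A:C range and the anchor position are my assumptions based on the three columns appendRow writes, so adjust to taste:

function addChart() {
  var sheet = SpreadsheetApp.getActiveSheet();
  // plot columns A-C (date, data1, data2); anchor the chart at row 1, column 5
  var chart = sheet.newChart()
    .setChartType(Charts.ChartType.LINE)
    .addRange(sheet.getRange("A:C"))
    .setPosition(1, 5, 0, 0)
    .build();
  sheet.insertChart(chart);
}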
The code can probably be improved somehow, but I hope that's of some use to someone!
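For example, one easy win would be checking the HTTP status before parsing, so a flaky connection gets logged instead of throwing. A minimal sketch using UrlFetchApp's muteHttpExceptions option:

function collectDataSafely() {
  var url = "https://api.spark.io/v1/devices/YOUR-DEVICE-ID/result?access_token=YOUR-ACCESS-TOKEN";
  // muteHttpExceptions stops fetch() throwing on non-2xx responses
  var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
  if (response.getResponseCode() !== 200) {
    Logger.log("Fetch failed with HTTP " + response.getResponseCode());
    return;
  }
  // ...then parse and append exactly as in collectData() above
}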