Example: Logging and graphing data from your Spark Core using Google Drive

Send 3D.js data from the Spark Core? Or to Google Drive Spreadsheets?

I think you mean http://d3js.org/ ?

D3 is a great javascript library for making interactive data visualizations. If you wanted to work with that, it’s just a matter of storing your data somewhere, and building a small page or widget with D3 that does something with that data. If I get a chance I’ll whip up a D3 demo if someone doesn’t beat me to it first. :smile:
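To give a flavor of what that could look like, here's a rough, untested sketch: a bare-bones page that polls a Spark variable and draws a simple bar chart with D3. The device ID, access token, and "result" variable name are placeholders, and it assumes the readings are numeric values in roughly the 0-100 range.

&lt;!-- Hypothetical sketch: poll a Spark variable and draw one bar per reading --&gt;
&lt;script src="http://d3js.org/d3.v3.min.js"&gt;&lt;/script&gt;
&lt;svg width="400" height="100"&gt;&lt;/svg&gt;
&lt;script&gt;
var url = "https://api.spark.io/v1/devices/YOUR_DEVICE_ID/result?access_token=YOUR_ACCESS_TOKEN";
var readings = [];

setInterval(function () {
  d3.json(url, function (error, json) {
    if (error || !json) return;
    readings.push(parseFloat(json.result)); // assumes result is numeric, ~0-100
    if (readings.length > 20) readings.shift(); // keep only the last 20 points

    var bars = d3.select("svg").selectAll("rect").data(readings);
    bars.enter().append("rect");
    bars.exit().remove();
    bars.attr("x", function (d, i) { return i * 20; })
        .attr("y", function (d) { return 100 - d; })
        .attr("width", 18)
        .attr("height", function (d) { return d; });
  });
}, 5000); // poll every 5 seconds
&lt;/script&gt;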

Thanks,
David

@Dave I’d love to see some different data visualization options.

D3 is amazing, but I have very minimal experience with it.
RWB. If you can put something together, I’ll try incorporating it in my Spark Tools web app.

lol Thanks @Dave I need to be more careful with my typing! Yes [quote=“Dave, post:57, topic:2929”]
I think you mean http://d3js.org/ ?
[/quote]

That is what I mean. Looks great.

I like the “it’s just a matter of…” comment. In a similar vein, climbing Mount Everest is just a matter of putting one foot in front of the other a few times! :slight_smile:

A Demo / tutorial would be very welcome. Thanks for your positive response.

I’m trying to make that “just a matter of storing your data somewhere” comment true. I have a web application that makes it that simple.
Thread here.

2 Likes

Hi All, just to share my implementation of a pulse-based energy meter, based on the code of the topic starter. This was my attempt to get a graph of my home energy consumption from the light pulse that my energy meter emits for every watt-hour consumed. By measuring the time between pulses, you can calculate the current power consumption. Below is the code I put into the Spark:

// Define the pins we're going to call pinMode on
int sensor = D0;
char resultstr[64];
// Shared with the interrupt handler, so marked volatile
volatile unsigned long pulsearray[100];
volatile unsigned int pulsearrayindex = 0;
unsigned int storearrayindex = 0;

void blink();
int getLastData(String args);

void setup() {
  pinMode(sensor, INPUT_PULLUP);
  attachInterrupt(sensor, blink, FALLING);
  Spark.variable("result", resultstr, STRING);
  Spark.function("getLastData", getLastData);
}

void loop()
{

}

// Copies the oldest unread pulse timestamp into resultstr as JSON and
// advances the read index; returns the write index, or -1 if the
// buffer holds no unread entries
int getLastData(String args)
{
    if(storearrayindex != pulsearrayindex)
    {
        sprintf(resultstr, "{\"time\":%lu,\"idx\":%u}", pulsearray[storearrayindex], storearrayindex);

        storearrayindex++;
        if(storearrayindex > 99)
        {
            storearrayindex = 0;
        }
        return pulsearrayindex;
    }
    else
    {
        return -1;
    }
}

// Interrupt handler: stores the time of each pulse in a circular
// buffer of 100 items
void blink()
{
    pulsearray[pulsearrayindex] = millis();
    pulsearrayindex++;
    if(pulsearrayindex > 99)
    {
        pulsearrayindex = 0;
    }
}

And this is what I put into the Google Apps Script to fill a Google spreadsheet with data:


function collectData() {
  var  sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Sheet1");

  while(1)
  {
    var returnval = -1;

    var payload = {
      "access_token" : "YOUR_ACCESS_TOKEN",
      "args" : ""
    };
    var options = {
      "method" : "POST",
      "payload" : payload
    };
    
    var url = "https://api.spark.io/v1/devices/YOUR_DEVICE_ID/getLastData";
    var response1 = UrlFetchApp.fetch(url, options);
    
    try {
      var response2 = JSON.parse(response1.getContentText()); // parse the JSON the Core API created
      returnval = unescape(response2.return_value); // unescape before parsing as JSON
    } catch(e)
    {
      Logger.log("Unable to parse returned JSON (call 1)");
    }
    
    if(returnval == -1)
    {
      break;
    }
    
    var response3 = UrlFetchApp.fetch("https://api.spark.io/v1/devices/YOUR_DEVICE_ID/result?access_token=YOUR_ACCESS_TOKEN");
    try {
      var response4 = JSON.parse(response3.getContentText()); // parse the JSON the Core API created
      var result = unescape(response4.result); // unescape before parsing as JSON
      try {
        var p = JSON.parse(result); // parse the JSON you created

        var d = new Date(); // time stamps are always good when taking readings
        sheet.appendRow([d, p.time, p.idx]); // append the date, data1, data2 to the sheet
      } catch(e)
      {
        Logger.log("Unable to do second parse2");
      }
    } catch(e)
    {
      Logger.log("Unable to returned JSON2");
    }
  }
}

Now, if you run this script directly, it works like a charm, retrieving all entries in the buffer and adding them to the spreadsheet. If you add a column with the following formula in the first cell of column D:

 =ArrayFormula(if(iserror((B2:B - B1:B) = 0),iferror(1/0),if((B2:B - B1:B) = 0,iferror(1/0),3600/((B2:B - B1:B)/1000))))

you can get the instantaneous power and plot a nice graph, as long as you do not miss any pulses (check idx for this; it should increment by one on each entry).
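To make the formula concrete: with one pulse per watt-hour and a gap of Δt milliseconds between consecutive pulse timestamps in column B, the power is 3600 / (Δt / 1000) watts. So a gap of 3,600 ms works out to exactly 1,000 W, and a gap of 7,200 ms to 500 W.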

The only problem I have is that when you let this script run on a timer event, weird stuff starts to happen, which seems to be related to UrlFetchApp.fetch failing more than 50% of the time for no apparent reason. I read up on numerous topics in the Google Drive forum related to this problem, and found only a response from Google saying they would look into it. I guess you could improve the script by having it gracefully detect a failed fetch and retry, but I’m afraid it would start bombarding the Spark cloud service unnecessarily, and you would also need to update the Spark code to gracefully handle the retransmission, as currently a retry will cause it to skip an entry. In any case, this made the script unusable for my application, so I’m abandoning this attempt at storing my energy consumption in Google Drive. Instead I will try to build local storage through a TCP service or something.
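For anyone who wants to try the graceful-retry route, here is an untested sketch of what that could look like (fetchWithRetry is a hypothetical helper name, not from this thread). Note that retrying after the Core has already served an entry would still skip it, unless the firmware is changed to re-send unacknowledged entries.

// Untested sketch: retry a flaky UrlFetchApp.fetch a few times
function fetchWithRetry(url, options, maxAttempts) {
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      var response = UrlFetchApp.fetch(url, options);
      if (response.getResponseCode() == 200) {
        return response; // success
      }
      Logger.log("Attempt " + attempt + " returned HTTP " + response.getResponseCode());
    } catch (e) {
      Logger.log("Attempt " + attempt + " threw: " + e);
    }
    Utilities.sleep(1000); // short pause so we don't hammer the cloud
  }
  return null; // every attempt failed
}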

But I thought I’d post here my progress so far, for anyone who would like to have another go at it. If you’re able to fix it, I’d be all ears :wink:

Regards,

Robin

1 Like

Hello,
Is there a way to pull the data into the spreadsheet more often than once a minute? I have an accelerometer which I would like to graph.
Thanks!
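Time-driven Apps Script triggers can’t fire more often than once per minute, but one possible workaround (an untested sketch; collectDataFast and the URL placeholders are illustrative) is to loop inside a single trigger run and take several samples per minute:

// Untested sketch: take several sub-minute samples inside one
// time-driven trigger run
function collectDataFast() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Sheet1");
  var url = "https://api.spark.io/v1/devices/YOUR_DEVICE_ID/result?access_token=YOUR_ACCESS_TOKEN";

  for (var i = 0; i < 5; i++) { // 5 samples ~10 s apart, under the 1-minute trigger period
    try {
      var response = JSON.parse(UrlFetchApp.fetch(url).getContentText());
      sheet.appendRow([new Date(), unescape(response.result)]);
    } catch (e) {
      Logger.log("Sample " + i + " failed: " + e);
    }
    Utilities.sleep(10000); // wait ~10 seconds before the next sample
  }
}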

First of all, this is great. I’m using the exact code provided.
I am testing this out with the time interval set to 1 minute (for testing), but I see times between points ranging from 4 seconds to over 3 minutes… what affects the time between data points being recorded?

I worry that if these are all “missed” collection attempts, then choosing a 15-minute interval would record intervals that vary between 15 minutes and 45 minutes.

recorded time intervals

-0:03:23
-0:01:21
-0:02:02
-0:01:58
-0:01:04
-0:03:56
-0:01:13
-0:03:50
-0:01:04
-0:01:02
-0:00:04
-0:01:48
-0:01:05
-0:02:06
-0:00:01
-0:03:55
-0:02:04
-0:00:45

I update the temperature data every minute, and the actual imported timestamps in the Google Drive spreadsheet will sometimes vary by about 1-3 minutes between entries.

Here are the timestamps for my data imports:

2/27/2014 15:04:01
2/27/2014 15:05:53
2/27/2014 15:09:02
2/27/2014 15:10:48
2/27/2014 15:12:51
2/27/2014 15:14:59
2/27/2014 15:15:01
2/27/2014 15:17:55
2/27/2014 15:19:57
2/27/2014 15:20:53
2/27/2014 15:25:01
2/27/2014 15:26:47
2/27/2014 15:27:57
2/27/2014 15:28:01
2/27/2014 15:29:59
2/27/2014 15:30:01
2/27/2014 15:33:56
2/27/2014 15:34:55
2/27/2014 15:37:00
2/27/2014 15:37:01
2/27/2014 15:38:59
2/27/2014 15:39:01
2/27/2014 15:41:00
2/27/2014 15:43:00
2/27/2014 15:44:19
2/27/2014 15:46:47
2/27/2014 15:47:55
2/27/2014 15:49:00
2/27/2014 15:49:01
2/27/2014 15:51:55
2/27/2014 15:52:49
2/27/2014 15:53:53
2/27/2014 15:56:51
2/27/2014 15:59:49
2/27/2014 16:01:50
2/27/2014 16:02:49
2/27/2014 16:03:55
2/27/2014 16:04:51
2/27/2014 16:08:38
2/27/2014 16:09:55
2/27/2014 16:10:49
2/27/2014 16:11:49
2/27/2014 16:12:50
2/27/2014 16:15:51
2/27/2014 16:18:55
2/27/2014 16:21:53
2/27/2014 16:23:00
2/27/2014 16:23:52
2/27/2014 16:27:54
2/27/2014 16:28:55
2/27/2014 16:30:01
2/27/2014 16:30:03
2/27/2014 16:31:49
2/27/2014 16:33:00
2/27/2014 16:33:01
2/27/2014 16:34:52
2/27/2014 16:35:53
2/27/2014 16:38:49
2/27/2014 16:40:56
2/27/2014 16:43:02
2/27/2014 16:44:38
2/27/2014 16:46:49

1 Like

What if I want to post my data to a server (Google Drive) rather than have the server fetch it from my device??? Please reply, I want to do something cool with it…
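One way to do that (a rough sketch, not from this thread) is to deploy a Google Apps Script as a web app and have the Core POST to the web app’s URL. The spreadsheet ID, sheet name, and the “temp” field below are placeholders:

// Hypothetical sketch: deploy via "Publish > Deploy as web app",
// then have the Core POST JSON like {"temp":21.5} to the web app URL
function doPost(e) {
  var sheet = SpreadsheetApp.openById("YOUR_SPREADSHEET_ID").getSheetByName("Sheet1");
  try {
    var data = JSON.parse(e.postData.contents); // request body sent by the Core
    sheet.appendRow([new Date(), data.temp]);   // timestamp + reading
    return ContentService.createTextOutput("ok");
  } catch (err) {
    return ContentService.createTextOutput("error: " + err);
  }
}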

1 Like

OK, I changed it to every 15 minutes, and it does look like I am seeing failures to read sprinkled into the data points.

diff
-1:09:58
-1:30:02
-0:14:45
-0:15:14
-0:15:00
-0:15:01
-1:15:02
-0:14:59
-0:29:57
-0:30:02
-0:30:04
-0:14:57
-0:14:59
-0:15:00
-0:29:58
-0:15:04
-0:14:46
-1:45:12
-0:15:01
-0:15:03
-1:00:00
-0:14:57
-0:30:03

For me it would never import any bad data points; it would just not always import the data into the spreadsheet at the exact interval that I set the Spark Core to send it. It’s not perfect timing-wise, but it’s good enough for logging temperature and humidity.

I would like fewer errors… are these failed-read errors common? Is there any way to have the script notice when it has encountered either an error or a timeout and try again, so I don’t go 3 or 4 cycles with no data?
I am seeing 50% to 70% failure rates on my script executions. I am not really sure what the trouble is, since every time I use my web browser to load the URL I get the correct response from the Spark Core… but the Google script is not having as much luck. When it fails I get a “timeout” maybe 10% of the time, or more commonly (~40-60% of the time) an “unexpected error”.
The log is not very verbose about that; here is an example of what I get in the log:

Details:

Start: 3/29/14 7:04 AM | Function: collectData | Error Message: Timeout: https://api.spark.io/v1/devices/my-dev-id/result?access_token=my-token (line 4, file “Code”) | Trigger: time-based | End: 3/29/14 7:05 AM

Start: 3/29/14 7:14 AM | Function: collectData | Error Message: Unexpected error: https://api.spark.io/v1/devices/my-dev-id/result?access_token=my-token (line 4, file “Code”) | Trigger: time-based | End: 3/29/14 7:15 AM

I blanked out my details in the logs above… the real log does show my actual device and key data, and I am using the script just as posted…
To be clear, this IS working, just not as reliably as I would hope, and I wonder if this is as good as I can expect.

Story of every developer's life. :slight_smile:

Had to upload it as a picture, otherwise half the document disappeared.
Out of error frustration I hacked together this webpage, which works with the Core program up top and pulls data from it quite reliably. I run it from my Dropbox Public folder.
Hope I helped someone out!

While still not as reliable as I would hope (the Google Docs read), I am making this work. Details and ongoing notes can be found here:
https://sites.google.com/site/kzoogarden

Thanks, binaryfrost, for sharing this with us. I am having a blast with this Garden Project of mine.

1 Like

Thanks for posting. This seems to work reasonably well when the script is run manually, but I am getting quite frequent “unexpected error” failures when the script is run on a schedule.

I’m using it for monitoring the temperature of my bedroom, with the intention of improving my sleep.

edit: it seems that if I try again after an “unexpected error” it will actually work, so I have modified the Google Docs script to retry the API call (up to 5 attempts in the code below). This has been running every 10 minutes since 1am today (about 13 hours), and the unexpected errors seem to have stopped happening.

function collectData() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var response = null;
  var url = "https://api.spark.io/v1/devices/ID/result?access_token=TOKEN";
  var success = false;

  // try the fetch up to 5 times before giving up
  for (var attempts = 0; attempts < 5 && !success; attempts++) {
    try {
      response = UrlFetchApp.fetch(url);
      success = true;
    } catch (e) {
      Logger.log(e);
      Utilities.sleep(1000); // short pause before retrying
    }
  }

  if (success) {
    try {
      var parsed = JSON.parse(response.getContentText()); // parse the JSON the Core API created
      var result = unescape(parsed.result); // unescape before parsing the inner JSON

      try {
        var p = JSON.parse(result); // parse the JSON you created
        var d = new Date(); // time stamps are always good when taking readings
        sheet.appendRow([d, p.Temperature, p.Humidity, p.Presence]); // append the date and readings to the sheet
      } catch (e) {
        Logger.log("Unable to do second parse");
      }
    } catch (e) {
      Logger.log("Unable to parse returned JSON");
    }
  }
}

Screenshot of today’s data: the first column is temperature in degrees C, the second is humidity as a percentage, and the final column is a binary flag for whether the room is occupied (0 = empty room, 1 = a person in the room).

4 Likes

This whole thing is very cool. I have been using Google Docs for years, but did not know about the script function. Can’t wait for my Core to arrive to do some cool things. My mind is racing now. Thanks all!

2 Likes

I could do with a little (lot of) help. I’m trying to display data, provided by the Spark Core, on a web page which is accessed by several people.
I think what I need is a server-side script to get the data from the Core and save it in a spreadsheet (or similar), then client-side graphing (Highcharts?) to read in the data from the database.
This seems similar to what is possible with Google Sheets and scripts (as per the example at the start of the page), but that seems very unreliable timing-wise.

Any assistance on the server-side script / Highcharts would be greatly appreciated.
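For what it’s worth, here is an untested sketch of the server-side half, assuming Node.js: a small poller that reads the Core’s "result" variable once a minute and appends it to a data.json file, which a Highcharts (or D3) page could then fetch. The device ID, token, and file name are placeholders.

// Hypothetical sketch (Node.js): poll the Core and log readings to a JSON file
var https = require("https");
var fs = require("fs");

var url = "https://api.spark.io/v1/devices/YOUR_DEVICE_ID/result?access_token=YOUR_ACCESS_TOKEN";

function poll() {
  https.get(url, function (res) {
    var body = "";
    res.on("data", function (chunk) { body += chunk; });
    res.on("end", function () {
      try {
        var result = JSON.parse(body).result; // the Core's "result" variable
        var log = fs.existsSync("data.json")
          ? JSON.parse(fs.readFileSync("data.json", "utf8"))
          : [];
        log.push({ time: Date.now(), result: result });
        fs.writeFileSync("data.json", JSON.stringify(log));
      } catch (e) {
        console.log("Poll failed: " + e);
      }
    });
  }).on("error", function (e) { console.log(e); });
}

setInterval(poll, 60 * 1000); // poll once a minute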