Polling and storing Spark Core data via Batch File

I wanted to put together a small tutorial for people who don’t have in-depth knowledge of how the whole API thing works. Being in that category, I thought it might be nice to share a bit of the knowledge I gained while learning the basics.

This tutorial will show how to set up the core with an irradiance sensor, filter the output, and retrieve and store the result via a batch file.


Tools you will need:

  • a text editor, like Sublime Text
  • the cURL command line tool from curl.haxx.se (I’d recommend using their download wizard to find the right version for you)
  • the JSON command line processor jq

Step 1: Build the Circuit

As far as the hardware bill of materials goes, this one is pretty short. The schematic and the final result can be seen in the image below.

You need:

  • 3 small wires
  • TEMP5700 ambient light sensor ($0.70 at Digikey)
  • 10k Ohm resistor
  • Your core and breadboard + power source

Wire it all up… and you’re done with the hardware!

Step 2: Write the Spark Core code

This is very similar to the example provided by the Spark Core team that allows you to pull temperature data via the API. I’m going to post the code and explain a couple of minor differences.

int irradNumReadings = 30;
int irradiance = 0;
int irradianceArr[30];
int irradArrInd = 0;
int irradTotal = 0;

void setup()
{
  // Expose the irradiance variable to the cloud API and set the sensor pin as an input
  Spark.variable("irradiance", &irradiance, INT);
  pinMode(A7, INPUT);
}

void loop() {
  irradTotal = irradTotal - irradianceArr[irradArrInd];  // subtract the oldest reading
  irradianceArr[irradArrInd] = analogRead(A7);           // read the new value from the sensor
  irradTotal = irradTotal + irradianceArr[irradArrInd];  // add the new reading to the total
  irradArrInd = irradArrInd + 1;                         // advance to the next position in the array

  if (irradArrInd >= irradNumReadings)   // if we're at the end of the array,
    irradArrInd = 0;                     // ...wrap around to the beginning
    
  // calculate the average:
  irradiance = irradTotal / irradNumReadings;      
}

The initial five declarations define some constants and the variable that we eventually want to pull from the device.

The setup() function just tells the Spark Core that the irradiance variable should be defined in a way that allows us to obtain the data via a curl request initiated by our batch file.

The primary difference between the code here and the temperature measurement example is that I’ve used an averaged output. This helps decrease the variance of the output. To do this, every time I record a new data point, I add it into an array (defined in this example to be 30 data points) and drop the oldest record.

I wasn’t able to find a shift-array function, so instead of creating my own and using up a lot of processing power, I just keep track of the index that needs to be replaced: increment it by 1 each time a data point is recorded, and when you hit the end of the array, loop right back to the first index again.

Instead of adding up every element in the array each time, I keep a running total and just add and subtract from that. That should keep the number of operations to a minimum…

The last thing to do was divide the array sum by the number of indices in the array. This will give us the average irradiance output for the last 30 data points. Play with the numbers a bit for your own sensor. You may find that you need more or less, depending on how accurate your components are.


Step 3: Prepare the Batch Programming Environment

I decided to program in the batch file environment because it was simple and didn’t require me to install a compiler or any server software. These files can also be called from other scripts (e.g. MATLAB) to perform some tasks without figuring out how to connect it all up with a database hosted on my own server… or someone else’s.

There are two supplemental items you need to obtain: the cURL command line interface and a JSON command line parser. These are both tiny programs (you only need the *.exe files) that you can drop into the same folder that you’ll be writing the batch file in. On my system, I’m using ‘C:\SparkCore.’ Once you do that, you’re ready to program.
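A quick sanity check that both tools are where the batch file expects them: open a Command Prompt, change into that folder, and ask each program for its version (both support a version flag):

cd C:\SparkCore
curl --version
jq --version

If both print version information, you’re set.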


Step 4: Write the Batch Programs

I broke the program into two components. One that loops (forever) and one that performs the request for data. We’ll start with the data request program first:

@ECHO OFF

REM function call: PullCoreData VARIABLE_NAME

REM This file will access the spark core and download the requested variable
REM   In this case, the variable is defined by the function call
REM   The device ID and the access_token has also been hard coded

REM following code retrieves the data, stores result in a temporary file,
REM   reads the file into a parameter and finally deletes the temporary file.
REM -k ignores the HTTPS security
REM -s runs CURL in silent mode 
REM jq is a JSON parser that can be used to extract the desired data from a response that has the correct format

curl -k -s https://api.spark.io/v1/devices/DEVICE_ID/%1?access_token=ACCESS_CODE | jq .TEMPORARY_allTypes.uint32 > tmpFile
set /p TestOutput= < tmpFile
del tmpFile

echo %date:~-10,10% %time%,%testOutput% >> testOutputFile.txt
ECHO %1 = %TestOutput%

Any line that starts with REM is a remark (think comment), the batch file version of //.

The first command that is executed is @ECHO OFF. By default, a batch file will display every line as it is executed; this command turns that ‘feature’ off.

The next command is the bulk of the program: it uses the cURL command line interface to request the data from the Spark Core.

The first thing to note is that I used a %1 where the variable name would typically be. This is how you use parameters that are passed in via the command line. We would pass the name of the variable into the program by calling it in the following manner:

PullCoreData irradiance

Wherever we see a %1, the batch file sees ‘irradiance.’
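The same mechanism works for additional parameters (%2, %3, and so on). For example, if you didn’t want to hard code the device ID, you could pass it as a second argument. This is just a sketch of the idea; the version above keeps the ID hard coded:

REM call as: PullCoreData irradiance YOUR_DEVICE_ID
curl -k -s https://api.spark.io/v1/devices/%2/%1?access_token=ACCESS_CODE | jq .TEMPORARY_allTypes.uint32 > tmpFile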

There are two options that I have used. There was an issue with a digital signature somewhere, and -k allows the cURL program to ignore it and perform the request anyway. I’m not sure what caused it, but this was a workaround. The -s runs cURL without the progress bar that shows up when the program is called. There are a lot more options available, so feel free to go through the documentation.
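One more option from the cURL documentation that could be handy here, though I haven’t needed it for this tutorial: --max-time makes cURL give up if the cloud doesn’t answer within the given number of seconds, so a hung request can’t stall the whole loop.

curl -k -s --max-time 10 https://api.spark.io/v1/devices/DEVICE_ID/%1?access_token=ACCESS_CODE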

You could store the entire response in a file by executing the command:

curl -k -s https://api.spark.io/v1/devices/DEVICE_ID/%1?access_token=ACCESS_CODE  > tmpFile

If you open the generated file (‘tmpFile’), you’ll see something like:

{
  "cmd": "VarReturn",
  "name": "irradiance",
  "TEMPORARY_allTypes": {
    "string": "\u0000\u0000\u0003O",
    "uint32": 847,
    "number": 847,
    "double": null,
    "raw": "\u0000\u0000\u0003O"
  },
  "result": "\u0000\u0000\u0003O",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-31T18:45:04.302Z",
    "connected": false,
    "deviceID": "YOUR_DEVICE_ID_HERE"
  }
}

We want to continue processing the response using the JSON parser. This is where the remainder of that line comes in:

 curl<...>CODE | jq .TEMPORARY_allTypes.uint32 > tmpFile

This takes the output of the cURL command and pipes it into the jq JSON parser. We only want the value that’s associated with the uint32 data type, and this filter tells jq to extract only that.
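The filter after jq can point at any field in the response. For example, these variations (untested here, but built straight from the structure shown above) would pull the variable name or the connection status instead:

curl -k -s https://api.spark.io/v1/devices/DEVICE_ID/%1?access_token=ACCESS_CODE | jq .name
curl -k -s https://api.spark.io/v1/devices/DEVICE_ID/%1?access_token=ACCESS_CODE | jq .coreInfo.connected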

There is no simple way to assign the output of a command to a variable, so a quick workaround is to save it into a file and then import it into a variable. The > tmpFile tells the batch file to write the result to a file called ‘tmpFile’. The next line, 'set /p TestOutput= < tmpFile', then pulls it back into a variable called ‘TestOutput.’ We can then delete the temporary file.
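As an aside, batch does have a for /f idiom that captures a command’s output directly into a variable, which would let you skip the temporary file entirely. I stuck with the tmpFile approach above, but a sketch of the alternative looks like this (note the pipe has to be escaped with ^ inside the for /f string):

for /f "delims=" %%i in ('curl -k -s https://api.spark.io/v1/devices/DEVICE_ID/%1?access_token=ACCESS_CODE ^| jq .TEMPORARY_allTypes.uint32') do set TestOutput=%%i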

The last two lines write the value to an output log and print the value to the command window. %date% and %time% are internal system variables; we can use them to timestamp our results. The ~-10,10 modifier just takes a subset of the char array output by %date%, defining a start point 10 positions from the end with a length of 10 characters. If you don’t mind having a three-letter day prefix [Mon, Tue, Wed, etc.], you can leave that out.

echo %date:~-10,10% %time%,%testOutput% >> testOutputFile.txt
ECHO %1 = %TestOutput%
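If that substring syntax looks cryptic, the general form is %VARIABLE:~START,LENGTH%, where a negative START counts back from the end of the string. A few examples you can try at the prompt (the exact %date% format depends on your locale; mine includes the day-of-week prefix):

ECHO %date%
ECHO %date:~-10,10%
ECHO %time:~0,8%

The second line prints just the last 10 characters (the numeric date), and the third drops the fractional seconds from %time%.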

If you want to run the batch file, save the file as ‘PullCoreData.bat’, open the Command Prompt, navigate to the folder you saved the file in and type in:

>PullCoreData irradiance

You should see a response that looks like:

irradiance = 820

If you go open the ‘testOutputFile.txt’ file, you should see something like:

12/30/2013 23:05:58.34,820 

Now we can put a wrapper around this and continuously poll the Spark Core for its current irradiance. The wrapper program is pretty simple:

@ECHO OFF
CLS

for /L %%n in (1,0,10) do (
  @ECHO OFF
  PullCoreData irradiance
)

:stop
call :__stop 2>nul

:__stop

REM Loop forever. Stop the loop by closing the window or by pushing "CTRL+C"

The first line turns off displaying all the commands, and the CLS clears the command prompt screen.

We then create a loop that runs forever. It basically says to start at 1 and increment by 0 until you reach 10, which never happens. I’m not sure if you need the :stop or :__stop lines, but it doesn’t seem to do any harm to keep them there. Once you start this program, the only way to stop it is to hit “CTRL+C” or close the window.
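If you don’t need a sample every half second or so, you could slow the loop down with the timeout command (available on Windows Vista and later, as far as I know). A sketch of the wrapper with a roughly 10-second pause between requests:

@ECHO OFF
CLS

for /L %%n in (1,0,10) do (
  PullCoreData irradiance
  timeout /t 10 /nobreak >nul
)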

Using MATLAB, I parsed the output file and plotted the irradiance vs. time. I live in upstate NY, so clear sunny days are few and far between; the output you see here is not far from normal :smile:. Note that the irradiance is unscaled. I need to do a bit more research before I can convert the voltage reading from the Spark Core to the actual irradiance in W/m^2.

Once I get one of those San Diego days… I’ll post that as well. We’ll see how long that takes.


That’s it!

As long as you have the window open, this should sit there… working away while you go about your day. In my experience, the output is effectively updated every 500 ms or so, so you get something along the lines of a 2 Hz sample rate. As long as you don’t have some really fast-moving data, this should work for you. If you need better resolution, you can probably figure something out by encoding multiple data points into a single output variable.

Hope that helps at least one other person out! I by no means proclaim to be an expert in any of these programming languages, so if someone has a suggestion how to do it better, please feel free to chime in!


Very nicely written up @Pat! Thanks for the great example with Batch files! I love me some batch programming every once in a while. Last time I used a batch file was to automate verification of a compiler for Underwriters Laboratories. I’ll have to grab that JSON command line processor for a rainy day.

Everyone is going to feel most comfortable coding in something they use a lot, but even if you’ve never tried it… you should definitely check out http://nodejs.org/ I’ve just about fallen in love with the damn thing. You can run scripts locally, or on servers… or locally and create a server in seconds. Also nice if you know C and/or JavaScript, which is very C-like. You can even run node scripts from batch files :wink:


So… I have to say, that was the first time I wrote my own batch file. I’m more familiar with programming webpages and working with database queries. I have a version of this that saves data to a text file located on a server using a webpage, PHP, and a little AJAX, but that only works while the webpage is open.

I glanced at this node.js thing… it looks pretty impressive; I don’t know why I haven’t heard of it before.

For anyone that is interested in learning the basics, I found a decent tutorial that explains node.js and how to use it at www.thenodebeginner.org. I know I’ll be spending a lot of time there in the near future!


Awexome! I was kind of worried you’d think I was derailing the whole batch thing… so I’m glad you are finding Node interesting. Good luck escaping its love grip! xD


Not at all! You can only use the tools that you know about, so I appreciate the introduction :slight_smile:


I like it!
I’m still figuring out the best way to log data from the core (once every 10 seconds would be enough)…

The original post inspired me to open up Terminal on my Mac for about the first time ever and delve into bash - thanks @Pat - (#stackoverflow has been a godsend too). The following shell script does the job (though it doesn’t add anything to the original, I’m afraid):

#!/bin/bash
# don't forget to chmod a+x shell_file_name

DEVICEID="abc123"
ACCESSCODE="def456"

# user inputs temperature string variable

echo "enter string variable name"
read VARNAME

while true; do
  curl -s https://api.spark.io/v1/devices/$DEVICEID/$VARNAME?access_token=$ACCESSCODE | jq .TEMPORARY_allTypes.string > tmpFile
  read -r TestOutput<tmpFile
  echo $TestOutput
  rm tmpFile
  echo $(date +%D),$(date +%T),$TestOutput >> TemperatureSamples.txt
  sleep 8
done

(As you can see, I am returning the temperature as a string from the core via the API. The sleep is 8 instead of 10 to adjust for curl response time.)

So I’ve got a couple more things…

One, I promised a few more figures showing the irradiance:

Jan 2nd - Another Cloudy Day:

Jan 3rd - Finally, a Sunny Day:

Then a little goodie for any of you MATLAB junkies… :slight_smile: You can access the Spark Core data from a script by calling the function urlread():

% Define Access Parameters:
deviceID = 'ABC123';
accessCode = 'DEF456';
variableToRead = 'irradiance';
url = ['https://api.spark.io/v1/devices/' deviceID '/' variableToRead '?access_token=' accessCode];

% Pull Response from Spark Core API
sparkCoreText = urlread(url);

In order to parse the data, I wrote up a small bit of code that should work for the current incarnation of the cURL response. Warning… it’s not pretty, but it gets the job done:

% Do a bit of syntax switching:
sparkCoreData = strrep(sparkCoreText,'{','');
sparkCoreData = strrep(sparkCoreData,'":',' =');
sparkCoreData = strrep(sparkCoreData,'= "','= ''');
sparkCoreData = strrep(sparkCoreData,'",',''',');
sparkCoreData = strrep(sparkCoreData,' "','');
sparkCoreData = strrep(sparkCoreData,',',';');
sparkCoreData = strrep(sparkCoreData,' ','');
sparkCoreData = strrep(sparkCoreData,'=null;','=NaN;');
  
% Create Structures out of the nested {}'s
tagStart = strfind(sparkCoreData,['=' char(10)])+1;
allNewlines = strfind(sparkCoreData,char(10));
while numel(tagStart)>=1
  startAt = tagStart(1);
  tagStart = allNewlines(find(allNewlines==startAt,1,'first')-1);
  tagEnd = strfind(sparkCoreData,['"' char(10)]);
  tagName = sparkCoreData(tagStart:startAt-2);
  tagSubset = [char(10) sparkCoreData(startAt+1:tagEnd(1)-1) ''';'];
  tagSubset = strrep(tagSubset,char(10),[tagName '.']);
  sparkCoreData = [sparkCoreData(1:tagStart-1) tagSubset sparkCoreData(tagEnd+4:end)];
  
  tagStart = strfind(sparkCoreData,['=' char(10)])+1;
  allNewlines = strfind(sparkCoreData,char(10));
end
sparkCoreData = sparkCoreData(1:end-1);
  
% Clean up the workspace a bit:
accessInfo.sparkCoreText = sparkCoreText;
accessInfo.deviceID = deviceID;
accessInfo.accessCode = accessCode;
accessInfo.variableToRead = variableToRead;
accessInfo.url = url;
clearvars -except accessInfo sparkCoreData
  
% Evaluate the string to obtain the returned results: 
eval(sparkCoreData);
    
% Display the result:
disp([accessInfo.variableToRead ' = ' num2str(TEMPORARY_allTypes.uint32)]);

This doesn’t replace a full MATLAB JSON parser, but it will produce the results we want… at this time :smile:

Calling the script gives us the output:

>> PullCoreData
irradiance = 3949
>> PullCoreData
irradiance = 3944
>> PullCoreData
irradiance = 3945