I know the temperature sensor project idea has been beaten to death, but I did it anyways. In the long term, I’d love to have environmental sensors scattered all around my house and garden(s). But, hey, it’s the shortest day of the year (in the northern hemisphere anyways), and I’ve barely had my Cores for a couple of days. This project is really to just kick the tires and get a feel for the Core.
I basically have a 10K thermistor hooked up with a 10K resistor as a voltage divider. It’s connected to A0 to get the readings. The “Instant” metric is the latest reading from A0 without any processing. The “Average” metric is 50 sample readings averaged together for a smoother trend. I only pull the raw values from the Core since I have JavaScript that can offload some processing to the browser client and save time and space on the Core. I modified the thermistor code from here to perform the calculations in JavaScript. The one modification I had to make was to account for the Core returning a value from 0 to 4095 instead of 0 to 1023 like the Arduino. To account for that, Temp = log(((10240000/RawADC) - 10000)); became temp_k = Math.log(((40960000/temp_raw) - 10000)).
The browser makes two calls every X seconds (currently 3) to get the latest values and plots them using the Highcharts JavaScript library. Here’s a screenshot of what that looks like.
The code running on the core is below. Please note that I am not a seasoned C programmer, but I know just enough to be dangerous. It’s quick and dirty, but most importantly, it works! I’d welcome any feedback on how to improve it and become a better C programmer. I did try to comment everything thoroughly for anyone that might be able to learn from it.
// Something to hold the raw temperature
int temp_raw = 0;
// Variables used for averaging/smoothing
const int num_samples = 50;
int samples[num_samples];
int smp_index;
int smp_total;
int smp_average;
void setup() {
    // Set our input pin
    pinMode(A0, INPUT);
    // Expose variables to the cloud
    Spark.variable("temp_raw", &temp_raw, INT);
    Spark.variable("smp_average", &smp_average, INT);
    // Pre-populate the smoothing array
    for (smp_index = 0; smp_index < num_samples; smp_index++)
        samples[smp_index] = 0;
}
// One of the very few times an infinite loop is encouraged
void loop() {
    // Get the raw reading
    temp_raw = analogRead(A0);
    // Subtract the current index's value from the total
    smp_total -= samples[smp_index];
    // Replace the current index value with the latest reading
    samples[smp_index] = temp_raw;
    // Add the latest reading (now in the current index) to the total
    smp_total += samples[smp_index];
    // Increment the index
    smp_index++;
    // If the index is now past the end of the array, wrap back to the beginning
    if (smp_index >= num_samples)
        smp_index = 0;
    // Calculate the average
    smp_average = smp_total / num_samples;
    // Give the poor core a little chance to catch its breath
    delay(1);
}
I’ve added in a DHT22 temperature + humidity sensor and a one-wire DS18B20 sensor. I have yet to try any capacitors to see if they help stabilize the thermistor readings.
All 3 sensors are being polled in the same sketch. Now all I need to do is buy more sensors and Spark Cores and start putting them into “production” around my house!
I have ported the OneWire DS18B20 sensor example; it’s right here. Unfortunately, for now you have to copy and paste all of that to get it to work. Someone else has taken that and ported the Dallas Temperature library, and that’s here. I haven’t used that yet, but I expect it to work.
I actually used an older port of @tidwelltimj's library. It looks like there have been some changes in the pastebin link. I'll have to check those out!
I have posted the HTML+JavaScript+PHP code I use to graph the sensor data in a GitHub repository at https://github.com/wgbartley/spark-sensor-graph. Take a look at that and see if it helps!
I’m curious: are you compiling/programming the Core locally or using the online build tool? If the build tool, do you have a copy of the sketch you are running on the Core posted anywhere? I’d be interested to take a look at that, too.
I would like to get the average of the last 5 readings, and here is what I came up with:
//changed from an 'int'
double temp1 = 0;
double temp2 = 0;
double temp3 = 0;
double temp4 = 0;
double temp5 = 0;
double avgtemp=0;
void setup()
{
    Spark.variable("temp1", &temp1, DOUBLE);
    Spark.variable("temp2", &temp2, DOUBLE);
    Spark.variable("temp3", &temp3, DOUBLE);
    Spark.variable("temp4", &temp4, DOUBLE);
    Spark.variable("temp5", &temp5, DOUBLE);
    Spark.variable("avgtemp", &avgtemp, DOUBLE);
    pinMode(A0, INPUT);
}
void loop()
{
    temp1 = ((((analogRead(A0) * 3.0) / 4095.0) - 0.5) * 100.0); // first temp value
    delay(1000);
    temp2 = ((((analogRead(A0) * 3.0) / 4095.0) - 0.5) * 100.0); // second temp value
    delay(1000);
    temp3 = ((((analogRead(A0) * 3.0) / 4095.0) - 0.5) * 100.0); // third temp value
    delay(1000);
    temp4 = ((((analogRead(A0) * 3.0) / 4095.0) - 0.5) * 100.0); // fourth temp value
    delay(1000);
    temp5 = ((((analogRead(A0) * 3.0) / 4095.0) - 0.5) * 100.0); // fifth temp value
    delay(1000);
    avgtemp = (temp1 + temp2 + temp3 + temp4 + temp5) / 5; // average of previous 5 readings
}
Using Postman to query "avgtemp", I am getting this as an example result:
{
    "cmd": "VarReturn",
    "name": "avgtemp",
    "result": 18.90842490842491,
    "coreInfo": {
        "last_app": "",
        "last_heard": "2014-02-21T13:44:17.486Z",
        "connected": true,
        "deviceID": "MyID"
    }
}
This could be the correct result, but I do not have the individual temp values (temp1 through temp5) to verify it against.
Questions:
A - does my script make sense?
B - is there a way to get all the other measurements (temp1 through temp5) from a GET request in order to verify the result?
C - how do I truncate the result to 2 decimals, e.g. 18.90?