DS18B20 and Particle Core

Hello all,

I have spent some time on troubleshooting a simple code, but so far no luck.
Basically, I have DHT11 and DS18B20 connected to my Core. The DHT11 code works no problem.
When I flash the Core with the following code for DS18B20 :

// This #include statement was automatically added by the Spark IDE.
#include "OneWire/OneWire.h"

OneWire one = OneWire(D5);
uint8_t rom[8];
uint8_t resp[9];

char szInfo[64];

void setup() {
    Spark.variable("tempHotWater", &szInfo, STRING);
}

void loop() {

    // Get the ROM address
    one.read_bytes(rom, 8);
    // Get the temp
    one.write_bytes(rom, 8);
    one.read_bytes(resp, 9);

    byte MSB = resp[1];
    byte LSB = resp[0];

    int16_t intTemp = ((MSB << 8) | LSB);
    float celsius = intTemp / 16.0;
    float fahrenheit = celsius * 9.0 / 5.0 + 32.0;

    sprintf(szInfo, "%2.2f", celsius);

    Spark.publish("tmpinfo", szInfo);

    delay(5000);
}


I cannot read the tempHotWater variable (it gives me a timeout, although I can curl the device), and it half-freezes the Core for further flashing: I cannot flash it any more and have to carry out a hard reset before flashing it with new code.

Any ideas please?

Would you like to have a look at the DS18x20 library on Particle Build and give it a try?

There are also several threads about this sensor; have a look at those too :+1:

I am using both of those sensors on one of my Cores. Use the PietteTech_DHT lib for your DHT sensor and the OneWire lib for your DS18B20. They work great, are non-blocking, and provide good serial output for debugging when you need it.

@ScruffR I have not touched DS18x20 as I only have one DS18B20. Will check it out.
I have browsed pretty much the whole forum and quite a few threads without any success.

@LukeUSMC Thanks - I can see the DS18B20 sensor via serial output, but I cannot read it. DHT is read without any issues.

This is not right and will generate a compile-time error in the future. You do not need an ampersand for char array string-type variables.

@bko - Brian, very much appreciated, but it does not change much, I am afraid.

After deleting the ampersand as per your recommendation, curling the Core gives me:

{
  "id": "****************************",
  "name": "My_Core",
  "connected": true,
  "variables": {
    "tempHotWater": "string"
  },
  "functions": [],
  "cc3000_patch_version": "1.29",
  "product_id": 0,
  "last_heard": "2015-09-24T13:28:28.168Z"
}

Now, when I try to curl tempHotWater variable, I get:

{
  "error": "Timed out."
}

Obviously, the web application to display the temperature does not work.

Is your publish working? I see several “tmpinfo” publish results in the public stream.

Just a side note - I think there is a version 1.32 out already (hope I’m not mistaken again :blush:)

Yes, publish is working. I have just changed “tmpinfo” to “tmpHotWater” (`Spark.publish("tmpHotWater", szInfo);`) :wink: to make it more distinguishable.

Just updated - thanks for the heads-up


Would you share your code here please? I am still struggling to make both sensors work together.

I’ll send you what I have, including my Spark.publish, but I use the libs and it looks like you want to roll your own. Either way, I’m happy to share what I’ve got.

Somewhat out of order, here are the functions I use to pull the data from the sensors using the libs. Keep going and you will see all the pieces you need to use the sensors and publish data in a non-blocking way. I pulled these functions out of a much larger firmware I have been working on, but if you piece it all back together it should run; I haven’t done that myself, but I try to keep functions fairly tight and serving a single purpose. I use Particle Dev exclusively, so I can’t help with lib use in the Web IDE, but here are the links to the libs I am using:
PietteTech_DHT Lib
DS18B20 and OneWire Libs

void getairtemp(){
  if (!bDHTstarted) {               // start the sample
    DHT.acquire();
    bDHTstarted = true;
  }
  if (!DHT.acquiring()) {           // has sample completed?
    int result = DHT.getStatus();
    switch (result) {
      case DHTLIB_OK:
        break;
      case DHTLIB_ERROR_CHECKSUM:
        Serial.println("Error\n\r\tChecksum error");
        break;
      case DHTLIB_ERROR_ISR_TIMEOUT:
        Serial.println("Error\n\r\tISR time out error");
        break;
      case DHTLIB_ERROR_RESPONSE_TIMEOUT:
        Serial.println("Error\n\r\tResponse time out error");
        break;
      case DHTLIB_ERROR_DATA_TIMEOUT:
        Serial.println("Error\n\r\tData time out error");
        break;
      case DHTLIB_ERROR_DELTA:
        Serial.println("Error\n\r\tDelta time too small");
        break;
      case DHTLIB_ERROR_NOTSTARTED:
        Serial.println("Error\n\r\tNot started");
        break;
      default:
        Serial.println("Unknown error");
        break;
    }
    airtemp = DHT.getFahrenheit();
    airhumidity = DHT.getHumidity();
    n++;                            // increment DHT sample counter
    bDHTstarted = false;            // reset the flag so we can take another sample
    DHTnextSampleTime = millis() + DHT_SAMPLE_INTERVAL;  // set the time for the next sample
  }
}

void getwatertemp(){
  watercelsius = ds18b20.getTemperature();
  waterfahrenheit = ds18b20.convertToFahrenheit(watercelsius);
  DS18B20nextSampleTime = millis() + DS18B20_SAMPLE_INTERVAL;
}


#include "PietteTech_DHT.h"
#include "DS18B20.h"
#include "OneWire.h"

Defines and vars:

#define DHTTYPE  DHT11             // Sensor type DHT11/21/22/AM2301/AM2302
#define DHTPIN   3         	    // Digital pin for communications
#define DHT_SAMPLE_INTERVAL   4000  // Sample Air Temp every 4 seconds
#define Metric_Publish_Rate 60000 // Publish once per Minute
#define DS18B20_SAMPLE_INTERVAL 2000 //Sample Water Temp every two seconds
DS18B20 ds18b20 = DS18B20(D2); //Sets Pin D2 for Water Temp Sensor
//DHT Wrapper and Lib instantiate
void dht_wrapper();
PietteTech_DHT DHT(DHTPIN, DHTTYPE, dht_wrapper);
//Global Vars
unsigned int DHTnextSampleTime;	    //Int Var to track last Air Temp/Humidity Sample
bool bDHTstarted;		                //True/False flag to indicate if we have started DHT acquisition
int n;                              //Int Var to track number of DHT11 Samples
unsigned int MetricnextPublishTime; //Int Var to track last Publish of Metrics
unsigned int DS18B20nextSampleTime; //Int Var to track last Water Temp Sample
float airtemp = 0;
float airhumidity = 0;
float watercelsius = 0;             //Float Var for Water Temp in Celsius
float waterfahrenheit = 0;          //Float Var for Water Temp in Fahrenheit
char metricdata[64];                //Char Var for Spark.Publish in metricpush function

Setup() stuff:

  pinMode(D3, INPUT);            // DHT11 sensor
  pinMode(D2, INPUT);            // DS18B20 sensor
  DHTnextSampleTime = 0;         // sample air temp/humidity immediately
  DS18B20nextSampleTime = 0;     // sample water temp immediately
  MetricnextPublishTime = 30000; // begin metric publishing 30 s after the main loop starts

The DHT wrapper; mine sits between setup() and loop(). I'm not sure it matters where it is placed, but I bet there is a wrong place.

// This wrapper is in charge of calling the DHT lib's ISR callback; it must be defined like this for the lib to work
void dht_wrapper() {
    DHT.isrCallback();
}

A function to sample data from the sensors (if the required time has passed, as dictated by the *_SAMPLE_INTERVAL values) and then call another function to publish that data according to Metric_Publish_Rate. NOTE: all time-related values are in milliseconds.

void SampleandPub(){
  if (millis() > DS18B20nextSampleTime){
    getwatertemp();
  }
  if (millis() > DHTnextSampleTime){
    getairtemp();
  }
  if (millis() > MetricnextPublishTime){
    metricpush();
  }
}

Lastly the metricpush() function to send the sensor data to the Particle cloud.

void metricpush(){
  Serial.print("Metric Push!");
  sprintf(metricdata,"Air Temp: %2.2f, Humidity: %2.2f, Water Temperature: %2.2f", airtemp, airhumidity, waterfahrenheit);
  Spark.publish("tempdata", metricdata, PRIVATE);
  MetricnextPublishTime = millis() + Metric_Publish_Rate;
  } //End of metricpush function

Not sure what your level of expertise is, but if you cut and paste everything above as is, you will need to add one more thing in loop() to actually use all of this.

void loop(){
  SampleandPub();
}

Thank you @LukeUSMC!
I have nailed what the problem is.

Thank you @bko for directing me. My code above is working OK, the problem comes from the code to display values:

// Update temperature

setInterval(function() {
  $.get('/get', {command: '/temperature', core: 'sensor_core'}, function(json_data) {
    if (json_data.result) {
      $("#tempHotWater").html("Temperature: " + json_data.result + " °C");
    }
  });
}, 2000);

2000 ms is only 2 seconds, and the request times out.
I tried 10000, but the request still times out from time to time. So the problem is really the connection to the cloud and not the code, as I had thought.

Just tried it from a datacentre with the same result. The problem is that the JavaScript request (or curl request) times out. Any suggestions welcome.

I would try pulling the delay(5000), as that disrupts the connection to the cloud. delay, while, for, or any blocking statements result in your Core losing its connection to the Particle Cloud, and thus your ability to read a Spark.variable or call a Spark.function reliably. Use a millis() + 5000 check to set the publish interval, or something like that, so you don't hammer the cloud with publishing; in general, blocking causes cloud weirdness. Hope that helps!


While it is a good call to avoid delay() where possible and rather go for non-blocking approaches, delay(5000) should not really break the cloud connection, since it implicitly calls Particle.process() regularly (approx. once per second).
But it does add to the latency and hence can interfere with OTA flashing.

Well, here is an education opportunity for me: delay() calls Particle.process() in the background on a Core? I know it is called at the top of the loop, but I didn't know delay() calls it. Given this is such a short loop/firmware, it stands to reason that it spends more time in delay than anything else. So if he is in the delay portion of the loop and a Spark.variable request comes in, will it respond while in delay?

@LukeUSMC, I am having this discussion with @mat and hopefully other Elites. I now believe delay() should be blocking like Arduino in order to force folks to use non-blocking and explicitly coded Particle.process() calls. This would a) make code clearer for debugging (no hidden behavior) and b) lead to better non-blocking coding habits. The only drawback is that so many Arduino libraries use delay() since they have no background process to consider. With multi-threading on the Photon, this will no longer be a concern. :smile: