On the Arduino, the DHT21 and DHT22 are largely compatible and work with each other's code.
I have tried multiple libraries and setups (with both DHT21 and DHT22 libs) but get the same result either way.
Here's the smallest code I was able to reproduce it with.
// This #include statement was automatically added by the Spark IDE.
#include "idDHT22/idDHT22.h"

// Declaration for the DHT22 handler
int idDHT22pin = D2; // digital pin used for communication
void dht22_wrapper(); // must be declared before the lib initialization
int t;

// DHT instantiation
idDHT22 DHT22(idDHT22pin, dht22_wrapper);

void setup()
{
    t = 0;
    Spark.variable("temp", &t, INT);
}

// This wrapper is in charge of calling the ISR callback;
// it must be defined like this for the lib to work
void dht22_wrapper() {
    DHT22.isrCallback();
}

void loop()
{
    DHT22.acquire();
    while (DHT22.acquiring())
        ; // busy-wait until the acquisition completes
    int result = DHT22.getStatus();
    t = DHT22.getCelsius() * 100; // scale to hundredths so the int variable keeps some precision
    delay(2000);
}
@erikbozelie, though the DHT11 and DHT22 are similar, the way their data is read differs. The code for idDHT22 was written specifically for the DHT22 since that was the sensor most people inquired about. I may be able to modify the code to work with a DHT11 and create a new library. I’ll see what I can do.
By the looks of it, I guess you only have to copy/paste it to the web IDE, after which you should be able to flash it. If you then wire the sensor as explained in the post, you should be good to go.
@protagonist, the code was written before we had “tabs” on the IDE, so it is “all inclusive”, meaning everything is in the one file. All you have to do is copy/paste the entire gist into a new app on the IDE. Then change #define DHTTYPE DHT22 to #define DHTTYPE DHT11. The last missing piece is serial output so you can see the readings. Change the setup() and loop() code to the following:
void setup() {
    dht.begin();
    Serial.begin(9600);
}
void loop() {
    delay(2000);
    f = 0;
    h = dht.readHumidity();
    t = dht.readTemperature();
    if (isnan(t)) // NaN never compares equal, so t == NAN would always be false
        Serial.println("Temperature reading is bad");
    else {
        Serial.print("Temperature is ");
        Serial.print(t);
        Serial.println(" C"); // readTemperature() returns Celsius by default
    }
    if (isnan(h))
        Serial.println("Humidity reading is bad");
    else {
        Serial.print("Humidity is ");
        Serial.print(h);
        Serial.println(" %");
    }
}
Then verify the code and flash it over to your core. Assuming your core is connected via USB, open a terminal program, connect to the core’s COM port, and you should see the readings.
Thanks again. So something is happening, but the values are probably wrong: the temperature reads 1103101952 and the humidity 1108606976. Does anyone have an idea what's wrong?
EDIT: Sorry, the problem was that I sent it to a web app and read it as an integer instead of a float. Now I get 25°C and 35% RH. At least the temperature is correct.
Bought new DHT22s, but now there is a new problem.
I am trying to merge two separate sketches and I think I found the culprit.
Uncommenting “UDP Udp;” puts my Spark into a red-LED boot loop.
Any idea why?
Thanks in advance,
Erik
// This #include statement was automatically added by the Spark IDE.
#include "idDHT22/idDHT22.h"

// Declaration for the DHT22 handler
int idDHT22pin = D4; // digital pin used for communication
void dht22_wrapper(); // must be declared before the lib initialization

// DHT instantiation
idDHT22 DHT22(idDHT22pin, dht22_wrapper);

double temp = 0.0;
double hum = 0.0;

TCPServer server = TCPServer(23);
TCPClient client;
//UDP Udp;

void setup()
{
    Spark.variable("temp", &temp, DOUBLE);
    Spark.variable("hum", &hum, DOUBLE);
}

void dht22_wrapper() {
    DHT22.isrCallback();
}

void loop()
{
    DHT22.acquire();
    while (DHT22.acquiring())
        ; // busy-wait until the acquisition completes
    DHT22.getStatus();
    temp = DHT22.getCelsius();
    hum = DHT22.getHumidity();
    delay(2000);
}
@erikbozelie, running both TCPServer and TCPClient will cause an out-of-RAM error (red flashing). Each takes a lot of RAM for buffering. There are lots of ways to NOT use these. Can I ask what you’re trying to do so perhaps I can suggest a different approach?
In short, I am setting up communication with a Java application on the local network.
The Spark broadcasts a UDP packet which Java listens for.
Java then takes the IP from that packet and sets up a TCP connection.
That code works on its own.
However, combining the two sketches produces the red LED.
@erikbozelie, the Spark Team has been working on reducing RAM usage with great success during this sprint. The code is available for local builds but it will be released soon to the web IDE.