Verified:
Fast Read Example (Digital Input): https://gist.github.com/technobly/8573877
Sorry for bumping such an old topic, but I am stuck on making it work on my Spark Core, so could you please help me here?
The module I used is the Embedded Adventures SHT25 breakout, bought from here: http://www.embeddedadventures.com/sht25_humidity_temperature_sensor_MOD-1018.html
I wired the component as shown in the picture below.
And then I uploaded your code to the core, but Serial seems to print nothing, and the variables temp2 and humid cannot be read from the cloud. I tried to print some messages over Serial, but they failed to display as well. What should I do to resolve the problem?
Thanks much in advance!
Hi @nhancaheo
For I2C devices, the Spark Core requires pull-up resistors on SCL and SDA to pull them up to 3.3V. A value of 4.7 kohm is ideal, but anything from about 1 kohm to 10 kohm will work.
Thanks bko for looking into this!
The module itself has two 10 kohm pull-up resistors on the SCL and SDA pins. Does it require additional external resistors?
The module datasheet can be found here if you want to read it: http://www.embeddedadventures.com/datasheets/MOD-1018_hw_v1_doc_v1.pdf
If they are already onboard, you do not need to add external ones.
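One quick way to confirm the wiring and pull-ups is an I2C bus scan: probe every valid 7-bit address and see whether anything ACKs (the SHT25 should answer at 0x40). Below is a host-testable sketch of just the scan loop; the `probe` callback is an assumption standing in for a wrapper around Wire.beginTransmission()/Wire.endTransmission() on the Core:

```cpp
#include <cstdint>
#include <functional>
#include <vector>

// Scan the valid 7-bit I2C address range (0x08..0x77) and collect every
// address whose probe reports an ACK. On the Spark Core, `probe` would
// wrap Wire.beginTransmission(addr) followed by checking that
// Wire.endTransmission() returned success; that wrapper is assumed here
// so the loop itself can be exercised on a host.
std::vector<uint8_t> scanI2C(const std::function<bool(uint8_t)>& probe) {
    std::vector<uint8_t> found;
    for (uint8_t addr = 0x08; addr <= 0x77; ++addr) {
        if (probe(addr)) {
            found.push_back(addr);
        }
    }
    return found;
}
```

If a scan like this never sees 0x40, the problem is wiring or pull-ups rather than the sensor-reading code.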
Okay, then! I need to figure out why it doesn't work, since this is the first time I've used I2C on the Spark.
I love your local cloud tutorial. Do you have any idea why Serial doesn't work in the latest revision of core-firmware posted on GitHub? For the same code, the Serial port doesn't appear in my device list when compiling with core-firmware, but it does when I use spark.io/build or spark compile. Does that mean a library is missing somewhere?
Thanks kennethlimcp!
Serial works and has always worked. Haha! I use it all the time.
You might need to provide more information for us to take a look.
As long as you have Serial.begin(baud_rate), the core should be listed under COM ports.
Not sure why it doesn't work on my machine.
Here is my application code, which works when built in the cloud but not with a local core-firmware make clean all. Interestingly, it worked on a build of core-firmware from two months ago.
uint32_t counter = 0;

void setup()
{
  Serial.begin(9600);          // Open serial over USB.
  while (!Serial.available()); // Wait until we receive a character; don't wait
                               // more than 15 seconds or the core will choke.
  pinMode(D7, OUTPUT);
}

void loop()
{
  Serial.print("Spark Core says Hello over USB! ");
  Serial.println(counter++);
  digitalWrite(D7, HIGH);
  delay(500);
  digitalWrite(D7, LOW);
  delay(500);
}
Where is this placed? In application.cpp?
Can you look for the line that says SYSTEM_MODE(AUTOMATIC); and try SYSTEM_MODE(SEMI_AUTOMATIC); instead?
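For context, SEMI_AUTOMATIC mode stops the firmware from blocking on the cloud connection at boot, which helps isolate whether the cloud handshake is what's preventing Serial from coming up. A minimal firmware-side sketch of what that might look like (a configuration fragment for the old Spark Core API, not host-runnable):

```cpp
// Spark Core firmware configuration fragment (Wiring).
SYSTEM_MODE(SEMI_AUTOMATIC);    // don't connect to the cloud at boot

void setup() {
    Serial.begin(9600);         // USB serial should enumerate regardless
    // Spark.connect();         // uncomment to join the cloud manually later
}

void loop() {
}
```

If Serial appears with this mode but not with AUTOMATIC, the problem is in the cloud connection path rather than in the USB serial driver.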
Yes, it's application.cpp.
I added SYSTEM_MODE(SEMI_AUTOMATIC); at the top of the file, but got the same result.
I already tried on another machine, but no luck!
That's weird. You might want to update all three repos and test, just to be sure.
I'll test it later when I'm home.
Indeed, I cloned all three repos (core-firmware, core-common-lib, and core-communication-lib) for testing.
I'll leave it with you; I can live with the old build, and I also need to move on to the sensor part.
My appreciation for your support!
Hi @nhancaheo
I see that this module requires either I2C clock stretching or depends on the read returning an ACK or NACK to indicate whether the read succeeded.
Which method are you trying to use?
The Spark has hardware I2C, and I don't think clock stretching is in the firmware yet, so you should use the ACK/NACK method.
What software are you using to talk to the module? The built-in Wire class?
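Once raw reads do come back, the SHT2x-family datasheet gives the conversion formulas (T = -46.85 + 175.72 * S_T / 2^16 and RH = -6 + 125 * S_RH / 2^16, with the two low status bits masked off) and a CRC-8 with polynomial 0x31 over the data bytes. A host-side C++ sketch of that math, independent of the I2C layer, can be used to sanity-check readings:

```cpp
#include <cstddef>
#include <cstdint>

// Mask off the two status bits in the low end of each raw 16-bit reading,
// as the SHT2x datasheet requires before conversion.
static uint16_t maskStatusBits(uint16_t raw) { return raw & ~0x0003; }

// Datasheet conversion: T[degC] = -46.85 + 175.72 * S_T / 2^16
double sht2xTemperature(uint16_t rawT) {
    return -46.85 + 175.72 * maskStatusBits(rawT) / 65536.0;
}

// Datasheet conversion: RH[%] = -6 + 125 * S_RH / 2^16
double sht2xHumidity(uint16_t rawRH) {
    return -6.0 + 125.0 * maskStatusBits(rawRH) / 65536.0;
}

// CRC-8 as used by the SHT2x family: polynomial x^8 + x^5 + x^4 + 1 (0x31),
// initial value 0x00, no final XOR. The sensor appends this checksum byte
// after the two data bytes of each measurement.
uint8_t sht2xCrc8(const uint8_t* data, size_t len) {
    uint8_t crc = 0x00;
    for (size_t i = 0; i < len; ++i) {
        crc ^= data[i];
        for (int bit = 0; bit < 8; ++bit) {
            crc = (crc & 0x80) ? (uint8_t)((crc << 1) ^ 0x31)
                               : (uint8_t)(crc << 1);
        }
    }
    return crc;
}
```

If the checksum of the two data bytes doesn't match the third byte the sensor sends, the read was corrupted and should be retried.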
Hi bko,
How can I color your name as you did?
I'm not sure which method is used. I flashed the code posted by PaulRB at https://gist.github.com/PaulRB/8372742
I looked at his wiring and did the same, but something seems broken somewhere.
Luckily, I got my sensor working on an Arduino at 3.3 V, which proves that the sensor itself is not damaged.