Quick question… Which temperature sensor was used in this example? Thanks!
I had no sensor connected. Just testing the wifi reliability.
Hey guys, my core has been up for 1.5 days+ measuring temp without losing its connection. The only change from above is from “LastCloudCheck > 1000605” to “LastCloudCheck > 1000603”.
The sensor I’m using is the TMP36GT9Z
I’m trying to use the code above to send temperatures from my Spark Core to Xively but I’m not seeing any PUTs in Xively. I can trigger a PUT using cURL but not from the Spark Core. I’ve copied and pasted the code above and replaced the feed id, api key and datastream id with my channel name, but nothing shows up in Xively. So… I have a couple of questions:
- How can I see the Serial.println() values? I have CoolTerm installed and my Spark Core is connected via USB but I can’t see any serial ports called usb*…? What am I doing wrong?
- Is there any other way to debug the TCPClient connections supposedly being made by the Spark Core to Xively?
Any help would be much appreciated… This is driving me somewhat crazy!
Perhaps the code in this post can help you further. After a lot of trying, my core is now working perfectly with Xively.
Thanks @Wildfire, but that’s pretty much identical to what I have…
Do you know where I could read the Serial.print messages? If I could do that I might be able to debug this a little better…
I’ve made a small platform named OpenSensorCloud ( http://www.opensensorcloud.com ), one of the reasons being my frustration at not being able to post to Xively even though curl was working fine.
So use it if you want, and contact me directly if you have any issue, I can send some code that I use to update the platform.
The API is quite simple, and you can put the ApiKey either in the header or in the URL, which is sometimes easier.
@tomsoft I would love to see some code to use your platform. This is exactly what I need to do some sensor logging. It looks like Xively is a pain to use.
If your core is connected to your computer via the USB cable you can monitor the serial messages easily! If you’re on Windows, you’ll need the drivers ( http://docs.spark.io/connect/#connecting-your-core-connect-over-usb ), but on Mac and Linux it’ll work out of the box.
The Spark-CLI has an easy to use serial monitor built in ( https://github.com/spark/spark-cli ), you can just type
spark serial monitor
I’ve put a sample here:
You just need to register, and get your API_KEY (in your profile page) and your Device Id (in your device page)
@tomsoft awesome, thank you!
Thanks for responding. I actually managed to get it working after figuring out how to see the serial debug messages. I might try to modularize and generalize my code like you have so I can use it with a variety of sensors.
Thanks for that information. I’m using a Mac and after asking about how to see serial messages in another thread, I was pointed at ‘screen’ in terminal which works a treat. I’ll see if I can get the Spark-CLI working as well as that looks really useful too.
Just an FYI: this code is great, but Xively won’t report data accurately unless you alter it slightly.
That is too fast for Xively to properly graph the data. So your method will report the data to Xively, but it is too fast to graph. For those who want to use Xively’s graphing tools, change 1000 to 5000. Perhaps I am the only one who uses graphing in Xively, but I wanted to share in case it might help another.
You guys should check out www.Ubidots.com, it’s way better than Xively. I’ve tried both.
Check out this thread where I used it recently http://community.spark.io/t/melexis-contact-less-infrared-sensor-code-port/7091/6
Thanks @rwb for sharing I am definitely going to check it out. Looks like they have a far superior data layout.
@Herner here is the setup guide for the Spark Core + Ubidots. It’s what I used to get started.
You might find this example useful as well: Logging temperature and humidity using the Spark Core + Ubidots.
Feel free to ping us if you have any questions, or create a topic in our community portal.
This is an old discussion, but it may still be a good resource for people, so I am commenting. The part of the code that checks the current millis() against LastUpTime should fail when millis() exceeds the maximum value that can be stored as an unsigned long. This is a big number, but it is really 4294967295 milliseconds, which equals less than 50 days. On the Arduino you can use a cast like this to avoid the rollover problem:
if ((unsigned long)(currentMillis - previousMillis) >= interval)
But I am not sure if this syntax also works with Spark.IO…!