Uploading Spark data to Xively

I’ve been sending data to Xively for almost 24 hours now, but I had to manually reset the core 4 times because the CC3000 can’t automatically reconnect when the WiFi signal is lost and then reappears, or when the data connection drops while the WiFi stays up — either way I get the flashing cyan LED and a lockup.

For an application where you only want to send sensor data to Xively, it seems the only way to maintain a somewhat reliable connection is to cycle the Spark Core’s power OFF and back ON every 5 or 10 minutes. That forces the Core to successfully reconnect to the WiFi and the Cloud, clearing any freeze that might have happened during that period.

It would be nice if we could have one of the digital pins on the Spark Core trigger a reset, but it seems that when the core freezes it stops running the loop, which would also prevent the pin from ever triggering the reset.

Any ideas?

I did run it for over 24 h without issues, and then it stopped again… I think another way to do this (that I need to try) is to not have the Spark push data, but to have a web server somewhere pull data from the Spark on a regular basis (cron job) and send that data to Xively… It’s ugly, but I wonder if that would make the Spark more stable.

I think it’s the CC3000 WiFi chip that is causing the lockups, which can only be resolved with a hard reset.

For us, the only solution is to find a way to automatically reset the Spark Core, internally or externally, to keep a good connection to the internet. My Spark Core always connects successfully after a hard reset.

The real solution is fixing the issues with the CC3000 chip, which it seems everybody is having problems with, even Adafruit.

Check out this code that triggers the Core’s microprocessor to reset every 10 minutes. I am running it now and it does reset/restart the core every 10 minutes. I’m not sure if it will work during a cyan lockup or a flashing green lockup, but that’s what I’m trying to figure out now.

I just changed the name of the Xively feed from bedroom_temp to Sensor_Temp to start a new graph to track this experiment.

Here is the link to the code, try it and report back if you can: https://community.spark.io/t/can-the-spark-core-trigger-the-reset-pin/2693/5
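For anyone who just wants the shape of the idea without opening the link: here’s a minimal sketch of the every-10-minutes reset check. The names (resetDue, RESET_INTERVAL_MS) are mine, not from the linked code; in a real sketch you’d call resetDue(millis()) from loop() and, when it returns true, trigger the reset the way the linked thread does (driving the RST line from a digital output).

```cpp
#include <stdint.h>

// Assumed name/value: reset the Core every 10 minutes of uptime.
const uint32_t RESET_INTERVAL_MS = 10UL * 60UL * 1000UL;

// Returns true once the uptime (e.g. millis()) passes the reset interval.
// The actual reset mechanism (RST pin, NVIC reset) is left to the caller.
bool resetDue(uint32_t uptimeMs) {
    return uptimeMs >= RESET_INTERVAL_MS;
}
```

This is just the decision logic; whether it helps during a cyan or green lockup depends on whether loop() is still running at all, as noted above.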

Thanks - I’m running this too, but the temperature is way too high despite following the correct formula. It should be around 20 C, but it’s coming out at 61 C!

Do you think it’s a Xively conversion issue? I’m just forwarding raw ADC data to Xively with no sensor hooked up, so I have no idea what’s going to show up on Xively once I do start sending sensor data.

Quick question… Which temperature sensor was used in this example? Thanks!

I had no sensor connected. Just testing the wifi reliability.

Hey guys, my core has been up for 1.5+ days measuring temp without losing its connection. The only change from above is from “LastCloudCheck > 1000605” to “LastCloudCheck > 1000603”.
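On those magic-number timing comparisons: a more robust pattern is to compare elapsed time with unsigned subtraction rather than comparing millis() against absolute values, since millis() wraps after about 49 days. A minimal sketch (intervalElapsed is my name, not from the code above):

```cpp
#include <stdint.h>

// Rollover-safe "has the interval elapsed?" check. millis() on the Core is a
// 32-bit counter, so the subtraction wraps correctly at 2^32.
bool intervalElapsed(uint32_t nowMs, uint32_t lastMs, uint32_t intervalMs) {
    return (uint32_t)(nowMs - lastMs) >= intervalMs;
}
```

In loop() you’d use something like `if (intervalElapsed(millis(), lastCloudCheck, 60000)) { lastCloudCheck = millis(); /* send data */ }` instead of a hard-coded threshold.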


The sensor I’m using is the TMP36GT9Z

Hi all,

I’m trying to use the code above to send temperatures from my Spark Core to Xively but I’m not seeing any PUTs in Xively. I can trigger a PUT using cURL but not from the Spark Core. I’ve copied and pasted the code above and replaced the feed id, api key and datastream id with my channel name, but nothing shows up in Xively. So… I have a couple of questions:

  1. How can I see the Serial.println() values? I have CoolTerm installed and my Spark Core is connected via USB but I can’t see any serial ports called usb*…? What am I doing wrong?
  2. Is there any other way to debug the TCPClient connections supposedly being made by the Spark Core to Xively?

Any help would be much appreciated… This is driving me somewhat crazy! :smile:


Hi Codefrenzy,
Perhaps the code in this post can help you further. After a lot of trying, my core is now working perfectly with Xively.

Thanks @Wildfire, but that’s pretty much identical to what I have… :frowning:

Do you know where I could read the Serial.print messages? If I could do that I might be able to debug this a little better…


I’ve made a small platform named OpenSensorCloud ( http://www.opensensorcloud.com ), one of the reasons being my frustration at not being able to post to Xively even though curl was working fine.
So use it if you want, and contact me directly if you have any issues; I can send some code that I use to update the platform.

The API is quite simple, and you can put the ApiKey either in the header or in the URL, which is sometimes easier.

@tomsoft I would love to see some code to use your platform. This is exactly what I need to do some sensor logging. It looks like Xively is a pain to use.

Hi @codefrenzy,

If your core is connected to your computer via the USB cable you can monitor the serial messages easily! :slight_smile: If you’re on Windows, you’ll need the drivers ( http://docs.spark.io/connect/#connecting-your-core-connect-over-usb ), but on Mac and Linux it’ll work out of the box.

The Spark-CLI has an easy-to-use serial monitor built in ( https://github.com/spark/spark-cli ); you can just type spark serial monitor


I’ve put a sample here:

You just need to register, and get your API_KEY (in your profile page) and your Device Id (in your device page)

@tomsoft awesome, thank you!

Hi @tomsoft!

Thanks for responding. I actually managed to get it working after figuring out how to see the serial debug messages. I might try to modularize and generalize my code like you have so I can use it with a variety of sensors.

Thanks anyway!


Hi @Dave!

Thanks for that information. I’m using a Mac and after asking about how to see serial messages in another thread, I was pointed at ‘screen’ in terminal which works a treat. I’ll see if I can get the Spark-CLI working as well as that looks really useful too.

