Spark.publish vs HTTP POST for intermittent data transfers

Hi, I’m running my Core off solar and batteries and periodically transferring data to and from my REST API in the Azure cloud.

My Core mostly runs with WiFi turned off to keep power usage low. If its sensors detect an event, or a certain time has elapsed, it turns WiFi on and HTTP POSTs some sensor values to my WebApi REST service in Azure, then receives some configuration data in the HTTP response, like “keep WiFi on so I can OTA flash some new firmware”.

I’m currently using the HTTPClient library on the Core, but I’m finding it cumbersome and unreliable. I wonder if it might be more elegant to rewrite my Spark app using Spark.publish() and Spark.function(), and use SignalR on the server to subscribe to the updates, but will this work on a Core that spends most of its time disconnected from the Spark Cloud?
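
For reference, my current transfer routine looks roughly like this (hostname, path, payload, and the "keepAlive" flag are placeholders, and I've stripped out the sensor reading and sleep code):

```cpp
#include "HttpClient.h"

HttpClient http;
http_request_t request;
http_response_t response;

http_header_t headers[] = {
    { "Content-Type", "application/json" },
    { "Accept", "application/json" },
    { NULL, NULL }  // headers must be NULL-terminated
};

void setup() {
    request.hostname = "myapi.azurewebsites.net";  // placeholder for my Azure WebApi
    request.port = 80;
    request.path = "/api/readings";
}

void loop() {
    // ... wake on a sensor event or timer and turn WiFi on ...

    request.body = "{\"temp\":22.5,\"batt\":3.92}";  // placeholder sensor payload
    http.post(request, response, headers);

    // The response body carries configuration, e.g. "keep WiFi on for an OTA flash".
    if (response.status == 200 && response.body.indexOf("keepAlive") >= 0) {
        // stay connected so the cloud can flash new firmware
    } else {
        // ... turn WiFi back off and return to low-power waiting ...
    }

    delay(60000);  // stand-in for the low-power wait between transfers
}
```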

For example, if I connect to the Spark Cloud, Spark.publish() some values, and turn WiFi back off a few seconds later, will this reset the web socket each time? Would the server have to constantly poll, trying to subscribe to the Core and waiting for it to come online?

The same goes for sending data to my Core: can my cloud server call a Spark.function() while my Core is offline and expect the Core to process the request automatically the next time it connects to the cloud?
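
To make the question concrete, the rewrite I'm imagining looks something like this (the event and function names are made up, and I'm assuming SEMI_AUTOMATIC mode so the Core doesn't auto-connect at boot):

```cpp
SYSTEM_MODE(SEMI_AUTOMATIC);  // don't connect to the Spark Cloud automatically at boot

// Function the server would call (via the Spark Cloud) to push configuration back down.
int setConfig(String args) {
    // e.g. args == "keepWiFiOn" -> stay connected for an OTA flash
    return 0;
}

void setup() {
    Spark.function("setConfig", setConfig);
}

void loop() {
    // ... wait in low power until a sensor event or the report interval elapses ...

    WiFi.on();
    Spark.connect();
    while (!Spark.connected()) {
        Spark.process();  // service the cloud connection while we wait
        delay(100);
    }

    Spark.publish("sensor-data", "{\"temp\":22.5}", 60, PRIVATE);

    // Give the event a few seconds to go out before dropping the link.
    for (int i = 0; i < 50; i++) {
        Spark.process();
        delay(100);
    }

    Spark.disconnect();
    WiFi.off();

    delay(60000);  // stand-in for deep sleep between transfers
}
```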

I'm pretty sure that publishing from the Core just means you're saying the Core has data or a function that can be reached via the REST API, not that you're actually pushing that data to the Spark Cloud. So I believe the Core/Photon/Electron must be online and connected to the cloud for the REST API to actually access that data.

Yep, I think so — I'd expect that connection to be torn down and set up again each time.

I don't think so — I don't believe function calls are queued up for when the Core comes back online.

Your current design of publishing to another location then taking the Core offline is exactly how I would do it. If Azure pricing is too steep for you, you could use data.sparkfun.com.

Perhaps it’s best to have @Dave comment on this, since he’s also responsible for the API. He should be able to shed some more light on how it all works behind the scenes.

Thanks @naikrovek, your description makes sense and clears things up a lot for me. I think you saved me days of experimenting and finding out the hard way.

I checked out data.sparkfun.com and it looks cool, but it doesn't look like it will support my long-term goals of Apple push notifications and automated emails. I'm glad to know it's there though, so thanks for the heads-up!

I'll persevere with HTTPClient for now. @Dave, please chime in if you think we've got it all wrong.

[quote=“megabyte, post:4, topic:10281”]I checked out data.sparkfun.com and it looks cool but it doesn’t look like it will support my long term goals of apple push notifications and automated emails.
[/quote]

True fact.

Heya @megabyte,

Sorry about the slow reply, this has been hanging out in my inbox for a while now! :slight_smile: I think I was waiting to reply because I wanted to suggest you could use Spark.publish and webhooks, in case you hadn't seen these yet:
http://docs.spark.io/webhooks/
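
Roughly speaking, the firmware side then shrinks to a publish while the Core is connected, and the webhook does the HTTP POST to your Azure endpoint for you, so the Core never needs HTTPClient or your server's hostname (the event name and payload here are just examples):

```cpp
void setup() {
}

void loop() {
    // A webhook registered for "sensor-data" (see the docs above) forwards
    // this payload to your Azure endpoint on the cloud side.
    Spark.publish("sensor-data", "{\"temp\":22.5}", 60, PRIVATE);
    delay(60000);
}
```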

Thanks!
David