OK, after understanding that the Spark cloud is not meant for production units to use for customer data, I wanted some suggestions on methods for reading from and writing to the Core. I will use JSON as the medium for communication between the Core and an HTML5 website or smartphone app.
I'm assuming one method is to use another cloud, like Google's, to send and receive data.
Or have a webpage that the Core can talk to directly.
What are the pros and cons of each?
And how, in general, do you implement these methods in code on the Core and the webpage?
On the contrary! We hope people will use the cloud for production, controlling your devices, relaying information, flashing updates, and much much more!
Certain things might not always be a great fit for the cloud, however: streaming music, sending many hundreds or thousands of requests a minute, or very-low-latency local communication (like remote-controlling a car). You can still use the Core/Photon for these, but sometimes it's best to work over the local network, or through your own server as well as the Cloud.
You can consume our RESTful API easily with JavaScript ( https://github.com/spark/sparkjs , http://docs.spark.io/api/ ), or any other language that can make web requests and parse JSON. You can also publish small JSON strings from your Core/Photon. My hope is that we can raise the publish limit to something much larger, to make it easier to throw JSON around.
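As a rough sketch of what consuming that API looks like from plain JavaScript (the device ID, access token, and `temperature` variable name below are placeholders, not values from this thread):

```javascript
// Sketch: reading a cloud variable over the REST API and parsing the JSON
// reply. DEVICE_ID / ACCESS_TOKEN come from your account; "temperature" is
// a hypothetical variable name your firmware would register.

// Build the REST URL for reading a cloud variable (endpoint shape per
// http://docs.spark.io/api/).
function variableUrl(deviceId, variableName, accessToken) {
  return 'https://api.spark.io/v1/devices/' + deviceId +
         '/' + variableName + '?access_token=' + accessToken;
}

// Pull the value out of the JSON body the cloud returns, e.g.
// {"cmd":"VarReturn","name":"temperature","result":72,...}
function parseVariableReply(jsonText) {
  return JSON.parse(jsonText).result;
}

// Usage (browser, or Node with a fetch implementation):
// fetch(variableUrl('0123456789abcdef', 'temperature', 'my_token'))
//   .then(function (res) { return res.text(); })
//   .then(function (body) { console.log(parseVariableReply(body)); });
```

The sparkjs library linked above wraps these calls for you; the raw-URL version is just to show there's nothing exotic underneath.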
If you want to send lots of data quickly, or make lots of requests in a very short period of time, I would recommend using TCP / UDP sockets with whatever service you want to consume.
If you design a Core product and don't want to use the Core token to communicate with the Core, you can't use the Spark Cloud for customers' data. However, I do understand the Spark Cloud can be used to flash firmware and do some management. But if I plan to sell thousands of Cores in my products, I, the designer, should keep control over the Cores and retain the tokens, updating them when they expire.
Good question! No, in this case you can use OAuth 2.0 and our fleet-management system to manage your products. You get your data, your customers get the access they need for the product, and you don't have to manage a boatload of access_tokens.
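For a feel of what the OAuth 2.0 flow looks like from an app, here is a sketch of building a standard password-grant token request. The field names follow the OAuth 2.0 spec; check http://docs.spark.io/api/ for the exact endpoint and client values your product should use, since those details are assumptions here:

```javascript
// Sketch: form-encode an OAuth 2.0 password-grant request body, so a
// product app logs in with the customer's credentials instead of shipping
// a hard-coded access_token.
function tokenRequestBody(username, password) {
  return 'grant_type=password' +
         '&username=' + encodeURIComponent(username) +
         '&password=' + encodeURIComponent(password);
}

// Usage (endpoint per the API docs; hypothetical credentials):
// fetch('https://api.spark.io/oauth/token', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
//   body: tokenRequestBody('user@example.com', 'secret')
// }).then(function (res) { return res.json(); })
//   .then(function (tok) { /* tok.access_token scoped to this customer */ });
```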
We try to be very open and flexible, and we're still writing documentation on a lot of this. In the meantime, if you email sales@spark.io they can help answer more product-creator questions like these.