I understand that each Spark Core has a unique ID, and I can use that to identify it. But if I am using the Core in a device, how can I distinguish the Cores inside my devices from other Cores? I could register a Spark.variable with a device/company-specific name and query that, but I think another Core could easily fake it. Just wondering how others do it.
If you use Spark.deviceID(), it will give you the long ID of your device; however, there doesn't seem to be any easy way to get the name you gave your device.
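For reference, here is a minimal sketch that prints that ID over USB serial. Spark.deviceID() returns the ID as a String, so it can be printed directly:

```cpp
void setup() {
    Serial.begin(9600);
    // Spark.deviceID() returns the device's unique 24-character hex ID
    Serial.print("Device ID: ");
    Serial.println(Spark.deviceID());
}

void loop() {
}
```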
I am envisioning that the client software would already have the device ID, which is returned by GET /v1/devices when you first make a connection. I was thinking more about a device-specific layer of identification on top. I guess a secret string accessible via a Spark.variable would do the job.
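A minimal sketch of that idea; the variable name and secret value below are placeholders. It has the weakness mentioned above: anyone who extracts the string can expose the same variable from their own Core, so it raises the bar without providing real authentication:

```cpp
// Placeholder secret baked into the firmware at build time
char productSecret[] = "my-company-secret";

void setup() {
    // Exposes the secret at GET /v1/devices/{DEVICE_ID}/secret
    Spark.variable("secret", productSecret, STRING);
}

void loop() {
}
```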
There’s a new API endpoint that lets you pass a product ID for the device you are talking to.
This is designed to help product creators and comes with other cool features as well. It’s pretty new, but @Dave will be able to share more information.
You can always email sales@spark.io and talk to them about your requirements.
Good questions! This fits into our product creator flow! When you create a product with Spark, we give you a product_id that you can bake into your firmware, which lets you send out automatic updates and manage your products. We’re also building a suite of fleet management tools to help you manage your products and user experience. A lot of these features are being designed and built now, and we’ll be announcing a lot more about them over the coming weeks and months.
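A sketch of what baking the product_id in can look like, assuming the PRODUCT_ID()/PRODUCT_VERSION() macros from later firmware releases; the numbers below are placeholders, not real IDs:

```cpp
// Hypothetical product registration: 1234 stands in for the
// product_id you receive when creating a product, and 1 is the
// firmware version used for targeting automatic updates.
PRODUCT_ID(1234);
PRODUCT_VERSION(1);

void setup() {
}

void loop() {
}
```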
The feature I added recently was for products/projects based on Spark that are open to all. It’s for creators who want to let someone opt into their product ecosystem using a normal Core/Photon, and it can be a way to support build-at-home projects.
We’re still working on these features and documentation for them, but if you’re interested in creating a product with Spark, please reach out to Dan at sales@spark.io, and he can help get you the resources you want.