Strategies for identifying a Spark Core used in a device

I understand that each Spark Core has a UID, and I can use that to identify it uniquely. But if I am using the Core inside a product, how can I distinguish the Cores in my devices from other Cores? I could expose a Spark.variable with a device/company-specific name and query that, but I think another Core could easily fake it. Just wondering how others do it.
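To make the idea concrete, here is roughly what I mean (a minimal sketch; the variable name and value are made up, and as noted, any other Core could publish the same variable):

```cpp
// Expose a company-specific identifier as a cloud variable.
// "acme_id" and its value are hypothetical examples.
char companyId[] = "ACME-WIDGET-V1";

void setup() {
    // Readable via GET /v1/devices/{DEVICE_ID}/acme_id
    Spark.variable("acme_id", companyId, STRING);
}

void loop() {
}
```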

Thanks

If you use Spark.deviceID(), it will give you the long ID of your device; however, there does not seem to be any easy way to get the name you gave your device.
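For example, a minimal sketch that prints the ID over serial (assuming the standard Spark.deviceID() call, which returns the ID as a String):

```cpp
void setup() {
    Serial.begin(9600);
}

void loop() {
    // Prints the 24-character device ID, e.g. over USB serial.
    Serial.println(Spark.deviceID());
    delay(5000);
}
```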

regards
kevin

I am envisioning that the client software would already have the device ID, which is returned by GET /v1/devices when you first make a connection. I was thinking more about a device-specific layer of identification on top of that. I guess a secret string accessible via a Spark.variable would do the job.
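For the checking side, something like this plain C++ sketch is what I have in mind (names and IDs are hypothetical; in practice the client would get the device ID from GET /v1/devices and the secret by querying the Spark.variable endpoint):

```cpp
#include <map>
#include <string>

// Hypothetical client-side check: pair each known device ID with the
// secret its firmware exposes via a Spark.variable, and require both
// to match before treating a Core as one of ours.
bool isOurDevice(const std::map<std::string, std::string>& knownDevices,
                 const std::string& deviceId,
                 const std::string& reportedSecret) {
    auto it = knownDevices.find(deviceId);
    return it != knownDevices.end() && it->second == reportedSecret;
}
```

Of course, anyone who can read the variable can also replay the secret, so this only raises the bar rather than providing real security.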

@electronut,

There’s a new API endpoint upgrade that allows passing a product ID of some sort to identify the device you are talking about.

This is designed to help product creators and comes with other cool features as well. It’s pretty new but @Dave will be able to share more information.

You can always email sales@spark.io and talk to them about your requirement :wink:


Thanks. Could you clarify what you mean by “API endpoint upgrade”? If I use the cloud compiler, am I pulling in these updates?

It has nothing to do with the firmware compiler, I guess.

It’s something to do with more parameters in the API. We’ll need to wait for more details :smiley:

Hey All,

Good questions! This fits into our product creator flow. When you create a product with Spark, we give you a product_id that you can bake into your firmware, which lets you send out automatic updates and manage your products. We’re also building a suite of fleet management tools to help you manage your products and user experience. A lot of these features are being built and designed now, and we’ll be announcing much more over the coming weeks and months. :slight_smile:
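As a rough sketch of what baking a product_id into firmware could look like (the macro names here follow the later Particle firmware convention and are an assumption on my part; 1234 is a placeholder, not a real product ID):

```cpp
// Hypothetical: embed the product ID from the dashboard into the firmware
// so the cloud can match this device to its product and push updates.
PRODUCT_ID(1234);     // placeholder product_id from the product creator flow
PRODUCT_VERSION(1);   // firmware version used for automatic update targeting

void setup() {
}

void loop() {
}
```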

The feature I added recently was for products/projects based on Spark that are open to all. This is for creators who want to let someone opt into their product ecosystem using a normal Core/Photon, and it can be a way to support build-at-home projects. :slight_smile:

Thanks!
David


@Dave

This is really good news for the future!
Have you considered the option of firmware version management for (semi-)automatic updates?

Thanks

Claudio

Hi @duffo64,

Thanks! Yes, some of the first features in fleet management will help you manage the firmware deployed to your products! :slight_smile:

Thanks,
David

Where can I find information about this feature?

Thanks

Hi @electronut,

We’re still working on these features and documentation for them, but if you’re interested in creating a product with Spark, please reach out to Dan at sales@spark.io, and he can help get you the resources you want. :slight_smile:

Thanks,
David
