As I look to deploy Cores to the field, I am starting to think about deployment issues:
- Which version of the Spark firmware is a device running?
- Which PCB revision is it using?
- Other metadata
Is there a mechanism for burning this into permanent memory as I set up each device, so it is not overwritten when I flash the Core with an update later? It could then be read into a Spark.variable in setup() so a specific device can be queried via the Cloud API.