Looking through people’s project ideas, it seems likely that quite a few of us will be using multiple Cores to do the same thing: for example, multiple temperature sensors in different locations, all reporting info to a central computer or service.
Do you have any plans for a “mail-merge” style feature in the Spark cloud IDE, where we could deploy the same code to ten different devices with a different ID for each one?
e.g. the Arduino code:
int deviceID = [%DEVICE_ID%];
when deployed to my spark core #001 would become:
int deviceID = 001;
to spark core #002 would become:
int deviceID = 002;
And so on?
Failing that, will there be a way to read a unique identifier from the Core at runtime?