Hey there! We (@julianozen and I) are working on a hackathon project that uses two Spark Cores working together to perform a set of functions. Specifically, each core has different outputs that need to be coordinated, so ideally an abstract action sequence would look something like:
["SparkCoreA: set AnalogOut2 HIGH", "SparkCoreB: set AnalogOut6 HIGH", "SparkCoreA: set AnalogOut4 LOW", ...etc]
Is this doable in a single Spark App? Is there a way to specify which core you are giving an instruction to?
I looked into the publish and subscribe API functions, and I think we could set up some sort of interaction between the cores to coordinate who handles which instruction. But it seems like that could end up being more complicated than we need.
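For what it's worth, here's a rough Python model of the coordination logic we had in mind — this is *not* Spark firmware, just a toy simulation, and all the names (`EventBus`, `Core`, the step/done event shapes) are made up for illustration:

```python
# Toy model of publish/subscribe hand-off between two cores.
# EventBus stands in for the Spark cloud event channel; on real
# hardware each core would subscribe to a named event instead.

class EventBus:
    def __init__(self):
        self.handlers = []

    def subscribe(self, handler):
        self.handlers.append(handler)

    def publish(self, event, data):
        for handler in list(self.handlers):
            handler(event, data)

class Core:
    def __init__(self, name, bus):
        self.name = name
        self.bus = bus
        self.log = []  # pins this core actually set
        bus.subscribe(self.on_event)

    def on_event(self, event, data):
        # Only act on steps addressed to this core.
        if event == "step" and data["core"] == self.name:
            self.log.append((data["pin"], data["level"]))
            # Announce completion so the next step can be issued.
            self.bus.publish("done", {"core": self.name})

def run_sequence(bus, sequence):
    # In this synchronous toy model publish() returns only after the
    # handler has run, so iterating is enough; on real hardware you
    # would wait for the "done" event before publishing the next step.
    for step in sequence:
        bus.publish("step", step)

bus = EventBus()
core_a = Core("A", bus)
core_b = Core("B", bus)

sequence = [
    {"core": "A", "pin": "A2", "level": "HIGH"},
    {"core": "B", "pin": "A6", "level": "HIGH"},
    {"core": "A", "pin": "A4", "level": "LOW"},
]
run_sequence(bus, sequence)

print(core_a.log)  # steps handled by core A only
print(core_b.log)  # steps handled by core B only
```

The point of the model is that each core just filters the shared event stream by its own name, so the sequence itself lives in one place.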
Alternatively, I guess we could set up a separate Spark App for each core, with one acting as the master and the other as the slave; they could then coordinate via HTTP requests. The potential problems I see with that are high latency, plus it being more complicated than just controlling a specific core from a single app in charge of both.
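If we went the master route, I believe the master could address a specific core through the Spark cloud's documented `POST /v1/devices/:deviceId/:functionName` endpoint. A sketch of building such a call (the device ID, function name, and token below are placeholders, and nothing is actually sent):

```python
# Build the URL and form payload for calling a Spark.function() on a
# specific core via the Spark cloud REST API. All concrete values here
# (device ID, function name, token) are placeholders for illustration.

def build_spark_call(device_id, function_name, arg, access_token):
    url = "https://api.spark.io/v1/devices/%s/%s" % (device_id, function_name)
    payload = {"access_token": access_token, "args": arg}
    return url, payload

# The master would POST one call per step in the action sequence,
# picking the device ID that matches the core named in that step.
url, payload = build_spark_call("53ff6f0650", "setPin", "A6,HIGH", "token123")
print(url)
```

Each POST is a full round trip through the cloud, which is where the latency concern comes from.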
Thanks much! Also, thanks @avidan for the help thus far!