Hi All!
I am trying to get data from my two Spark Cores through the Cloud, and I know I can do this by running the command twice.
Any good ideas would be appreciated.
Thanks!
I don’t know of a direct way to do that, but you could easily build something like this by using Spark.publish() and Spark.subscribe() in combination with a Spark.function().
If it were possible to do a Spark.publish() from any device other than a Core, you could even do it without Spark.function().
Have both Cores subscribe to an event like ‘TellMe’, and whenever one Core’s Spark.function() is called, it publishes this event to notify all Cores to publish their information.
I hope you get the picture of what I mean.
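Just to illustrate the idea, a rough sketch of what could run on BOTH Cores is below. The event name "TellMe", the function name "tellme", and the analogRead payload are all made-up placeholders, and I haven't flashed this myself, so treat it as a starting point rather than tested firmware:

```cpp
int triggerAll(String args);   // Spark.function() handler the cloud can call
void onTellMe(const char *event, const char *data);

void setup() {
    // Expose a cloud-callable function on each Core, so the trigger
    // can be fired from the CLI or the REST API against either one.
    Spark.function("tellme", triggerAll);
    // Both Cores listen for the shared trigger event.
    Spark.subscribe("TellMe", onTellMe);
}

void loop() {
}

// Called via the cloud on ONE Core; publishing the event notifies
// every subscribed Core.
int triggerAll(String args) {
    Spark.publish("TellMe");
    return 0;
}

// Each subscribed Core answers by publishing its own data
// (here just a dummy analog reading).
void onTellMe(const char *event, const char *data) {
    Spark.publish("MyData", String(analogRead(A0)));
}
```

Keep in mind publishes are rate-limited, so don't fire the trigger in a tight loop.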
@abue172, I believe (though I don’t have two cores to try it on :-() that by using the command line and just requesting a variable that both cores have, without naming a core, it will fetch from all available cores. If you are instead trying to send data to a core, however, I do believe you will need a function call for each core. Alternatively, you could set up an outside (bash, python, etc.) script to redirect a single command to multiple cores.
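As a sketch of that last suggestion, a small script can fan one command out to several Cores via the Cloud REST API. The core IDs, function name, and access token below are all placeholders you'd substitute with your own:

```python
# Build one Spark Cloud function-call request per Core.
# API_BASE matches the Spark Cloud REST endpoint; everything else
# (core IDs, "tellme", the token) is a placeholder.
API_BASE = "https://api.spark.io/v1/devices"

def build_function_calls(core_ids, function_name, access_token):
    """Return a (url, payload) pair per Core for a Spark.function() call."""
    calls = []
    for core_id in core_ids:
        url = "{}/{}/{}".format(API_BASE, core_id, function_name)
        payload = {"access_token": access_token, "args": ""}
        calls.append((url, payload))
    return calls

if __name__ == "__main__":
    # With the 'requests' library you would then POST each pair, e.g.
    #   requests.post(url, data=payload)
    for url, payload in build_function_calls(
            ["core_one_id", "core_two_id"], "tellme", "YOUR_TOKEN"):
        print(url)
```

The same loop works for reading a variable from each Core; you'd just switch to a GET against `/v1/devices/<id>/<variable>`.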
Thank you @mumblepins @ScruffR!!!
I will try both methods and will update with the results as soon as I can.