If I have a local Spark server (running on a Raspberry Pi) and three Spark Cores (each controlling a light), what is the easiest way to send a general turn-OFF variable from one Spark Core (no. 1) to the other two Spark Cores (nos. 2 and 3)?
Do I use a Spark.variable on the local cloud, or do I use serial communication to send the variable?
I can't find any sample code for sending a variable/command to other Spark Cores.
Note: assume I don't have Internet access, but all Spark Cores are programmed with a token code.
If you have a local cloud and three Spark Cores hooked up to lights, you have a few options for the light switches:
Put some buttons on the digital IOs on each Spark Core and have them:
a. Shut themselves off in local code.
b. Make a REST call to your local cloud to shut off any other lights.
If you want to go mobile, write an app for your tablet, PC, etc., connected to your local private network, that makes the REST calls to your Cores.
Maybe I can use one Spark Core to save the states of all the other Spark Cores on the same network, so they can then be read?
@cloris
My idea is: Spark Core no. 1 has a digital input; when it is open, it needs to send a general turn-OFF.
Spark Cores no. 2 and no. 3 receive the variable "0" and open their relays.
So all Spark Cores need to communicate with each other.
Is it possible to do this with a local Spark cloud?
Yes. You will just have to hardcode the device IDs into the firmware for all three devices. When the button is pushed on one device, it will make a Cloud Code API call to all three devices, hitting an exposed function that changes the relay state.
So in a nutshell:
Expose a function called SetRelay that takes a 1 or 0. This will set your digital output to high or low, turning the light on or off.
Hook a button up to an interrupt on a port. When the button is pushed, it will toggle the state of a local variable that tracks whether the lights are on or off. After toggling the state, it will push the new state with three distinct Cloud Code calls, one per device.
@jlamfaria, what @cloris describes is one great way of doing it. You can also take a more server-centric approach, where the server polls a Spark.variable on each node and, based on the states, triggers a Spark.function() on each node according to the desired activity. This approach is considered "synchronous" since a change on a node is not "seen" until the server polls it. @cloris's approach is considered "asynchronous" since the changing node notifies the server via a REST call immediately. It just depends on how you want to implement it.
@peekay123 Good call on the server-centric approach. That way the devices don't need to know about each other, and the server will keep track of them. @jlamfaria If you can wait until they implement subscribe on the local cloud, your life will be immensely simplified. Post some code in a bit and let us know how it is going.
Has Spark.subscribe() been implemented for the local cloud yet? If not, could someone walk me through, or point me to an example of, how to use Spark.variable() and Spark.function() to store data from the Cores to a file directory on my local cloud/server?