Spark to Spark communication for simple digital input-output transmission

Hi all, first post here in the forum! I’m looking to use Spark for a fairly simple project, which essentially involves using one Spark to take two mechanical relay outputs and transmit them over Wi-Fi to another Spark, which will then provide outputs that I can use to operate the end device via a relay shield. It’s very much like a wireless switch, but with an analog switch on the input side.

This requires that the two Spark units be able to address each other via Wi-Fi. I understand that they need to be on my Wi-Fi network, which is fine, but I’m not sure how Spark would handle a connection between two units like this. For example, I’ve seen that it’s not super simple to give a Spark its own static IP address, so is there a host name or some other network identification method I could use in the code to tell the Spark on the input side to send its information to the Spark on the output side? (This will be purely one-way communication for now at least.) Do I need to use MAC addresses?

I’ve done some coding before, but I’m definitely more of a hardware guy, and I haven’t really done any microcontroller programming of this type, so I’d appreciate some guidance. Thanks!

Hi @dbsoundman

I would look at Spark.publish() and Spark.subscribe(). The doc is here:

http://docs.spark.io/firmware/#data-and-control-spark-publish

You will get near-real-time control, since the conversation goes from core 1 up to the cloud and then from the cloud back down to core 2.

There is a small bug with Spark.subscribe that has been fixed in the source code but is not in production yet: currently you receive every event published with the name you subscribe to, not just your own devices’ events or a specific core’s events. The fix is coming to the production server very soon.
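To make this concrete, here’s a minimal sketch of the publish/subscribe approach for your two-relay case. The pin choices, the event name "relay-state", and the encoding of both inputs into a single digit are all assumptions for illustration, not anything prescribed by the docs. On the input-side core:

```cpp
// Sender core: watch two relay contact inputs and publish changes
// as a private cloud event. Pins and event name are illustrative.
int relayIn1 = D0;
int relayIn2 = D1;
int lastState = -1;

void setup() {
    pinMode(relayIn1, INPUT_PULLDOWN);
    pinMode(relayIn2, INPUT_PULLDOWN);
}

void loop() {
    // Pack both inputs into one value, 0..3.
    int state = (digitalRead(relayIn1) << 1) | digitalRead(relayIn2);
    if (state != lastState) {
        Spark.publish("relay-state", String(state), 60, PRIVATE);
        lastState = state;
    }
    delay(100);
}
```

And on the output-side core, subscribe with MY_DEVICES so (once the fix mentioned above is live) you only see your own devices’ events:

```cpp
// Receiver core: drive the relay shield from the sender's events.
int relayOut1 = D0;
int relayOut2 = D1;

void handler(const char *event, const char *data) {
    int state = atoi(data);  // recover the packed 0..3 value
    digitalWrite(relayOut1, (state & 0x2) ? HIGH : LOW);
    digitalWrite(relayOut2, (state & 0x1) ? HIGH : LOW);
}

void setup() {
    pinMode(relayOut1, OUTPUT);
    pinMode(relayOut2, OUTPUT);
    Spark.subscribe("relay-state", handler, MY_DEVICES);
}

void loop() {
}
```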

Thanks for the response! My only concern is in this application I would like it to work whether or not I have internet access, so I’d like it to operate locally instead of via the cloud. Is this possible?

Use UDP datagrams.

There are currently issues with UDP on the Spark, but it’s usable for your purpose. I would send one-byte datagrams to get around some of the problems, maybe ‘1’ for on and ‘0’ for off. One-way comms? I would also keep re-sending the desired status, since you cannot rely on a UDP datagram being received; that isn’t a Spark problem, it’s inherent to UDP.
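A rough sketch of that scheme, for a single channel. The receiver’s IP address (which you’d pin down with a DHCP reservation on your router, given that static IPs on the Spark are awkward) and the port numbers are assumptions. On the sending core:

```cpp
// Sender: repeatedly transmit a one-byte datagram ('1' or '0')
// with the desired output state. Re-sending covers lost packets.
UDP udp;
IPAddress remoteIP(192, 168, 1, 50);  // hypothetical receiver address
const int remotePort = 8888;
int inputPin = D0;

void setup() {
    pinMode(inputPin, INPUT_PULLDOWN);
    udp.begin(8887);  // local port; any free port works for sending
}

void loop() {
    unsigned char state = (digitalRead(inputPin) == HIGH) ? '1' : '0';
    udp.beginPacket(remoteIP, remotePort);
    udp.write(state);
    udp.endPacket();
    delay(200);  // keep re-sending the desired status
}
```

And on the receiving core:

```cpp
// Receiver: listen for one-byte datagrams and drive the relay pin.
UDP udp;
int relayPin = D0;

void setup() {
    pinMode(relayPin, OUTPUT);
    udp.begin(8888);  // must match the sender's remote port
}

void loop() {
    if (udp.parsePacket() > 0) {
        char c = udp.read();
        if (c == '1') digitalWrite(relayPin, HIGH);
        else if (c == '0') digitalWrite(relayPin, LOW);
        udp.flush();  // discard any extra bytes in the datagram
    }
}
```

Keeping the payload to a single byte sidesteps the datagram-boundary problems discussed in the thread linked below.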

Pending documentation updates, see https://community.spark.io/t/udp-received-dgram-boundaries-lost-read-parsepacket-available-all-broken/3800 for a discussion of the current UDP issues.