Is there any way I can create a BLE network amongst an array of Argon peripherals so that I can receive data from all of them at once? I would like to use BLE as a fail-safe in case one or more of the devices is disconnected from WiFi, or in case the WiFi dies completely, so that the devices can still serve their functions.
There is no BLE Mesh support and you (currently) can only have three concurrent BLE connections.
But you can have a central scanning for BLE advertising packets - either specific alert packets sent when a connection is lost and/or heartbeat packets to tell whether all devices are still alive.
But then you could also go with a WiFi heartbeat monitoring scheme, as BLE scanning is notoriously slow and would probably keep the central Argon occupied with just that.
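For what it's worth, here is a sketch of what such an alert/heartbeat advertisement could carry. This is a plain C++ model of the parsing the central would do on the custom data pulled out of each scan result; the payload layout (company ID, device ID, status, counter) is entirely hypothetical, not anything Device OS defines:

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical heartbeat payload each Argon would append to its
// advertisement as manufacturer-specific data:
//   bytes 0-1: company ID (0xFFFF, reserved for internal use)
//   byte  2  : device ID
//   byte  3  : status flag (0 = OK, 1 = "lost WiFi" alert)
//   byte  4  : rolling counter, so the central can spot repeats
struct Heartbeat {
    uint8_t deviceId;
    uint8_t status;
    uint8_t counter;
};

// Returns true if the raw custom-data bytes look like one of our
// heartbeat packets and fills `out`.
bool parseHeartbeat(const uint8_t* data, size_t len, Heartbeat& out) {
    if (len != 5 || data[0] != 0xFF || data[1] != 0xFF) {
        return false;  // wrong length or not our company ID
    }
    out.deviceId = data[2];
    out.status   = data[3];
    out.counter  = data[4];
    return true;
}
```

On the Argon side you would build the same five bytes with `BleAdvertisingData::appendCustomData()` and broadcast them with `BLE.advertise()`; the central feeds the custom data from each scan result into a parser like this.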
There is a library written by a Particle Solution Architect that provides for 1 central and 3 peripheral BLE devices in a group. It uses BLE to support a pub/sub relationship between devices. I have been able to build a BLE network (this is not BLE Mesh) using this library and it works pretty well. I am not going any further with this, however, because the limitation of 3 peripheral devices (due to memory) just would not support what I wanted to do - I discussed this with Particle, and the limit is a Device OS compile-time setting that they don't want to expand at the cost of available memory. It is, however, a much simpler and hence more predictable network topology than mesh, which can become very difficult with large numbers of end-node devices. The link setup time is very quick, although the bandwidth is only 55 KB/s maximum with the most optimal message size. The topic link is here: Library for creating a local group of devices using BLE
For my project, I’m essentially setting up an Argon in each classroom of a school campus to collect data, which is sent to the cloud and picked up by a Node-RED flow that uses the data to perform specific functions. If the WiFi goes down, I still want the Argons to be able to relay this data locally, so that users can still collect it and act from there.
So, if I were to somehow set up someone’s smartphone as a central device that scanned for advertising packets (using Node-RED), would they be able to pick up data from any Argons in range so that data could gradually be collected from all devices as the user walked towards each one?
Also, can you enlighten me on what a WiFi heartbeat monitoring scheme is? I’m still pretty new to this stuff!
This sounds like a great solution to my problem, but the bandwidth limit and memory issue are some pretty big downsides. Thank you for suggesting this to me! I’ll definitely try it out.
Edit: I went through the thread that you linked, and it was extremely interesting to read about the potential for more BLE connections. The topic of ‘sleeping’ Argons could potentially help my case, but it was abandoned early on in the thread. If you can, could you explain this concept to me? Is it possible that a central device could have more than 3 connections if the peripherals only wake up when they need to transmit data? Or am I understanding this incorrectly?
That should be doable (although I can't vouch for Node-RED).
You'd have all your devices send out a short message via Particle.publish() at regular intervals and have some server that subscribes to all of these event streams and monitors whether it actually receives each of the individual "heartbeats" at the expected rate.
Once you notice one of the devices has stopped checking in, you can issue a notice from that server to have that device resuscitated.
Yeah... apparently I can’t use Node-RED to do that unless I create a complete app that can take advantage of the Bluetooth capabilities of both iOS and Android.
Since two peripherals can’t communicate with each other without a central device as a medium, what if every other Argon was a central device and the rest were peripherals? Would it be possible to create a daisy chain of sorts, in which a central device pushes its data to a peripheral, that peripheral pushes it to the next central, and so on until the device at the end takes each piece of data and stores it in an array? Then each device would be able to transfer its data to the last device in single-file fashion, although it would be pretty slow and inefficient...
Also, how does an iBeacon work? Since it uses BLE, does it have the same range as if I were to just use regular BLE functionality? Or would I be able to ping an iPhone to go to a specific device, in case there was some critical data that needed to be tended to?
Also, the data I’m sending is either a single string or an ID number. It shouldn’t be too big to be sent, right?
I’m really just spitballing ideas here, because if the WiFi goes down, my project is pretty much useless. I just want to make sure it can still transfer the data it’s collecting in some way, so that it can still function.
A lot of things are thinkable and also doable; the question is whether they are worth the effort. With the fragmented view of your use-case I have collected, I’d say the daisy chaining is not something that’s worth the effort.
With my limited understanding of your use-case I’d pay more attention to outage notification and recovery than data transport.
You may also want to picture what “outage cases” you want to address, how likely they are, and which is the most practical approach to tackle them all (or most of them). Once you have that, you can estimate how much effort you want to put into tackling the individual scenarios.
One thing I could imagine worth considering may be a way to allow for neighbouring devices to remote-reset a “lost” device via BLE.
About the iBeacon range question: There is no difference, as an iBeacon does not use a different radio than BLE; it’s just a defined kind of packet, still transmitted via BLE.
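To illustrate that point: an iBeacon is nothing but a fixed 25-byte manufacturer-specific data field inside an ordinary BLE advertisement. A sketch that packs that layout (the byte layout is Apple's published iBeacon format; the function name is made up):

```cpp
#include <cstdint>
#include <vector>

// iBeacon manufacturer-specific data, 25 bytes total:
//   0-1  : Apple's company ID, 0x004C, little-endian
//   2    : iBeacon type (0x02)
//   3    : remaining length (0x15 = 21)
//   4-19 : 16-byte proximity UUID
//   20-21: major (big-endian)
//   22-23: minor (big-endian)
//   24   : measured power at 1 m (signed dBm)
std::vector<uint8_t> makeIBeacon(const uint8_t uuid[16],
                                 uint16_t major, uint16_t minor,
                                 int8_t measuredPower) {
    std::vector<uint8_t> p = {0x4C, 0x00, 0x02, 0x15};
    p.insert(p.end(), uuid, uuid + 16);
    p.push_back(uint8_t(major >> 8));
    p.push_back(uint8_t(major & 0xFF));
    p.push_back(uint8_t(minor >> 8));
    p.push_back(uint8_t(minor & 0xFF));
    p.push_back(uint8_t(measuredPower));
    return p;
}
```

Since it's a normal advertisement, it goes out at the same power over the same radio as any other BLE packet, hence the same range.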
The central BLE device can only manage 3 peripheral devices in a group. There is a concept of group ID - I don’t think the central device can handle more than 1 group at a time.
I would second what @ScruffR has said about the architecture - best to use WiFi and have a publish queue to ensure that any events generated while offline get stored and sent when back online. I have used SD card memory, retained RAM, and FRAM for this. Look for the PublishQueueAsyncRK library.
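To illustrate the idea (this is not the library's actual API - just a toy in-RAM model of the store-and-forward behavior; the real library additionally persists the backlog to retained RAM, FRAM, or SD so it survives a reset):

```cpp
#include <cstddef>
#include <deque>
#include <string>
#include <utility>
#include <vector>

// Minimal model of an offline publish queue: events are queued while
// offline and drained oldest-first once connectivity returns.
class PublishQueue {
public:
    // Queue an event; it is sent immediately if online, stored otherwise.
    void publish(const std::string& name, const std::string& data) {
        queue_.push_back({name, data});
        if (online_) flush();
    }

    // Call when connectivity changes; reconnecting drains the backlog.
    void setOnline(bool online) {
        online_ = online;
        if (online_) flush();
    }

    // Events actually delivered, oldest first.
    const std::vector<std::pair<std::string, std::string>>& sent() const {
        return sent_;
    }
    size_t pending() const { return queue_.size(); }

private:
    void flush() {
        while (!queue_.empty()) {
            sent_.push_back(queue_.front());  // stand-in for Particle.publish()
            queue_.pop_front();
        }
    }
    bool online_ = false;
    std::deque<std::pair<std::string, std::string>> queue_;
    std::vector<std::pair<std::string, std::string>> sent_;
};
```

Since your events are just a short string or an ID number, even a long outage would fit easily in RAM; persistence only matters if the device can reset while offline.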