node-red-contrib-particle is a fork of @kvarma's original work on node-red-contrib-sparkcore.
I've modified the code to provide an alternate, 3-node setup that uses configuration nodes. This gives clear visual feedback of your nodes / Particle devices in node-RED:
The use of configuration nodes means multiple instances of the nodes can tap into the same cloud configuration set, saving time when building your node-RED flows:
I have tested this on an RPi local cloud, and barring the current issues the local spark-server has, these nodes should allow you to subscribe to SSEs, call functions and retrieve variables. If you test these against the Particle.io cloud, please provide feedback here on whether they work (or not). Time delays might have to be increased manually when connecting to Particle.io, so YMMV.
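For context, here's a rough sketch (not the nodes' actual implementation) of the plain Particle cloud REST calls that the SSE / function / variable nodes correspond to. The access token, device ID, and the `temperature` / `led` names below are placeholders; if you're on a local spark-server, swap `api.particle.io` for your RPi's address (and the `http` module / port 8080, typically) instead of the assumptions used here.

```javascript
// Sketch of the raw Particle cloud REST calls these nodes wrap.
// Assumes the standard Particle Cloud API endpoints; adjust host/port
// (and use the http module) if you are running a local spark-server.
const https = require('https');

const PARTICLE_HOST = 'api.particle.io';
const ACCESS_TOKEN  = 'YOUR_ACCESS_TOKEN';   // placeholder
const DEVICE_ID     = 'YOUR_DEVICE_ID';      // placeholder

// Retrieve a cloud variable: GET /v1/devices/:deviceId/:varName
function getVariable(varName) {
  https.get({
    host: PARTICLE_HOST,
    path: `/v1/devices/${DEVICE_ID}/${varName}?access_token=${ACCESS_TOKEN}`,
  }, (res) => {
    let body = '';
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => console.log('variable:', JSON.parse(body).result));
  });
}

// Call a cloud function: POST /v1/devices/:deviceId/:funcName
function callFunction(funcName, args) {
  const payload = `access_token=${ACCESS_TOKEN}&args=${encodeURIComponent(args)}`;
  const req = https.request({
    host: PARTICLE_HOST,
    path: `/v1/devices/${DEVICE_ID}/${funcName}`,
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  }, (res) => {
    let body = '';
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => console.log('function returned:', JSON.parse(body).return_value));
  });
  req.end(payload);
}

// Subscribe to SSEs: GET /v1/devices/:deviceId/events (text/event-stream)
function subscribeSSE() {
  https.get({
    host: PARTICLE_HOST,
    path: `/v1/devices/${DEVICE_ID}/events?access_token=${ACCESS_TOKEN}`,
    headers: { Accept: 'text/event-stream' },
  }, (res) => {
    res.setEncoding('utf8');
    res.on('data', (chunk) => process.stdout.write(chunk)); // raw SSE frames
  });
}

getVariable('temperature');   // hypothetical variable name
callFunction('led', 'on');    // hypothetical function name and argument
subscribeSSE();
```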
Here's the original discussion related to using Spark Cores w/node-RED: