Spark.getVariable vs Spark.subscribe


#1

Hi all

I’m building a web app and am interested in hearing what people’s experiences have been with using a GET (polling) approach versus an event listener, from the perspective of the web app.
I started off using Spark.getVariable from the JS library (I know this should probably read Particle.getVariable; I’m not sure whether the old name has been deprecated yet).
Anyway, I’m using the JS setTimeout function to call the getVariable routine on a bunch of variables roughly every 10 seconds. I call setTimeout again at the end of the GET routine so that I’m not potentially queuing up requests if for any reason it takes longer than 10 seconds to fetch all my variables.
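A minimal sketch of that pattern, where `fetchAllVariables` is a hypothetical placeholder for whatever wraps your Spark.getVariable calls into one promise. The next setTimeout is only scheduled after the current cycle settles, and a `Promise.race` timeout keeps one hung GET from stalling everything:

```javascript
// Reject if `promise` hasn't settled within `ms` milliseconds, so a
// hung GET can't block the next polling cycle indefinitely.
function withTimeout(promise, ms) {
  return Promise.race([
    promise,
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error('timeout')), ms)
    ),
  ]);
}

// One polling cycle: fetch everything, hand the result (or the error)
// to the page. `fetchAllVariables` is assumed to return a promise.
function pollOnce(fetchAllVariables, intervalMs, onData, onError) {
  return withTimeout(fetchAllVariables(), intervalMs).then(onData, onError);
}

// Usage sketch (not started here) — reschedule only after settling,
// so requests never queue up behind a slow one:
// function startPolling() {
//   pollOnce(fetchAllVariables, 10000, render, showError)
//     .then(() => setTimeout(startPolling, 10000));
// }
```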
This all works pretty well until I take the device offline. When that happens, I would ideally like the web app to handle things gracefully and alert the user that the device has gone offline as soon as device.connected is able to tell that it has.
However, right now, as soon as the device goes offline I get a whole lot of queued-up GET requests, which can take quite a while before they return a 408.
For some reason, after numerous 408s I don’t think the Spark.listDevices call works anymore, which pretty much means that my web page grinds to a halt after I take a device offline.
I had considered switching to the ‘push’ methodology of just using spark.subscribe (and Particle.publish on the device). However, it seems a bit of a waste to be pushing data onto the internet when the web page may only be used once a week, for example.

Any ideas would be great :smile:


#2

Although it’s a bit redundant, you could first make a listDevices call to check whether your device is online before you make a GET for the variables. Unfortunately, it can take up to a minute before the cloud notices the device is offline (after which it will tell you so in an SSE).
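Something like the sketch below, assuming the device objects returned by the listing carry `id` and `connected` fields (worth verifying against your library version); `fetchAllVariables` and `showOfflineBanner` are hypothetical names for your own routines:

```javascript
// Given the array a device listing returns, report whether a particular
// device is currently marked online. Assumes each entry has `id` and
// `connected` fields.
function isDeviceOnline(devices, deviceId) {
  const device = devices.find((d) => d.id === deviceId);
  return Boolean(device && device.connected);
}

// Gating the GETs on that check (callback style, as in the JS library):
// spark.listDevices(function (err, devices) {
//   if (!err && isDeviceOnline(devices, myDeviceId)) {
//     fetchAllVariables();   // only poll while the cloud says we're online
//   } else {
//     showOfflineBanner();   // hypothetical UI handler
//   }
// });
```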
That said, it shouldn’t hurt too much to use SSE. They’re there for a reason, and this seems like a valid use case, so why not use them? A second benefit is that the data is pushed, so it’s more real-time, and one push can serve multiple clients, whereas you’d normally have to make multiple GETs.
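On the browser side the event stream can be consumed with a plain EventSource. The URL shape below is an assumption based on the Spark cloud’s per-device events endpoint, so check it against the current API docs before relying on it:

```javascript
// Build the SSE URL for a single device's event stream.
// (Endpoint shape assumed from the Spark cloud API; verify before use.)
function eventStreamUrl(deviceId, accessToken) {
  return 'https://api.spark.io/v1/devices/' +
    encodeURIComponent(deviceId) +
    '/events?access_token=' +
    encodeURIComponent(accessToken);
}

// Browser usage sketch:
// const source = new EventSource(eventStreamUrl(myDeviceId, myToken));
// source.addEventListener('temperature', function (msg) {
//   // msg.data is a JSON string wrapping the published payload
//   console.log(JSON.parse(msg.data));
// });
```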


#3

Thanks @Moors7 all good points there.
I had actually been using listDevices in exactly this way. The problem is, as you say, that by the time the cloud notices the device is offline, I’ve already made tens of GET requests!
I suppose I’ll just have a go at using SSE. One thing that put me off a little is that you can’t push integers (I know you can convert from string to int and vice versa), but it just didn’t seem quite as concise that way.
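For what it’s worth, the string-to-number step is small: published payloads arrive as strings, and a guard like this (a sketch, not library code) turns them back into numbers while rejecting anything non-numeric:

```javascript
// Convert a published event payload (always a string) back to a number.
// Returns null for empty or non-numeric payloads instead of NaN or 0.
function parseEventNumber(data) {
  if (typeof data !== 'string' || data.trim() === '') {
    return null; // Number('') would silently give 0
  }
  const n = Number(data);
  return Number.isFinite(n) ? n : null;
}
```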


#4

I see that you’re requesting multiple variables. How about making one (JSON?) variable in which you combine all the others? That way you can extract all the data you need at once, leaving you with fewer GET requests to make. Might be worthwhile. The same is true for the SSEs, where it’s not a bad idea to combine data into a single publish.
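On the web-app side that comes down to one getVariable call plus a single parse. A sketch, assuming the device exposes one String variable holding a JSON blob such as `{"temp":21.5,"humidity":40}`:

```javascript
// Parse the combined JSON variable fetched from the device.
// Returns null rather than throwing if the payload is malformed
// (e.g. the JSON was truncated by the variable length limit).
function parseCombinedVariable(raw) {
  try {
    return JSON.parse(raw);
  } catch (e) {
    return null;
  }
}

// Usage sketch: one GET instead of many.
// spark.getVariable('status', function (err, data) {
//   const state = err ? null : parseCombinedVariable(data.result);
//   if (state) render(state.temp, state.humidity); // hypothetical renderer
// });
```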


#5

Thanks @Moors7

It’s a good idea; I hadn’t really thought about combining everything, but I agree that from a network perspective it does sound better. I followed the tutorial by bko, Spark Publish With JSON Data, which was super helpful.
One thing I’m not quite sure on: how do I ‘unsubscribe’ from an event listener using the JS library? I notice the Particle firmware seems to have provision for this, but the JS library and cloud API don’t.
For me there’s a use case where a customer selects another device to monitor, and I’ll need to spark.subscribe to another device. In this case, won’t the old event listener stay open?
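One way to cope, regardless of what the library documents: keep a handle to the stream object you get back and tear it down before opening the next one. The sketch below assumes an injected `openStream(deviceId)` that returns an object with an `abort()` method (particle-api-js event streams expose one, but verify against the library version you’re actually using):

```javascript
// Track the currently open event stream and close it before switching
// devices, so the old listener doesn't stay open in the background.
// `openStream(deviceId)` is assumed to return { abort() { ... } }.
function makeStreamSwitcher(openStream) {
  let current = null;
  return function switchTo(deviceId) {
    if (current) {
      current.abort(); // tear down the previous device's stream
      current = null;
    }
    current = openStream(deviceId);
    return current;
  };
}
```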