What web service am I missing? URL -> data collection -> Twilio -> Particle.io webhook -> device

Hello Particle Community

Longtime lurker, first-time poster - your patience is appreciated.

I’m in the very early stages of turning an idea into a proof of concept, but before I sink time into it I’d like some feedback on my proposed workflow.

My coding skills are limited to R and Excel, but I’ve read up on MQTT, IFTTT, webhooks, APIs, JSON, etc., and I’m feeling a little overwhelmed by all the options.

My project in a sentence:
If the value of a data stream is greater than 500, send an SMS to the customer for approval; if the response is 'Yes', write a digital output (DO) high.

I think this is what’s required:
site (data stream) -> data scrape -> Twilio -> end user -> Twilio -> Particle.io (webhook) -> Argon

Data stream: column G, updated every 5 minutes via a CSV inside a zip file.
http://nemweb.com.au/Reports/Current/Dispatchprices_PRE_AP/

Unfortunately there isn’t an API for the data stream, so I’m still working through how to scrape the CSV.

Advice on whether my flow makes sense would be appreciated.

Thanks
Bill

Hello,

This sounds interesting; however, I’m still trying to understand all of the pieces and the bigger picture of what you’re planning.

Here are my current assumptions:

  • the data is coming from “somewhere else” … not associated with Particle or your hardware?
  • you can write code to grab the zipped data and get it onto your server?
  • you are looking to do this processing (checking the data for a value > 500) someplace in the cloud?
  • you are going to write some code to unzip the file and analyze column G of the CSV?
  • you are looking to control something using D0 (digital pin 0) by setting it high?

You’re going to need something in the cloud (or someplace) that is going to broker the logic of this control. The Particle Rules Engine could do this quite well, or you could use Node-RED running elsewhere. Likewise, you could use Twilio Functions to perform some of this logic. You could also write this in a native language that is able to perform REST operations.

It seems that you could easily implement a Particle.function() that could be used to set D0 high or low. That would have a REST endpoint that could be called from any of the platforms above.
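
For instance, a minimal sketch along those lines might look like this (the function name “setOutput” and the choice of D0 are placeholders, not anything prescribed):

```cpp
#include "Particle.h"

// Cloud-callable handler; the argument from the REST call arrives as `command`.
int setOutput(String command) {
    if (command == "high") {
        digitalWrite(D0, HIGH);
        return 1;
    }
    if (command == "low") {
        digitalWrite(D0, LOW);
        return 0;
    }
    return -1;  // unrecognized argument
}

void setup() {
    pinMode(D0, OUTPUT);
    digitalWrite(D0, LOW);
    // Register the function with the Particle cloud under the name "setOutput".
    Particle.function("setOutput", setOutput);
}

void loop() {
    // Nothing to do here; the cloud invokes setOutput() directly.
}
```

Once flashed, any of the platforms above could call that function with a POST to https://api.particle.io/v1/devices/<device_id>/setOutput, passing an access token and the desired argument in the request body.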

If all of the above assumptions are accurate, you can easily write some code to do what you want. I’m not sure that your data format (a CSV inside a zip) is the most flexible format to work with; however, it can be used. Is this on track so far?

Scott

Hi Scott

Thanks for the prompt reply. Based on your comments, my main question is whether I need my own server. Not an issue if I do; just something else to read up on.

Responses:

  • Yes. A third party releases the data every 5 minutes as a new zip/CSV.
  • The data will need to be unzipped, but not stored beyond that 5-minute period.
  • Aside from a server, I’m not clear where else the processing could occur.
  • I was hoping there would be an alternative to writing code to scrape the data.
  • Yes, the control signal is sent when the logic is true. Any digital output pin is fine; this appears to be the easy part.

It’s unclear to me how much of this can be managed by the Particle cloud and how much will reside in Twilio and a separate server.

The data is from our national electricity market dispatch system, which I have no control over.

I think my question is: how much of this can the Particle cloud perform, and what’s needed to handle the rest?

Thanks again

I don’t know whether a library (like libzip or JUnzip) exists for Particle to unzip the data on-device, so you’d probably need some kind of server to do that for you. But once you have that, you could do most of the preprocessing (filtering, extracting, interpreting, …) on that server already, and your Argon could be a mere data sink (e.g. receiving a Particle.function() call or a Particle.subscribe() handler invocation) - see the sketch below.
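
As a rough sketch of that data-sink idea, assuming the server publishes a Particle event once its processing (and the SMS approval) is done - the event name “price/approved” and the “high”/“low” payload are made up for illustration:

```cpp
#include "Particle.h"

// The server does the download/unzip/CSV/SMS legwork and publishes the
// final decision as an event payload; the Argon merely reacts to it.
void approvalHandler(const char *event, const char *data) {
    // Assumed payload: "high" or "low".
    digitalWrite(D0, strcmp(data, "high") == 0 ? HIGH : LOW);
}

void setup() {
    pinMode(D0, OUTPUT);
    Particle.subscribe("price/approved", approvalHandler);
}

void loop() {
}
```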


Thanks for your comments.

I suspected a server would be needed to do the extraction and processing, but was hoping someone might have an alternative - a cloud service of some sort.

Seems like using the Particle for the I/O is the simple part.


I also found this JUnzip library (link added in my previous post), which might be a bit slimmer, but I haven’t looked into the implementation yet.

If it’s portable, your files seem small enough to be unzipped on board.

This is the article about that library.


Great, I’ll have a read of that reference - much appreciated.

If you choose to use the Particle Rules Engine or Node-RED, you could use the following flow:

Flow JSON to copy and import:

[{"id":"54579d69.6b7924","type":"tab","label":"Flow 2","disabled":false,"info":""},{"id":"299695e2.f4d9ea","type":"inject","z":"54579d69.6b7924","name":"make request","topic":"","payload":"","payloadType":"date","repeat":"","crontab":"","once":false,"x":110,"y":40,"wires":[["9d6ace9.255e63"]]},{"id":"9d6ace9.255e63","type":"http request","z":"54579d69.6b7924","name":"","method":"GET","ret":"txt","url":"http://nemweb.com.au/Reports/Current/Dispatchprices_PRE_AP/","tls":"","x":130,"y":100,"wires":[["cc7aa9a2.ff44a8"]]},{"id":"cc7aa9a2.ff44a8","type":"html","z":"54579d69.6b7924","name":"","property":"","outproperty":"payload","tag":"[href]:last-of-type","ret":"text","as":"single","x":190,"y":160,"wires":[["9252116d.56a9f"]]},{"id":"c201f919.6ee6a8","type":"http request","z":"54579d69.6b7924","name":"Fetch .zip","method":"GET","ret":"bin","url":"","tls":"","x":440,"y":40,"wires":[["bdb7a498.a9ac78"]]},{"id":"9252116d.56a9f","type":"template","z":"54579d69.6b7924","name":"Create URL","field":"url","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"http://nemweb.com.au/Reports/Current/Dispatchprices_PRE_AP/{{payload.0}}","output":"str","x":210,"y":220,"wires":[["c201f919.6ee6a8"]]},{"id":"bdb7a498.a9ac78","type":"zip","z":"54579d69.6b7924","name":"Unzip","mode":"decompress","filename":"","outasstring":true,"x":470,"y":100,"wires":[["8926f847.bed168"]]},{"id":"21167e4c.bb3472","type":"csv","z":"54579d69.6b7924","name":"","sep":",","hdrin":false,"hdrout":"","multi":"one","ret":"\\n","temp":"a,b,c,d,e,f,g,h,i,j,k,l,m,n,o","skip":"2","x":590,"y":220,"wires":[["a844234a.09cf6","6d4c1ad1.17c144"]]},{"id":"4241022e.60c40c","type":"debug","z":"54579d69.6b7924","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":850,"y":260,"wires":[]},{"id":"4edc2efb.54fc","type":"debug","z":"54579d69.6b7924","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":850,"y":300,"wires":[]},{"id":"a844234a.09cf6","type":"debug","z":"54579d69.6b7924","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":850,"y":220,"wires":[]},{"id":"8926f847.bed168","type":"function","z":"54579d69.6b7924","name":"grab first file contents","func":"msg.payload = msg.payload[0].payload.toString();\nreturn msg;\n","outputs":1,"noerr":0,"x":580,"y":160,"wires":[["21167e4c.bb3472","1c65a60a.1f744a"]]},{"id":"1c65a60a.1f744a","type":"debug","z":"54579d69.6b7924","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":850,"y":160,"wires":[]},{"id":"6d4c1ad1.17c144","type":"function","z":"54579d69.6b7924","name":"Check g for > 500","func":"//\n// we only want to look at row with \"a\":\"D\"\n//\nif (msg.payload.a !== \"D\") {\n return;\n}\n\n//\n// are we greater than 500?\n//\nif (msg.payload.g > 500) {\n msg.payload = \"g > 500\";\n return [ msg, null ];\n} else {\n msg.payload = \"g <= 500\";\n return [ null, msg ];\n}\n","outputs":2,"noerr":0,"x":630,"y":280,"wires":[["4241022e.60c40c"],["4edc2efb.54fc"]]}]

If you load the flow, it looks like this (a quick hack to show the idea):

[screenshot of the imported flow]

The flow starts in the top left with an Inject node. It can be configured to run every 5 minutes; currently, it fires when you click its button. The Inject node triggers the HTTP Request node to fetch the web page you provided, and the HTML node uses a selector to grab the last file name on the page.

This file name is passed to a Template node that builds the URL from the filename, which is then passed to another HTTP Request node to fetch the .zip file. The Unzip node (I had to install this free node) unzips the file and returns an array of the files contained in it. I then added a Function node to grab the contents of the first unzipped file and pass it to a CSV Parse node, which emits the parsed rows of the CSV as objects.

My last Function node checks the “g” property of each object for a value > 500 and emits a debug message you can see in the Debug panel; both Function node bodies are shown below.
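
For readability, here are the bodies of the two Function nodes, pulled out of the flow JSON above:

```javascript
// "grab first file contents" node: the Unzip node outputs an array of
// files, so take the first one and convert its buffer to a string.
msg.payload = msg.payload[0].payload.toString();
return msg;
```

```javascript
// "Check g for > 500" node (two outputs): only rows with "a":"D" matter,
// and the message is routed to output 1 or 2 depending on the threshold.
if (msg.payload.a !== "D") {
    return;
}

if (msg.payload.g > 500) {
    msg.payload = "g > 500";
    return [ msg, null ];
} else {
    msg.payload = "g <= 500";
    return [ null, msg ];
}
```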

Obviously this was a quick hack, but it only took ~15-20 minutes of experimenting to get working. Instead of the debug messages, you could call the HTTP endpoint of a Particle.function() to set your digital pin high or low.

Node-RED is a powerful platform for quick flow programming tasks like this, which is why Particle embraced it for their Rules Engine.

Now … if I can only get into that Rules Engine beta … 🙂
