Spark Core and Node-RED integration

@ScruffR I knew you would come to the rescue :+1:

I didn’t leave any of the Particle.publish code I tried in there since it didn’t work, and I ideally just wanted to ask for help, start over, and quit wasting what little spare time I have at the moment. I looked at the docs and they only confused me more. Just saying I did try before asking for help.

So breaking down your code real quick, here is how I interpret it:

// both readings in one event
  Particle.publish("rwbAmbient", String::format("t=%d;h=%d\%", temp_f, humidity));

Particle.publish sets up the publish event with the cloud.

“rwbAmbient” is the name of the published event that I will need to listen for when trying to receive the published events.

String::format on the next line is where we indicate the data format? And it looks like this is also where we can format the data before it’s sent out, which is what the format("t=%d;h=%d\%", temp_f, humidity) part does.

So I added your code to the main loop. This should be working now, I’m assuming.

// This #include statement was automatically added by the Spark IDE.
#include "SHT1x/SHT1x.h"

// Specify data and clock connections and instantiate SHT1x object
#define dataPin  D0
#define clockPin D1
SHT1x sht1x(dataPin, clockPin);

int temp_f;
int humidity;

int LED = D7;

void setup()
{
    pinMode(LED, OUTPUT);
    
    Serial.begin(9600); // Open serial connection to report values to host
    
    Serial.println("Starting up");
    
   

}

void loop()
{
    //float temp_f;
    //float humidity;
    
    // Read values from the sensor
    
    temp_f = sht1x.readTemperatureF();
    humidity = sht1x.readHumidity();
    
    Particle.publish("Ambient", String::format("t=%d;h=%d\%", temp_f, humidity));
    
    // Print the values to the serial port
    Serial.print("Temperature: ");
    Serial.print(temp_f);   // temp_f is an int, so no decimal-places argument needed
    Serial.print("F. Humidity: ");
    Serial.print(humidity);
    Serial.println("%");
    
    digitalWrite(LED, HIGH);
    delay(500);
    digitalWrite(LED, LOW);
    delay(500);
    digitalWrite(LED, HIGH);
    delay(500);
    digitalWrite(LED, LOW);
    
    delay(2000);
}

I’m trying to get these published events to show up using the Spark node in Node Red but I can’t seem to get it working. Below is a screenshot of the settings I’m trying to use to pull in these published events.

I’ve played with the Name and Parameter boxes by leaving one blank and the other one filled out but I still get no output in the debug window.

What is the easiest way to test that this Particle.publish function is spitting this temp and humidity data out?

I haven’t played with Node Red, but to check your publish events, have a try with these sites:
http://suda.github.io/particle-web-interface/
http://jordymoors.nl/interface/

And with your interpretation of my code line you’re spot on.


@RWB, I checked my Bluemix setup and it is working for me. I used a test firmware and a Node-RED flow to read a variable every 10 seconds, and it is working fine. Here is the Node-RED flow and the Photon firmware.

int num;

void setup() {
    num = 0;

    // expose "num" to the cloud as the variable "numvar"
    Particle.variable("numvar", &num, INT);
}

void loop() {
    // increment every 3 seconds so the Node-RED flow sees the value change
    num++;

    delay(3000);
}
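
Under the hood the flow simply polls the variable over the Particle cloud REST API every 10 seconds, roughly like this plain Node.js sketch (DEVICE_ID and ACCESS_TOKEN are placeholders for your own values):

// Sketch of what the flow does: GET the "numvar" cloud variable every 10 seconds.
var https = require('https');

var url = 'https://api.particle.io/v1/devices/DEVICE_ID/numvar?access_token=ACCESS_TOKEN';

setInterval(function () {
    https.get(url, function (res) {
        var body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
            var reply = JSON.parse(body);
            console.log('numvar =', reply.result);   // current value of num on the Photon
        });
    }).on('error', console.error);
}, 10000);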

@RWB, can you check your debug console and see what is printing there?

@ScruffR @krvarma

Ok so I used this link to see that the Photon code is indeed sending the Particle.publish data out. http://jordymoors.nl/interface/

So the data is going out just fine. Now I have to figure out what exactly I’m doing wrong with the Spark Node in Node Red.

@krvarma I’m going to run your Particle.variable test code now and see what happens and report back.

You can see I’m now trying to figure out how to capture the Particle.publish event using your Spark node.

So how should I setup the node if I want to capture this Particle.publish event:

 Particle.publish("TempSensor1", String::format("Temp=%d;Humidity=%d\%", temp_f, humidity));

@RWB, I checked all three methods (calling a function, reading a variable, and subscribing) and all are working fine for me.

OK. I uploaded @krvarma’s variable test code and it’s working based on @Moors7’s site, which is very helpful; see below:

I figured out what the problem was :blush:

I was copying the device token from the Particle online programming page, but I was not paying enough attention to notice that the token had expired :frowning:

I updated the token and all is good now!!! :spark:

Thank you for all helping me work on this. @ScruffR @krvarma @Moors7

Now I can move on to getting data into MongoDB and then displayed in graphs using Highcharts.


Have you thought of a smart structure to organize your data in MongoDB? Even though it's really fast, data can grow rapidly, and having to go through all records to find the one you're looking for can quickly become unmanageable. Have you checked out the "bucket" approach of storing 60 minutes' worth of data within one hour document? Quite possibly even seconds within those minutes. That should drastically reduce the amount of stuff you have to go through. You can then also use that data to make an average for those timeframes and store it within the parent. Seconds can be averaged into the minutes, and those into the hour. That way, as time progresses, you can start to get rid of the higher-resolution data (seconds, then minutes) and use the averages instead.
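
Roughly, a simplified version of one of those hourly buckets, plus the update that feeds it, could look like this. Just a sketch using the plain Node.js MongoDB driver; the collection and field names are made up, and I'm keeping one flat array of samples per hour instead of nesting minutes:

// Sketch only: one document per device per hour, raw readings pushed into an array,
// with running sums kept alongside so averages stay cheap to compute later.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/sensordata', function (err, db) {
    if (err) { return console.error(err); }

    var temp = 70, humidity = 45;     // values you'd parse out of the published event
    var hour = new Date();
    hour.setMinutes(0, 0, 0);         // truncate to the current hour -> one bucket per hour

    db.collection('ambient').updateOne(
        { device: 'TempSensor1', hour: hour },                            // bucket key
        {
            $push: { samples: { ts: new Date(), t: temp, h: humidity } }, // raw reading
            $inc:  { count: 1, sumT: temp, sumH: humidity }               // running totals
        },
        { upsert: true },                                                 // create the bucket if missing
        function (err) {
            if (err) { console.error(err); }
            db.close();
        }
    );
});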

If you haven't given that a thought yet, please do. If you have, please share :wink:

@Moors7 I’m just getting started with figuring all this out, so no, I have not given too much thought to how to actually minimize the database size over time. Right now I’m just learning how to take advantage of the Particle cloud features like Publish and Subscribe, etc.

Node Red is really surprising me with how easy it makes it to direct the data from the Photons to just about anywhere you want.

After looking at the pricing for the database GB sizes, I can quickly see that keeping the data size as small as possible can save you big time when it comes to monthly hosting fees.

I did read about the “bucket” approach of storing data like you are mentioning somewhere, but I don’t remember where just yet. It does sound like a good idea though, so do you have any links that go over this “bucket” approach?

What I’m doing is adding the Photon & Electron modules to some products we sell so we and the customers can view the product performance over time and live. We can also send maintenance reminders via email or SMS using the Twilio node, which is really nice.

Eventually we will have hundreds to thousands of units in the field, so the data has the potential to get pretty large depending on how often we update the status info, like once per minute. I’m trying to keep the cost for all this to a minimum, so I’m trying to eliminate the middlemen as much as possible.

I’m envisioning taking data fields and displaying them on a custom HTML5 website so it updates live, with that data coming from MongoDB. I’ll need a custom login feature so people can log in to view only their own devices; I have not figured that out just yet.

From what I can tell MongoDB is a pretty good platform to go with so I’m hoping I make the right decision starting with them.

Any feedback or advice on the above is more than welcomed :smiley:

@RWB looks like I woke up and missed the entire conversation and resolution here! Glad it worked out for you.

Something you might want to take note of when using the node-RED nodes from either @krvarma or me: if you’re using my node-red-contrib-particle, I have found that for local clouds, you can only create a maximum of 5 (or was it 4?) different SSE nodes listening to a single cloud. The entire flow will be disrupted once you get past that limit. Can other folks test this and see if they get the same issue?

@Moors7’s suggestion of using mongodb buckets is great; I’ve been reading up on these links:
http://learnmongodbthehardway.com/schema/chapter2/
http://learnmongodbthehardway.com/schema/chapter6/

@chuank I’m not using a local cloud setup, so have you ever heard about this issue with the hosted Particle cloud?

Also thanks for the links about the MongoDB buckets, I’ll be sure to read up on this :smile:

How are you using the Photon & MongoDB? They look like a very powerful combo.

@RWB, looking at your event data, I think I've bodged my code a bit. The result shouldn't contain the event name again, but a percent sign.
I used "\%", but for printf formats it's "%%" to create a percent sign.
So the correct code should rather be

Particle.publish("rwbAmbient", String::format("t=%d;h=%d%%", temp_f, humidity));

Please give this a try, since I have no Photon at hand to test it myself.


@ScruffR I was wondering why it was throwing that in there at first; I had no idea the “\%” was causing it. I modified your publish code a little to change the output format for now.

I know for sure I can make the code and database more compact in the future after I learn some more about MongoDB best practices.

I spent about 2 hours last night trying to figure out how to get the Time.timeStr() value inserted at the end of the line so each data entry has a time stamp.

I was able to get this to work in a separate publish event but I want it in along with the Temp & Humidity readings.

 Particle.publish("Timenow",  Time.timeStr());

Here is the current code I’m using to send data to MongoDB, how can I get that time into the publish event below?

Particle.publish("TempSensor1", String::format("Temp=%d; Humidity=%d, Time=%d, \%", temp_f, humidity));

The format() function can’t deal with String objects, which get returned by functions like Time.timeStr(), so you need to convert them into a C string by use of the class method String::c_str().
And in printf format strings the placeholder for a C string is "%s".

So a properly formatted publish would look like this

Particle.publish("TempSensor1", String::format("Temp=%dF; Humidity=%d%%; Time=%s", temp_f, humidity, Time.timeStr().c_str()));

This might be an interesting read for you if you want to make the most of the printf formatting:
http://www.cplusplus.com/reference/cstdio/printf/


@ScruffR Thank you so much for the help! :spark:

I will for sure read over your link to the printf function since it will help me get the data formatted before sending it into the database.

I added your code plus the time zone setting Time.zone(-5); and it’s working perfectly :smiley:

When I push this event to MongoDB I only save the payload, not the Core ID or the time in Zulu.
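
In case anyone wants to do the same, a Node-RED function node along these lines can split the payload into separate fields before the mongodb node. The regex assumes the exact format from @ScruffR’s publish line above, and the field names are just what I picked, so adjust both if you changed the format:

// Node-RED function node: turn "Temp=72F; Humidity=45%; Time=..." into an object
var m = /Temp=(-?\d+)F; Humidity=(\d+)%; Time=(.+)/.exec(msg.payload);

if (m) {
    msg.payload = {
        temp_f:   parseInt(m[1], 10),
        humidity: parseInt(m[2], 10),
        time:     m[3]
    };
    return msg;      // pass the parsed object on to the mongodb node
}
return null;         // drop messages that don't match the expected format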


When I switched to the local cloud for my own projects, I never fully tested my code contributions with setups using the particle.io cloud. I ought to, but I’ll probably do that once the local cloud is given attention from Particle and it’s worth the effort to do the update.

Node-RED’s the one service that has stayed relatively stable and is wonderful to use with Photons + local cloud, especially for near real-time data transmission.

Sounds good, although I'd personally just subscribe to the same event stream on the client. That way, your server doesn't have to serve the live updates for something that's accessible already. So, request all the current data from the database, and then do the live view via a subscription in the browser :slight_smile: Saves some server resources.
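
For the live part, one way is to open the Particle SSE stream straight from the browser with EventSource. Just a sketch, using your "TempSensor1" event name and a placeholder token:

// Browser-side sketch: listen to the "TempSensor1" events directly from the Particle cloud,
// so your own server never has to relay the live updates.
var stream = new EventSource(
    'https://api.particle.io/v1/events/TempSensor1?access_token=ACCESS_TOKEN');

stream.addEventListener('TempSensor1', function (msg) {
    var event = JSON.parse(msg.data);            // { data, ttl, published_at, coreid }
    console.log(event.published_at, event.data);
    // update your chart/page here with event.data
});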

@Moors7 Yea that makes perfect sense. I’m new to all this so I have tons to learn when it comes to getting all this data live into a nice graph on a website.

Is JavaScript normally used to do all this? I’m certainly going to need to do lots of digging to find the path of least resistance and high efficiency.

If you have any recommended resources on doing stuff like this then I would love to look it over.

I cannot see any Spark/Particle modules in the editor. I installed npm and Node Red and they both work, then installed the Spark library (at least ten times) and nothing. What am I doing wrong?

Problemo solved: ‘npm install -g <library>’ instead of ‘npm install <library>’