Need help getting started with Node.js

Hey guys,
I need your help getting started with Node.js to create a server on my PC. This server should listen for an event from my Spark Core (by using GET, I believe), which publishes data to the cloud. How can I write this code? Is there any sample code to start with?
Thanks in advance.

[Woah there on the pings. Give people time to respond before pinging so many people, OR just ping one or two at the start if it's really specific - @harrisonhjones]

Let’s start with this:

  1. Install Node.js
  2. Install the JavaScript SDK (a quick sanity-check sketch follows this list)
  3. Look at the event stream documentation
  4. Report back with findings
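If it helps to see where steps 1 and 2 should leave you, a tiny script like this should run without errors (a minimal sketch, assuming the SDK was installed with npm install spark as the docs describe; the credentials are placeholders):

var spark = require('spark');

// Log in to the Particle cloud with your account credentials.
spark.login({ username: 'YOUR_EMAIL', password: 'YOUR_PASSWORD' }, function (err) {
  if (err) {
    console.log('Login failed: ' + err);
  } else {
    console.log('Logged in to the Particle cloud');
  }
});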

Thanks

Dear @harrisonhjones,
I did steps 1 and 2. For step 3, what I understood is that I should change the event name from test to my own event name. Is this right? And how do I specify my device?
If I've understood the idea of the event stream correctly, what is the next step? How can I start writing code to read the event data and save it to a text file on my PC?
Thank you so much.

Steps:

  1. Set up a Particle device to publish a message every 2 seconds or so, ideally a JSON-formatted message if you can (the short sketch after this list shows why JSON helps)
  2. Try to use the event stream code provided in the docs to grab those messages (try console.log(...)ing them)
  3. Post your code for 1 and 2 in a Gist (don't accidentally include your username, password, or access token when posting) and report back with the link here on the forums, and we'll help
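Why JSON? Because the Node side can then parse the payload and pull out named fields instead of hand-parsing an ASCII string. A tiny illustration (the temp and hum field names are made up for the example):

// If the firmware publishes a JSON string such as {"temp":22,"hum":40},
// the event handler can parse it and read named fields directly.
var payload = JSON.parse('{"temp":22,"hum":40}');  // in the real handler this would be data.data
console.log('Temperature: ' + payload.temp + ', Humidity: ' + payload.hum);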

Hello @harrisonhjones

For step 1, I've already written code on my Spark Core to publish ASCII data coming from sensors. Is this what you mean by this step?

Try to use the event stream code provided in the docs to grab those messages (try console.log(...)ing them)

For Step 2, do you mean this code?

// Get the 'test' event for a specific device
spark.getEventStream('test', 'DEVICE_ID', function(data) {
  console.log("Event: " + data);
});

Where should I write this example? In Particle Dev, or in a new .js file?

Do I need to include these at the beginning of my code?

spark.listDevices(function(err, devices) {
  var device = devices[0];

  console.log('Device name: ' + device.name);
  console.log('- connected?: ' + device.connected);
  console.log('- variables: ' + device.variables);
  console.log('- functions: ' + device.functions);
  console.log('- version: ' + device.version);
  console.log('- requires upgrade?: ' + device.requiresUpgrade);
});

You can ask for a specific device by its ID with spark.getDevice:

spark.getDevice('DEVICE_ID', function(err, device) {
  console.log('Device name: ' + device.name);
});

Thanks.


So you’ve got two different sets of code: firmware (on your Spark Core) and software (on your computer, for now).

If you can go to dashboard.particle.io and see your events appear every 2 seconds or so, you have the firmware figured out. If not, you'll need to write some code in the Web IDE/Particle Dev.

The software (Node.js code) will go into a new .js file somewhere, which you will run with the node name-of-your-file.js command.

Throw the entirety of your Node.js file in a Gist (minus your username, password, or access token) and link to it here. I'll help you out.

And yes, I do mean that part for the event stream.


Sure @harrisonhjones. I can see my event data in the dashboard every 2 or 3 seconds. Do I need to put my name.js file in the Node installation folder, or can I put it anywhere? The last thing I want to know before starting to write the .js file is: do I need to include anything else besides the event stream code and the following?

spark.listDevices(function(err, devices) {
  var device = devices[0];

  console.log('Device name: ' + device.name);
  console.log('- connected?: ' + device.connected);
  console.log('- variables: ' + device.variables);
  console.log('- functions: ' + device.functions);
  console.log('- version: ' + device.version);
  console.log('- requires upgrade?: ' + device.requiresUpgrade);
});

You can ask for a specific device by its ID with spark.getDevice:

spark.getDevice('DEVICE_ID', function(err, device) {
  console.log('Device name: ' + device.name);
});

Thanks a lot.
Ahmed


Take a look at this: https://github.com/spark/sparkjs/blob/master/examples/node/get-event-stream.js

It can go anywhere. You just need to open up a terminal / command prompt and run node name-of-file.js


Hey @harrisonhjones

I tried to run it from the Node.js command line after putting in my Core ID along with my username and password, but I got this error:

C:\sparkJS\examples\node>node sparkjs

module.js:340
throw err;
^
Error: Cannot find module 'spark'
at Function.Module._resolveFilename (module.js:338:15)
at Function.Module._load (module.js:280:25)
at Module.require (module.js:364:17)
at require (module.js:380:17)
at Object.<anonymous> (C:\sparkJS\examples\node\sparkjs.js:5:13)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)

Hi @Ahmedsa1983

Have you seen the doc page for the JavaScript SDK? It explains how to install it for Node.

http://docs.particle.io/core/javascript/


Ah. Did you download the entire library/SDK? If so, comment in (uncomment) //var spark = require('../../lib/spark'); and comment out var spark = require('spark');

In the future, your project files should live somewhere else; then you can run npm install spark and it'll work!
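In other words, the top of your file has two possible forms depending on where it lives; only one should be active at a time (a sketch of the swap being described):

// Running from inside the downloaded SDK's examples folder:
var spark = require('../../lib/spark');

// Running from a project folder where the SDK was installed with `npm install spark`:
// var spark = require('spark');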


Yes @bko,
I saw the documentation and installed Node.
Thanks

Hello @harrisonhjones
Yes, I downloaded the whole library as described at docs.particle.io.
I commented in //var spark = require('../../lib/spark'); and commented out var spark = require('spark');
I still get the following error:

C:\SparkJS Project>node sparkjs

module.js:340
throw err;
^
Error: Cannot find module '../../lib/spark'
at Function.Module._resolveFilename (module.js:338:15)
at Function.Module._load (module.js:280:25)
at Module.require (module.js:364:17)
at require (module.js:380:17)
at Object.<anonymous> (C:\SparkJS Project\sparkjs.js:4:13)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)

As for npm install spark, I ran it in a separate folder and it installed successfully without errors.

Any suggestions, please?
@bko @harrisonhjones

Let’s try a “clean slate”.

  1. Go to a brand new folder, open up a command prompt, and issue the following command: npm install spark
  2. In that same folder, create a new file called "spark-test.js" and paste in your original code without the commented-out require (i.e., this; see the sketch after this list for the general shape)
  3. Open up a command prompt and issue this command: node spark-test.js
  4. Report back!
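For reference, spark-test.js should end up looking roughly like the get-event-stream example linked earlier (a sketch only; the event name, device ID, and credentials are placeholders to replace with your own):

// spark-test.js: subscribe to an event stream using the spark package installed via npm.
var spark = require('spark');

spark.on('login', function () {
  // Replace 'MyEvent' and 'DEVICE_ID' with your own event name and device ID.
  spark.getEventStream('MyEvent', 'DEVICE_ID', function (data) {
    console.log('Event: ' + JSON.stringify(data));
  });
});

// Log in with your Particle account credentials (never post these publicly).
spark.login({ username: 'YOUR_EMAIL', password: 'YOUR_PASSWORD' });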

Thanks @harrisonhjones for your clear explanation. I'm getting the same data that I see on the dashboard, but I don't know how to save just the event data to a text file. Right now I'm receiving:

Event: {"data":"SA0001FC00B00000000","ttl":"1","published_at":"2015-06-26T22:04:
07.803Z","coreid":"00000000000000000","name":"MyEvent"}

How can I save just the MyEvent data, which is SA0001FC00B00000000, to a text file?

Thanks.

Try adding console.log("Data: " + data.data); after console.log("Event: " + JSON.stringify(data)); and see what you get. If you get the data you were expecting, then look into adding something like this:

var fs = require('fs');
fs.appendFile('myAwesomeLog.txt', data.data, function (err) {
    if (err) {
        console.log("There was an error writing the data to the log file");
    }
});

@harrisonhjones
I got the following:

Event: {"data":"SA000FE000007B00000000C00400000E","ttl":"1","published_at":"2015
-06-26T22:12:55.937Z","coreid":"0000000000000000","name":"MyEvent"}
Data: SA000FE000007B00000000C00400000E

Is there another way to display only the data of MyEvent?
Thanks

Also, when I tried to add this:

var fs = require('fs');
fs.appendFile('myAwesomeLog.txt', data.data, function (err) {
    if (err) {
        console.log("There was an error writing the data to the log file");
    }
});

I got the following error:

C:\Spark>node spark-test.js

C:\Spark\spark-test.js:7
fs.appendFile('myAwesomeLog.txt', data.data, function (err) {
^
ReferenceError: data is not defined
at Object.<anonymous> (C:\Spark\spark-test.js:7:35)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:906:3

Your fs.appendFile(...) isn't quite in the right place. I went ahead and forked your code and made edits. You'll need to edit it to match your setup. Let me know how it goes.
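The gist of the change: the file write has to live inside the event stream callback, where data actually exists. Roughly like this (a sketch with placeholder event name, device ID, and credentials; not the exact forked code):

var spark = require('spark');
var fs = require('fs');

spark.on('login', function () {
  spark.getEventStream('MyEvent', 'DEVICE_ID', function (data) {
    console.log('Data: ' + data.data);
    // `data` only exists inside this callback, so the file write happens here.
    // A newline is appended so each event lands on its own line in the log.
    fs.appendFile('myAwesomeLog.txt', data.data + '\n', function (err) {
      if (err) {
        console.log('There was an error writing the data to the log file');
      }
    });
  });
});

spark.login({ username: 'YOUR_EMAIL', password: 'YOUR_PASSWORD' });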