Hey Guys
Please, I need your help getting started with Node.js to create a server on my PC. This server should listen for an event from my Spark Core (I believe by using GET), which publishes data to the cloud. How can I write this code? Is there any sample code to start with?
Thanks in advance.
[Woah there on the pings. Give people time to respond before pinging so many people, OR just ping one or two at the start if it's really specific - @harrisonhjones]
Dear @harrisonhjones
I did steps 1 & 2. For step 3, what I understood is that I should change the event name from (test) to my own event name. Is this right? And how do I specify my device?
Once I have the event stream working, what is the next step? How can I start writing code to read the event data and save it to a text file on my PC?
Thank you so much.
1. Set up a Particle device to publish a message every 2 seconds or so. Ideally a JSON-formatted message, if you can.
2. Try to use the event stream code provided in the docs to grab those messages (try console.log(...)-ing them). There's a sketch of what that might look like right after this list.
3. Post your code for 1. and 2. (don't accidentally include your username, password, or access token when posting) in a Gist and report back with the link here on the forums and we'll help.
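A minimal sketch of that event-stream code, adapted from the sparkjs example in the docs (the email, password, and event filter are placeholders to adjust for your own setup):

var spark = require('spark');

// Log in first so the library has an access token for the event stream.
spark.login({ username: 'YOUR_EMAIL', password: 'YOUR_PASSWORD' }, function(err, body) {
  if (err) {
    console.log('Login failed: ' + err);
    return;
  }

  // false = all event names, 'mine' = only events from your own devices.
  // Use a device ID instead of 'mine' to listen to a single core.
  spark.getEventStream(false, 'mine', function(data) {
    console.log('Event: ' + JSON.stringify(data));
  });
});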
So you’ve got two different sets of code: firmware (on your Spark Core) and software (on your computer, for now).
If you can go to dashboard.particle.io and see your events appear every 2 seconds or so, you have the firmware figured out. If not, you'll need to write some code in the WebIDE/Particle Dev.
The software (Node.js code) will go into a new .js file somewhere, which you will run with the node name-of-your-file.js command.
Throw the entirety of your Node.js file on a Gist (minus your username, password, or access token) and link to it here. I'll help you out.
Sure @harrisonhjones. I can see my event data in the dashboard every 2 or 3 seconds. Do I need to put my name.js file inside the Node installation folder, or can I put it anywhere? The last thing I want to know before I start writing the .js file is whether I need to include anything else besides the event stream code and the following:
spark.listDevices(function(err, devices) {
  var device = devices[0];
  console.log('Device name: ' + device.name);
  console.log('- connected?: ' + device.connected);
  console.log('- variables: ' + device.variables);
  console.log('- functions: ' + device.functions);
  console.log('- version: ' + device.version);
  console.log('- requires upgrade?: ' + device.requiresUpgrade);
});

You can ask for a specific device by its id with spark.getDevice:

spark.getDevice('DEVICE_ID', function(err, device) {
  console.log('Device name: ' + device.name);
});
I tried to run it from the Node.js command line after putting in my Core ID along with my username and password, but I got this error:
C:\sparkJS\examples\node>node sparkjs
module.js:340
throw err;
^
Error: Cannot find module 'spark'
at Function.Module._resolveFilename (module.js:338:15)
at Function.Module._load (module.js:280:25)
at Module.require (module.js:364:17)
at require (module.js:380:17)
at Object.<anonymous> (C:\sparkJS\examples\node\sparkjs.js:5:13)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
Hello @harrisonhjones
Yes, I downloaded the whole library as described in docs.particle.io.
I uncommented //var spark = require('../../lib/spark'); and commented out var spark = require('spark');
I still have the following errors:
C:\SparkJS Project>node sparkjs
module.js:340
throw err;
^
Error: Cannot find module '../../lib/spark'
at Function.Module._resolveFilename (module.js:338:15)
at Function.Module._load (module.js:280:25)
at Module.require (module.js:364:17)
at require (module.js:380:17)
at Object.<anonymous> (C:\SparkJS Project\sparkjs.js:4:13)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
As for npm install spark, I ran it in a separate folder and it installed successfully without errors.
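The 'Cannot find module' errors above generally mean Node couldn't find the package relative to the script: require('spark') only resolves if the module was installed into a node_modules folder in (or above) the folder containing the script, so an npm install run in a separate folder won't be picked up. A minimal sketch of a working layout, assuming the script stays in C:\SparkJS Project:

// Install the library inside the project folder itself, e.g.:
//
//   cd "C:\SparkJS Project"
//   npm install spark
//
// npm then creates C:\SparkJS Project\node_modules\spark, which is where
// Node looks when this script calls require('spark').
var spark = require('spark');

console.log('spark module loaded: ' + (typeof spark.login === 'function'));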
Thanks @harrisonhjones for your clear explanation. I got the same data that I'm seeing on the dashboard, but I don't know how I can save just the event data to a text file. Right now, I'm receiving:
Try adding console.log("Data: " + data.data); after console.log("Event: " + JSON.stringify(data)); and see what you get. If you get the data you were expecting, then look into adding something like this:

var fs = require('fs');
fs.appendFile('myAwesomeLog.txt', data.data, function (err) {
  if (err) {
    console.log("There was an error writing the data to the log file");
  }
});
var fs = require('fs');
fs.appendFile('myAwesomeLog.txt', data.data, function (err) {
  if (err) {
    console.log("There was an error writing the data to the log file");
  }
});
I got the following errors:
C:\Spark>node spark-test.js
C:\Spark\spark-test.js:7
fs.appendFile('myAwesomeLog.txt', data.data, function (err) {
^
ReferenceError: data is not defined
at Object.<anonymous> (C:\Spark\spark-test.js:7:35)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:906:3
Your fs.appendFile(...) isn't quite in the right place. I went ahead and forked your code and made edits. You'll need to edit it to match your setup. Let me know how it goes.
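For anyone following along, a minimal sketch of that corrected placement, assuming the login/event-stream setup from earlier in the thread (the credentials and log file name are placeholders):

var fs = require('fs');
var spark = require('spark');

spark.login({ username: 'YOUR_EMAIL', password: 'YOUR_PASSWORD' }, function(err) {
  if (err) {
    console.log('Login failed: ' + err);
    return;
  }

  spark.getEventStream(false, 'mine', function(data) {
    console.log('Event: ' + JSON.stringify(data));

    // data only exists inside this callback, so the file write belongs here,
    // not at the top level of the script.
    fs.appendFile('myAwesomeLog.txt', data.data + '\n', function(err) {
      if (err) {
        console.log('There was an error writing the data to the log file');
      }
    });
  });
});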