Using SparkJS in the browser

I was never able to get the Spark library's event streaming to work in browser JavaScript, but I do have it working without the Spark library. Here's the code I used. It calls my function processPublishNotificationGroup() when events arrive. With any non-zero TTL on your messages, the Spark Cloud may return several Publish messages from your core in a single notification, so I have my Spark Core put a message number in each Publish so that my code can keep track of events it has already handled (a sketch of that bookkeeping follows the code below).

startMonitoring = function () {

    logAdd("entered startMonitoring");

    var eventMonitor = new XMLHttpRequest();

    eventMonitor.onreadystatechange = function () {

        // responseText accumulates the whole stream so far, so each
        // callback may include events that were already processed.
        var data = eventMonitor.responseText;

        // Ignore the initial short response before any events arrive.
        if (data.length > 5) {
            processPublishNotificationGroup(data);
        }
    };

    var accessToken = window.spark.accessToken;

    // Open a long-lived asynchronous GET against the event stream.
    eventMonitor.open("GET",
        "https://api.spark.io/v1/devices/events/SISEvent?access_token=" + accessToken,
        true);

    eventMonitor.send();
};
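
To illustrate the message-number bookkeeping, here is a minimal sketch of what processPublishNotificationGroup() could look like. The msgNum field, the handleEvent() helper, and the exact payload layout are assumptions for illustration; adapt the parsing to whatever your core actually publishes.

    // Hypothetical sketch: re-scan the accumulated SSE text and skip
    // events whose msgNum we have already handled. Assumes the core
    // publishes JSON like {"msgNum":42, ...} in each Publish.
    var lastMsgNum = -1;

    function processPublishNotificationGroup(text) {
        var lines = text.split("\n");
        for (var i = 0; i < lines.length; i++) {
            var line = lines[i].trim();
            if (line.indexOf("data:") !== 0) {
                continue; // only "data:" lines carry an event payload
            }
            try {
                // The outer JSON is the Spark Cloud envelope; its "data"
                // field holds whatever string the core published.
                var envelope = JSON.parse(line.replace("data:", ""));
                var payload = JSON.parse(envelope.data);
                if (payload.msgNum <= lastMsgNum) {
                    continue; // already handled in an earlier callback
                }
                lastMsgNum = payload.msgNum;
                handleEvent(payload); // application-specific handler
            } catch (e) {
                // Incomplete chunk; it will reappear whole in a later callback.
            }
        }
    }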

Thanks for sharing!
I’ve been using this in my MEAN application to log events to my MongoDB database:

// These modules are assumed to be installed; mongoose is assumed to be
// connected already, and SselogSchema is defined elsewhere in my app.
var request = require('request');
var extend = require('extend');
var mongoose = require('mongoose');

// Register the model once, rather than on every incoming event.
var SSE = mongoose.model('Sselog', SselogSchema);

var requestObj = request({
    uri: 'https://api.spark.io/v1/devices/events?access_token=<<<ACCESSTOKEN HERE>>>',
    method: 'GET'
});

var processItem = function(arr) {
    var obj = {};
    for (var i = 0; i < arr.length; i++) {
        var line = arr[i];

        if (line.indexOf('event:') === 0) {
            obj.name = line.replace('event:', '').trim();
        }
        else if (line.indexOf('data:') === 0) {
            // The data: line carries the Cloud's JSON envelope
            // (data, ttl, published_at, coreid).
            line = line.replace('data:', '');
            obj = extend(obj, JSON.parse(line));
        }
    }

    // ======== saving to database, irrelevant for SSE =========
    var objData = {
        name: obj.name,
        data: JSON.parse(obj.data), // assumes the core publishes JSON payloads
        publishtime: obj.published_at,
        coreid: obj.coreid
    };

    var DataToBeSaved = new SSE(objData);
    DataToBeSaved.save();
    // ======== End of database save ===========================

    console.log(objData.data);
};

var chunks = [];
var appendToQueue = function(arr) {
    for (var i = 0; i < arr.length; i++) {
        var line = (arr[i] || '').trim();
        if (line === '') {
            continue;
        }
        chunks.push(line);
        // A data: line completes one event, so process and reset the queue.
        if (line.indexOf('data:') === 0) {
            processItem(chunks);
            chunks = [];
        }
    }
};
 
var onData = function(data) {
    var chunk = data.toString();
    appendToQueue(chunk.split('\n'));
};

requestObj.on('data', onData);
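
For reference, a single event frame on the stream looks roughly like this (the values below are made up), which is why appendToQueue() treats each data: line as the end of one event:

    event: SISEvent
    data: {"data":"23","ttl":"60","published_at":"2015-03-01T12:00:00.000Z","coreid":"<<<COREID HERE>>>"}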

I blatantly stole this from the [CLI][1], so all credits to the guys at Spark again :wink: I was just hoping that the library would get fixed, since that would look a whole lot neater :blush:
[1]: https://github.com/spark/spark-cli/blob/master/js/commands/SubscribeCommand.js


Hey All,

Just a heads up that a new version of SparkJS went out yesterday that fixes the getEventStream callback error.

Thanks!
David


OK, fixed getEventStream for real this time in browsers; we should release a new version / build of this soon.
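
For anyone who wants to try it once the new build is out, subscribing from the browser should look something like this, if I read the SparkJS README right (the event name here is just an example):

    // After logging in with spark.login(), subscribe to events named
    // 'SISEvent' published by your own devices:
    spark.getEventStream('SISEvent', 'mine', function(data) {
        console.log("Event: " + JSON.stringify(data));
    });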

Thanks!
David


Hello,

I am new to the community (from France). It seems I ran into the same kind of problem as the one discussed above: calling some functions of the Spark JavaScript API from inside a browser does not seem to work.

I managed to log in to the Spark Cloud from a web page thanks to the login code snippet.

I can call listDevices() but I cannot call callFunction(). The JavaScript console gives this:

    https://api.spark.io/v1/devices/MY ACCESS TOKEN/changeLeds 400 (Bad Request)

The core blinks red like it’s going mad (I found out in the documentation that it’s an SOS message :disappointed_relieved:), and then the error message sent back is a timeout:

    Error: Timed out. {stack: “Error: Timed out.↵ at Spark.normalizeErr (http:….jsdelivr.net/sparkjs/0.4.2/spark.min.js:1:29864)”, message: “Timed out.”}
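
In case it helps, this is roughly the call I am making (the device ID and the argument below are placeholders for my real values):

    spark.callFunction('MY_DEVICE_ID', 'changeLeds', 'on', function(err, data) {
        if (err) {
            console.log('An error occurred:', err);
        } else {
            console.log('Function called successfully:', data);
        }
    });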

Can the callFunction() API be used only on the server side?

Thanks.

Emmanuel

PS : I am really impressed with the products, the community… the creativity around all this. Congratulations to the Spark team and all the contributors.

Hi again,
I used the serial console on the core to check callFunction() on the core side.
It looks like the call is performed OK; the issue comes from my handling of the parameters in the function on the core side.
So the issue is on my side, not in the Spark JS library.
Emmanuel
