Looks like you’re missing #include "application.h"
I guess not... my code compiles fine with Spark.publish("temperature", "25 C"), for example.
But replacing the literal with a String, like:
String data = "25 C";
Spark.publish("temperature", data)
becomes an issue.
Oh, my bad! I think there was a fix for this committed to core-firmware, but it hasn't been rolled out to the build IDE yet; I thought it had been.
So it is just the difference between C strings and Arduino String objects. That change you mentioned is not in the compile-server2 or master branches yet, so you should try this until it is:
Spark.publish("Core_Uptime/Singapore", data.c_str());
which is what the new code does.
Thanks for the help! That pretty much explains why my core accidentally spammed Spark.publish() and I could see the continuous stream of data. I shall leave it to shout out for the night!
Thanks for the cool feature! Ready to help test as more gets rolled out.
@zachary Should Spark.publish() be overloaded for String, and should it enforce source rate limiting?
There's a part of my education that seems to be missing: I can't get this simple example to work. It comes from one of the HTML5 SSE pages, expanded a bit to help with debugging.
In the browser, this file loads fine, and when I click the button I get Clicked... on the console, followed by Opened!. I can also see the Spark event stream in the network activity viewer, and bytes are flowing. But the message and onmessage callback functions are never called. I have tried Chrome and Safari on Mac and Firefox on Win7.
Anybody have any thoughts on this? Thanks in advance!
<!DOCTYPE HTML>
<html>
<body>
  Event: <span id="foo">Events will appear here.</span>
  <br><br>
  <button onclick="start()">Start</button>
  <script type="text/javascript">
    function start() {
      document.getElementById("foo").innerHTML = "OK you clicked it...";
      console.log("Clicked...");
      var eventSource = new EventSource("https://api.spark.io/v1/events/?access_token=<<long hex number here>>");
      eventSource.addEventListener('open', function(e) {
        console.log("Opened!");
      }, false);
      eventSource.addEventListener('error', function(e) {
        console.log("Errored!");
      }, false);
      eventSource.addEventListener('message', function(e) {
        console.log("Enter Message Listener");
        console.log(e.data);
      }, false);
      eventSource.onmessage = function(event) {
        console.log("Enter onmessage");
        console.log(event.data);
        document.getElementById("foo").innerHTML = "Here it comes...";
        document.getElementById("foo").innerHTML += event.data + "<br>";
      };
    }
  </script>
</body>
</html>
Hmm, I wonder if it's expecting a different event type than 'onmessage'? I suspect it might be emitting events of type 'event-name', i.e. whatever follows the "event:" header in the stream.
Haha, why yes! In fact, per our current "slow down and keep things stable" process, both of those features are at the RFC pull request stage.
OK, I got this, and @Dave was exactly right. You need to have a listener for the exact event type you defined in your core, such as "Temperature" or "spark-hq/motion". It would still be nice to learn how to listen to the entire fire-hose of public events (see the sketch after my HTML below).
Here's my HTML. The Temperature events don't come as fast as some others, so be patient.
<!DOCTYPE HTML>
<html>
<body>
  Event: <span id="foo">Events will appear here.</span>
  <br><br>
  <button onclick="start()">Start</button>
  <script type="text/javascript">
    function start() {
      document.getElementById("foo").innerHTML = "OK you clicked it...";
      console.log("Clicked...");
      var eventSource = new EventSource("https://api.spark.io/v1/events/?access_token=<<hex number here>>");
      eventSource.addEventListener('open', function(e) {
        console.log("Opened!");
      }, false);
      eventSource.addEventListener('error', function(e) {
        console.log("Errored!");
      }, false);
      // Listen for the named event the core publishes, not the generic 'message'
      eventSource.addEventListener('Temperature', function(e) {
        console.log("Enter event listener");
        console.log(e.data);
        document.getElementById("foo").innerHTML = "Here it comes...";
        document.getElementById("foo").innerHTML += e.data + "<br>";
      }, false);
    }
  </script>
</body>
</html>
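As an aside on the fire-hose question above: EventSource only fires addEventListener handlers for event names you register in advance, so one way to watch everything without knowing the names is to read the raw SSE text yourself and log each event:/data: line. This is just a rough, untested sketch (replace <<your access token>> as usual):
function watchFirehose() {
  var xhr = new XMLHttpRequest();
  var seen = 0; // how many characters of the stream we have already parsed

  xhr.open("GET", "https://api.spark.io/v1/events/?access_token=<<your access token>>", true);
  xhr.onprogress = function() {
    // Parse only the newly arrived part of the stream
    var chunk = xhr.responseText.substring(seen);
    seen = xhr.responseText.length;
    chunk.split("\n").forEach(function(line) {
      if (line.indexOf("event:") === 0 || line.indexOf("data:") === 0) {
        console.log(line);
      }
    });
  };
  xhr.send();
}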
OK, et voilà... @bko's solution works perfectly. Yesterday I tried to make it work with Node.js, but without any results... well, since it is JavaScript, let's use the same code!!
Node.js/Spark.publish() simple example:
Spark code:
void setup() { }

void loop() {
  Spark.publish("something", "welcomesomething");
  delay(30000);
}
Node.js
First step:
npm install eventsource
and the code:
var EventSource = require('eventsource');

var es = new EventSource('https://api.spark.io/v1/events/?access_token={{YOUR_ACCESS_TOKEN}}');

es.addEventListener('something', function(event) {
  var eventParse = JSON.parse(event.data);
  console.log(eventParse.data);
}, false);
Results:
welcomesomething // every 30 seconds
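A small variation on the code above (just a sketch, I have not run this): you can scope the stream to a single core by adding /devices/<<core hex id>>/ to the URL, which is the same URL shape used in the browser example further down the thread.
var EventSource = require('eventsource');

// Same idea as above, but limited to one core: replace <<core hex id>>
// with your core's ID and {{YOUR_ACCESS_TOKEN}} with your token.
var es = new EventSource('https://api.spark.io/v1/devices/<<core hex id>>/events/?access_token={{YOUR_ACCESS_TOKEN}}');

es.addEventListener('something', function(event) {
  // event.data is a JSON string containing data, published_at and other fields
  var parsed = JSON.parse(event.data);
  console.log(parsed.data + ' at ' + parsed.published_at);
}, false);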
I confirm @Playagood's Node.js example is working great!!
@Dave you said something about:
I had my Spark Core publishing every 2 seconds; after a while I noticed it was not publishing anything and I had to restart the core. I think this means event publishing isn't infallible for standalone devices, which would need frequent manual restarting.
Here’s a screen shot–I have been running the core sending temperature events once a minute since last night with no hiccups. I will likely take the “Connect” button off soon and just make it start automatically. You can see I am using a local file for the HTML. I also tried putting this simple HTML file on Dropbox and opening it on my phone, which worked great too.
Can you share your code?
Here you go. Replace <<core hex id>> and <<your access token>> with your personal hex numbers. You might have to change "Temperature" and other parts too, depending on what event you are sending.
<!DOCTYPE HTML>
<html>
<body>
  <span id="temp"></span><br>
  <span id="tstamp"></span>
  <br><br>
  <button onclick="start()">Connect</button>
  <script type="text/javascript">
    function start() {
      document.getElementById("temp").innerHTML = "Waiting for data...";
      var eventSource = new EventSource("https://api.spark.io/v1/devices/<<core hex id>>/events/?access_token=<<your access token>>");
      eventSource.addEventListener('open', function(e) {
        console.log("Opened!");
      }, false);
      eventSource.addEventListener('error', function(e) {
        console.log("Errored!");
      }, false);
      eventSource.addEventListener('Temperature', function(e) {
        var parsedData = JSON.parse(e.data);
        var tempSpan = document.getElementById("temp");
        var tsSpan = document.getElementById("tstamp");
        tempSpan.innerHTML = "Temperature: " + parsedData.data + "° F";
        tempSpan.style.fontSize = "28px";
        tsSpan.innerHTML = "At timestamp " + parsedData.published_at;
        tsSpan.style.fontSize = "9px";
      }, false);
    }
  </script>
</body>
</html>
Thanks, it is working!!
So…how much longer until callbacks are set up?
@Dave, what’s the latest timing? I’m guessing this is going to be pushed back a couple of weeks, as other issues were pushed up in priority.
Just want to mention that Uptime/Singapore, or anything with a / in the publish name in general, will conflict with the API call, since it's like calling:
https://api.spark.io/v1/events/Uptime/Singapore/?access_token=""
I'm not sure how this will be fixed, but yeah.
I am not sure I understand–the hierarchical names like “Uptime/Singapore” are part of the design and the fact that they are part of the URL is a good thing. @Dave is publishing “Temperature/Deck” and “Temperature/Basement” for instance, so you can get either one or both (using curl with just Temperature).
If you are worried about a specific core’s events, you can add devices to the URL and even make the event privately published.
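To make the prefix idea concrete, here is a rough sketch (untested, and assuming the prefix filter behaves the same through EventSource as it does with curl): point the stream at the Temperature prefix and register one listener per full event name.
function watchTemperatures() {
  // The path segment after /events/ acts as a name prefix, so this should
  // match both Temperature/Deck and Temperature/Basement.
  var eventSource = new EventSource(
    "https://api.spark.io/v1/events/Temperature?access_token=<<your access token>>");

  // The SSE event type is still the full published name, so register one
  // listener per name you care about.
  ["Temperature/Deck", "Temperature/Basement"].forEach(function(name) {
    eventSource.addEventListener(name, function(e) {
      var parsed = JSON.parse(e.data);
      console.log(name + ": " + parsed.data + " at " + parsed.published_at);
    }, false);
  });
}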