Registering A Callback

The docs say callback functionality will be ready sometime in March.
Is that still the plan, or should we just rely on making TCP connections on the device?

Yep still the plan, it’s coming very soon. Already developed, in testing now. But in the meantime, try out Server-Sent Events!
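
In case it helps to picture it, here's a rough sketch of the firmware side of Server-Sent Events (the event name, payload, and interval below are made-up examples, not anything official): the Core publishes an event, and anything subscribed to the event stream over HTTP sees it.

    // Rough sketch: publish an event that SSE subscribers can pick up
    void loop() {
        Spark.publish("button-pressed", "order placed"); // made-up event name and data
        delay(60000);                                    // example interval; don't flood the cloud
    }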

Thanks for the fast response. I’m using Parse for my backend, and it looks like they don’t support Server-Sent Events. Furthermore, I’m working on a project similar to your pizza ordering button, so the callback might be more practical, unless you have a better suggestion?

@zach I got really curious, read the blog a few times, and wondered about this…

It seems to me that the name for Server-Sent Events can be almost anything a user wants.

What if two users happen to be using the same name? e.g. ‘me/temperature’

Sounds like it’s gonna be a rare occurrence, but I got really curious about it :smiley:

Also, let’s say my event TTL is 60s… and it expired before the event was triggered again… and someone else published with the same ‘name’. Won’t I be getting data from that publish? :smile:

Sorry for the silly question, cos I guess people would probably say… ‘use a unique and special name for your event’

Just wondering :stuck_out_tongue:

Good question!

It all depends on which endpoint your program is listening to:

e.g.:

api.spark.io/v1/events (the public firehose)
api.spark.io/v1/devices/events (just events from your own cores)
api.spark.io/v1/devices/DEVICE_ID/events (events from one particular core)

etc.

So even if someone else is using the same event name, you can just filter to your cores, or a particular core. Also, any state persisted from a message you publish is stored only along with your core information, and is only accessible to you.

Thanks!
David

That sounds interesting!

So… Our events are probably all going under api.spark.io/v1/devices/events whether public or private?

Or does anyone’s public event go to ‘All Public Events’?

Just the public events appear in the “All Public Events” firehose; if your event is flagged private, it will only show up for you in those feeds.
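
On the firmware side, my understanding is that you mark an event private when you publish it; a rough sketch might look like this (treat the exact Spark.publish() signature with the TTL and PRIVATE arguments as an assumption to double-check against the docs, and the event name and reading are just examples):

    // Sketch of publishing a private event (assumed signature: name, data, ttl, flag)
    void loop() {
        Spark.publish("me/temperature", "3 F", 60, PRIVATE); // 60s TTL; PRIVATE keeps it out of the public firehose
        delay(60000);                                        // example interval so we don't publish too often
    }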

Got it!

Isn’t that going to cause some event name clashes in the public stream when the :spark: core community gets incredibly large, which is going to happen? :smiley:

Or… let’s say 100 people use a public ‘temperature’ event… can I filter mine with api.spark.io/v1/devices/events/Temperature?

Hi @Dave

First off, this is great and works wonderfully!

I do have a question, though: when I run a curl command to listen to the events, I get the event data as expected, but I also get occasional carriage returns between event data, sorta like a keep-alive is running. This is expected, right?

And to whoever is publishing “Temperature/Deck”: wow, 3 degrees F is cold even by Boston standards!

@kennethlimcp - Totally, api.spark.io/v1/devices/events/Temperature would filter to just events from your cores starting with "Temperature".

@bko - Thanks! That would be my deck; it is cold! Also, yes, the carriage returns are an intentional keepalive; they should come in every 9 seconds when there is no other traffic.

Ah, I’m still really curious, and having some hands-on experience makes me pick things up faster :smiley:

So @bko, could you share how you do the curl command and how you know about ‘Temperature/Deck’ events? :smiley:

I have a Python script that uses the ‘requests’ module to perform HTTP requests, from when I tried to help test the CFOD issue.

Also, I modified it slightly to check my access_token, delete it, etc.

That’s about the most I’ve ever done with Python and an API, so I definitely need you veterans to start me off with the new feature! :wink:

I’ll put up a Tutorial in return. haha!

Hi @kennethlimcp,

One thing that’s nice about these endpoints is that they use GET requests, so you can just open them in a browser window. If you’re in Chrome, you have to “view source” on the page to see what’s coming across:

view-source:https://api.spark.io/v1/events/?access_token=your_access_token

@Dave THAT’S REALLY FUN :smiley:

Seriously, I’ve always loved microcontrollers, and the :spark: made it more fun!

What happens if a user accidentally calls Spark.publish() too frequently?

Hi @kennethlimcp

I just did what David suggested and used:

curl -H "Authorization: Bearer <<hex number here>>" https://api.spark.io/v1/events/Temperature

You need to replace the entire <<hex number here>> part with your Auth Token.

Interestingly, this finds events whose names begin with Temperature, like Temperature/Deck and Temperature/Basement (go in the basement, David, it’s much warmer! :smile:), but it does not find spark-hq/Temperature. I think that is a good way for this to work with hierarchical levels.
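
To make the hierarchy idea concrete, on the firmware side you could publish with a shared prefix, something like this (the readings and interval below are just illustrative):

    // Events sharing the "Temperature/" prefix can all be caught
    // by the Temperature prefix filter mentioned above
    void loop() {
        Spark.publish("Temperature/Deck", "3 F");      // matches the Temperature prefix
        Spark.publish("Temperature/Basement", "55 F"); // also matches the prefix
        delay(60000);                                  // illustrative interval
    }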

Then there will be a lot of events! :slight_smile:

:smiley: Is there something wrong here which causes a compile issue?

String data = 'ms';

void loop() {
    Spark.publish("Core_Uptime/Singapore", data);
}

Yup! In C++ / Arduino coding land, single quotes ('a') denote a single character, and double quotes ("abc") denote a string. So you want to change 'ms' to "ms".
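
For reference, a minimal corrected version of that snippet might look like this (the added delay is just a suggestion so the loop doesn’t publish on every pass):

    String data = "ms";                               // double quotes: a string, not a char

    void loop() {
        Spark.publish("Core_Uptime/Singapore", data);
        delay(1000);                                  // suggested pause between publishes
    }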

I would recommend starting out by trying not to publish more than, say, 5-10 events a second; if you send too many too quickly, you may knock your core offline :slight_smile: (I suspect it’s possible to flood yourself offline currently)
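
If it helps, here’s a rough sketch of one way to stay under that kind of rate without blocking the loop, using millis() (the one-second interval and event name are just placeholders, not an official limit):

    // Throttle publishes: at most one every PUBLISH_INTERVAL_MS milliseconds
    unsigned long lastPublish = 0;
    const unsigned long PUBLISH_INTERVAL_MS = 1000;   // placeholder interval, tune to taste

    void loop() {
        unsigned long now = millis();
        if (now - lastPublish >= PUBLISH_INTERVAL_MS) {
            lastPublish = now;
            Spark.publish("Core_Uptime/Singapore", String(now) + "ms"); // example payload: uptime in ms
        }
    }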

Hmm… Seems like a problem still.

Is this related to this issue?