Registering A Callback

The docs say callback functionality will be ready sometime in March.
Is that still the plan, or should we just rely on making TCP connections from the device?

Yep, still the plan; it's coming very soon. Already developed, in testing now. But in the meantime, try out Server-Sent Events!


Thanks for the fast response. I'm using Parse for my backend, and it looks like they don't support Server-Sent Events. Furthermore, I'm working on a project similar to your pizza-ordering button, so the callback might be more practical, unless you have a better suggestion?

@zach I got really curious, read the blog a few times, and wondered about this…

It seems to me that the event name for Server-Sent Events can be almost anything a user wants.

What if two users happen to be using the same name, e.g. 'me/temperature'?

Sounds like it's gonna be a rare occasion, but I got really curious about it :smiley:

Also, let's say my event TTL is 60s… and it expired before the event was triggered again… and someone published with the same 'name'. Won't I be getting data from that publish? :smile:

Sorry for the silly question, because I guess people would probably say… 'use a unique and special name for your event'.

Just wondering :stuck_out_tongue:

Good question!

It all depends on which endpoint your program is listening to:

e.g.:

api.spark.io/v1/events (all public events)
api.spark.io/v1/devices/events (events from your cores only)
api.spark.io/v1/devices/{DEVICE_ID}/events (events from one particular core)

etc.

So even if someone else is using the same event name, you can just filter to your cores, or a particular core. Also, any state persisted from a message you publish is stored only along with your core information, and is only accessible to you.
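As a rough illustration, here's a small Python helper sketching how those endpoint URLs compose. The base URL and path shapes are taken from the endpoints quoted in this thread; the `event_stream_url` helper itself is hypothetical, not part of any official client, and the API may have changed since.

```python
# Hypothetical helper: builds the SSE endpoint URLs discussed in this thread.
# The path shapes are assumptions based on the examples quoted here.

BASE = "https://api.spark.io/v1"

def event_stream_url(device_id=None, event_prefix=None, mine_only=False):
    """Build an event-stream endpoint URL.

    - no args                    -> all public events
    - mine_only=True             -> events from your cores only
    - device_id="..."            -> events from one particular core
    - event_prefix="Temperature" -> filter by event-name prefix
    """
    if device_id is not None:
        path = f"{BASE}/devices/{device_id}/events"
    elif mine_only:
        path = f"{BASE}/devices/events"
    else:
        path = f"{BASE}/events"
    if event_prefix:
        path += f"/{event_prefix}"
    return path
```

You would still need to authenticate, e.g. with an `access_token` query parameter or a Bearer header, as shown elsewhere in this thread.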

Thanks!
David

That sounds interesting!

So… our events are probably all going under api.spark.io/v1/devices/events, whether public or private?

Or does anyone's public event go to 'All Public Events'?

Just the public events appear in the "All Public Events" firehose; if your event is flagged private, it will only show up for you in those feeds.

got it!

Isn't that going to cause some event-name clashes in the public feed when the :spark: core community gets incredibly large, which is going to happen? :smiley:

Or… let's say 100 people use a 'temperature' public event… can I filter mine with api.spark.io/v1/devices/events/Temperature?


Hi @Dave

First off, this is great and works wonderfully!

I do have a question though: when I run a curl command to listen to the events, I get the event data as expected, but I also get occasional carriage returns between event data, sort of like a keep-alive is running. This is expected, right?

And to whoever is publishing "Temperature/Deck": wow, 3 degrees F is cold even by Boston standards!

@kennethlimcp - totally, api.spark.io/v1/devices/events/Temperature would filter to just events from your cores starting with "Temperature".

@bko - Thanks! That would be my deck; it is cold! Also, yes, the carriage returns are an intentional keepalive; they should come in every 9 seconds when there is no other traffic.
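To illustrate how a client can skip those keepalive lines, here's a minimal Python sketch of an SSE line parser. The `parse_sse` helper and the sample payload are hypothetical, assuming the standard `event:` / `data:` line format; it's not the official client code.

```python
def parse_sse(lines):
    """Parse a stream of Server-Sent Events lines into (event_name, data) pairs.

    A blank line terminates an event; *extra* blank lines (the keepalive
    carriage returns mentioned above, roughly every 9 s of silence) are
    simply ignored, because there is no pending event data to flush.
    """
    event_name, data_parts = None, []
    for raw in lines:
        line = raw.rstrip("\r\n")
        if line == "":
            if data_parts:                      # flush a completed event
                yield event_name, "\n".join(data_parts)
            event_name, data_parts = None, []   # bare keepalives land here
        elif line.startswith("event:"):
            event_name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_parts.append(line[len("data:"):].strip())

# Hypothetical sample stream: one event followed by two keepalive lines.
sample = [
    "event: Temperature/Deck",
    'data: {"data":"3","ttl":"60"}',
    "",
    "",   # keepalive
    "",   # keepalive
]
events = list(parse_sse(sample))
```

In practice you'd feed it lines from a streaming HTTP response (e.g. `requests.get(url, stream=True).iter_lines(decode_unicode=True)`).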

Ah, I'm still really curious, and having some hands-on makes me pick things up faster :smiley:

So @bko, could you share how you do the curl command, and how do you know about 'Temperature/Deck' events? :smiley:

I have a Python script that uses the 'requests' module to perform HTTP requests, which I wrote when I tried to help test the CFOD issue.

Also, I modified it slightly to check my access_token, delete it, and so on.

That's about the most I've ever done with Python and an API, so I definitely need you veterans to start me off with this new feature! :wink:

I'll put up a tutorial in return. haha!

Hi @kennethlimcp,

One thing that's nice about these endpoints is that they use GET requests, so you can just open them in a browser window. If you're in Chrome, you have to "view source" on the page to see what's coming across:

view-source:https://api.spark.io/v1/events/?access_token=your_access_token

@Dave THAT'S REALLY FUN :smiley:

Seriously, I always loved microcontrollers, and the :spark: made it even more fun!


What happens if a user accidentally calls Spark.publish() too frequently?

Hi @kennethlimcp

I just did what David suggested and used:

curl -H "Authorization: Bearer <<hex number here>>" https://api.spark.io/v1/events/Temperature

You need to replace the entire <<hex number here>> part with your auth token.

Interestingly, this finds events whose names begin with Temperature, like Temperature/Deck and Temperature/Basement (go in the basement, David, it's much warmer! :smile:), but it does not find spark-hq/Temperature. I think that is a good way for this to work with hierarchical levels.
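The prefix behavior observed here can be sketched as a one-line filter in Python. The `matches_filter` helper is hypothetical; it just mirrors the matching seen above, not the server's actual implementation.

```python
def matches_filter(event_name, prefix):
    """Prefix filter as observed in this thread: subscribing to 'Temperature'
    matches 'Temperature', 'Temperature/Deck', and 'Temperature/Basement',
    but not 'spark-hq/Temperature' (the prefix must match from the start).
    """
    return event_name.startswith(prefix)
```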


Then there will be a lot of events! :slight_smile:

:smiley: Is there something wrong here that causes a compile issue?

String data = 'ms';

void loop() {
    Spark.publish("Core_Uptime/Singapore", data);
}

Yup! In C++ / Arduino coding land, single quotes ('a') denote a single character, and double quotes ("abc") denote a string. So you want to change 'ms' to "ms".

I would recommend starting out trying not to publish more than, say, 5-10 events a second; if you send too many too quickly, you may knock your core offline :slight_smile: (I suspect it's currently possible to flood yourself offline)
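If you want a client-side guard against publishing too fast, a simple token bucket does the trick. This `PublishThrottle` class is a hypothetical Python sketch (the 5-10 events/second figure is the suggestion above; the limiter is not part of the Spark API), with an injectable clock so it can be tested without waiting:

```python
import time

class PublishThrottle:
    """Token bucket: allow at most `rate` publishes per second on average.

    Hypothetical client-side guard to avoid flooding yourself offline;
    call allow() before each publish and skip the publish if it returns False.
    """
    def __init__(self, rate=5, now=time.monotonic):
        self.rate = float(rate)      # refill rate and burst capacity
        self.tokens = float(rate)    # start with a full bucket
        self.now = now               # injectable clock for testing
        self.last = self.now()

    def allow(self):
        t = self.now()
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.rate, self.tokens + (t - self.last) * self.rate)
        self.last = t
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

Usage would look like `if throttle.allow(): publish_event(...)`, dropping or delaying events that exceed the budget.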

Hmm… seems like a problem still.

Is this related to this issue?