Publish events from the command line?

Is it possible to publish events from the command line? I’ve been searching but can’t find the magic formula. I thought maybe I could POST to something like https://api.spark.io/v1/events/myevent, but I’m not sure what payload to send.

I want to have a Core listening for events, and be able to generate them from various non-Core sources. I suppose I could use a Spark variable or function, but pubsub just felt right…

I don’t think this is available as of now :wink:

IFTTT support is in beta; that can be used to trigger “actions” on a Core from non-Core sources.

Yeah, that’s actually what I wound up doing for now. But since I have less control over the timing (practically none, really), and no error feedback, it isn’t ideal for the testing I’m trying to do.

I suppose I should just switch to a cloud function call for now, then flesh out pubsub later.

In the future, when the HAL (Hardware Abstraction Layer) is all set up and working, you should be able to create your own virtual devices that act just like Cores or Photons, and thus can publish events, have variables, subscribe to events, etc.

Good point @harrisonhjones! I’d contemplated suggesting that, but we are some way away from offering this as a solution. The code is working; we have a virtual Core running, but the provisioning of device IDs/keys etc. is not yet worked out.

Yes, that’s what I thought the current situation was. Is any kind of public/private beta of this planned? TBH I’d happily pay for a device ID/key pair :smile:

So, was the pubsub model intentionally designed with the idea that only a Spark Core can publish events? I’m just wondering what limitation prevents adding publish to spark-cli, for example.

Hey @dougal! Good question!

I wasn’t involved with the design, so I can’t speak to that specifically, but I don’t believe the intent was that only Cores can publish events; it’s more that the solution hasn’t grown to encompass non-Core devices yet.

I’m interested to know what kind of non-core devices will be publishing events?

In the meantime, you could expose a cloud function, “publish”, that takes the name of an event to publish. Register it in setup() with Spark.function("publish", publish); and define it as:

int publish(String eventName) {
    // Cloud function handlers must take a String and return an int
    Spark.publish(eventName);
    return 0;
}

This lets you call one Core to publish the event, and have multiple Cores listening.
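On the listening side, each subscribing Core registers a handler with Spark.subscribe. A minimal sketch along these lines (the event name “myevent” and the handler are my own illustrations, not from this thread) would react to the published event:

// Illustrative listener: turns on the D7 on-board LED when "myevent" arrives
void eventHandler(const char *event, const char *data) {
    digitalWrite(D7, HIGH);
}

void setup() {
    pinMode(D7, OUTPUT);
    Spark.subscribe("myevent", eventHandler);
}

void loop() {
}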

At the moment, probably my laptop. :slight_smile: But it could be a web service, or any kind of internet-connected device. Maybe some other non-Core Arduino.

Just as an off-the-top-of-my-head possible use-case, I might have an iBeacon (or similar BTLE device) detect when I arrive home, and a gateway device (like my computer) could publish an event to the cloud. Then one or more Cores (or Photons, one day) could see those events, turn on lights, music, etc.

Again, I could probably do the same thing with cloud functions, but semantically, pubsub just “feels right”. :smile:

Mainly, right now, it would just be convenient to be able to publish via spark-cli.

Hey All!

Actually, you can publish messages from the CLI with:

spark publish event_name event_contents

I have this particular feature protected with a permission right now because it’s not rate-limited in the same way as publishes from the Cores. Rate limiting API requests to protect against a flood is harder than doing it on the Core, and once I’ve added that I’d like to open the feature up so any user can publish events into the cloud. :slight_smile:
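For example (the event name and data here are made up; this assumes you’ve authenticated with spark login, and that your CLI version includes the subscribe command for watching the event stream):

spark publish myevent "front door opened"

# in another terminal, watch for the event
spark subscribe myevent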

Thanks,
David

Looking forward to the eventual general availability of publishing from other sources.

At the moment, I’m working on combining the MessageTorch with the Holiday Cheer Lights, and tweaking the light patterns. :slight_smile:
