Sending POST requests from the Core

I am very new to Spark, and before I commit to using it in my project, I want to know whether what I have in mind is possible.

Here is what I am planning to do:

  1. Monitor a physical variable (say, the water level of my water tank) every minute. I want to log this information via a streaming service:
    This service allows me to post my variables to their cloud-based system.

  2. In case the monitored variable goes beyond a certain value, I want an email or tweet sent out to me as a warning.
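In pseudo-terms, the check in step 2 is just a threshold comparison. A quick sketch in JavaScript (the function name and the 80 cm threshold are made up for illustration, not part of any Spark API):

```javascript
// Illustrative threshold check for step 2; the names and the
// example threshold are assumptions, not from any real API.
var WARN_LEVEL_CM = 80;

function shouldWarn(levelCm) {
  // Trigger a warning when the tank level exceeds the threshold.
  return levelCm > WARN_LEVEL_CM;
}

console.log(shouldWarn(95)); // prints "true"
```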

Both 1 and 2 are easily achieved in Python if I use a Raspberry Pi hooked up to my sensor. But I want to achieve everything on the Spark itself. This keeps my project small and web-enabled from the get-go.

The question is: is this doable?

I read the API documentation, so I know I can GET and POST URLs to my Core. But how do I make the Core itself send a POST request to the streaming service's web server?

And how do I get the core to send out a tweet? Are there libraries that I can use for this purpose?

Thanks for your patience in reading my post.
Have a good day.

@seemanta, since the Core cannot do HTTPS directly, you can now do GETs/POSTs using webhooks. These let you use Spark.subscribe()/Spark.publish() to send data from the Core to any website and get data back. A neat feature recently added is the ability to have the Cloud do a JSON parse on the return data, so you get only what you want from the payload, removing the need to do JSON parsing on the Core.
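To sketch the idea (the event name, URL, and field values here are made up; check the webhook docs for the exact schema), a webhook definition names the published event it listens for, where to POST, and optionally a mustache-style response template the Cloud uses to extract just the field you want:

```javascript
// Illustrative webhook definition; the field names follow the
// webhook docs, but the event name and URL are assumptions.
var hook = {
  event: "tank/level",            // fires when the Core publishes this event
  url: "https://example.com/log", // where the Cloud sends the POST
  requestType: "POST",
  // The Cloud can parse the JSON response and hand the Core only
  // the extracted field, via a mustache-style template:
  responseTemplate: "{{level}}"
};

// Toy illustration of the mustache-style extraction the Cloud
// performs on the response payload (not the real implementation):
function extract(template, payload) {
  return template.replace(/\{\{(\w+)\}\}/g, function (match, key) {
    return payload[key];
  });
}

console.log(extract(hook.responseTemplate, { level: 42 })); // prints "42"
```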


Can webhooks be used with the local cloud?

I’m looking at using the Photon for monitoring some things, but security is an issue for me. So I need to be able to push data from the Photon in real time, but I can’t use the public Cloud.


@aaltieri, the local cloud does not yet support webhooks. However, webhooks and all related data are secured like all other public Cloud transactions.

Is there an ETA? I can’t have these devices connected to a public network at all.


@aaltieri, perhaps @Dave can clarify. If you cannot connect to a public network, you may want to consider using TCP only. If you search the forum, you will find examples of this.

Sorry, this might seem like a very stupid question, but what is a local cloud?

The “local cloud” is a version of the full Spark Cloud you can run on a local computer such as a Raspberry Pi for example. Though it does not support all the functions of the full Cloud, it can be used where internet connectivity to the Spark Cloud is not available or not desired. :smile:

This is very helpful, thanks!



Jeez, I am such an idiot. Thanks for answering my questions patiently :slight_smile:
I appreciate it.

Hi @aaltieri,

If you need this to be entirely local only and want to use the local cloud, you can always add something like webhooks behavior to the local cloud quickly. I’m hoping to sketch out a better interface for extending the local cloud, but if you installed it / got your core connected to that locally, you could do something like…

// etc
if (lowername === "my/event") {
  var request = require('request');
  request({ url: "" /* ... */ }, function (err, response, body) {
    // handle the response here
  });
}


This is awesome! It will hopefully help me greatly. I’ll find out tonight, and will post in Cloud JSON Parsing or here.

Has there been any update on webhook functionality being added to local cloud (spark-server)?

Haven’t found anything regarding this so here comes the mother of a bump:

Webhooks in the spark-server local cloud: Boy wouldn’t it be nice!

I will appreciatively accept trouts of wisdom slapped in my face about what I couldn’t find where.

Have you found any more information on spark-server local cloud webhooks?

You may want to look into @Brewskey’s heavily enhanced fork of the local cloud server.

I really just need to do a PR to the main Particle repo so you don’t have to keep referencing our fork :slight_smile: