I am new to HTTP REST and webhooks so forgive me for my ignorance…
I found a tutorial online that will allow me to use Postman to send data to a table on SAP HANA Cloud Platform (similar concept to the Microsoft Azure platform). The tutorial works just fine with static, fictitious data. I would like to use data from my Particle Core instead.
Can I send data to the SAP Platform with a webhook or an HTTP POST? If so, where does that logic live? Do I put this “POST” logic into the Web IDE and then flash my device, so that it communicates with the SAP Platform directly? Or is this logic created in a webhook that lives on the Particle Cloud, meaning the Core sends data to the Particle Cloud, which then sends it to SAP?
I read the webhook tutorial but I ended up more confused…
Webhooks are the way to go: basically, it's the Particle Cloud doing the HTTP POST. The Core/Photon publishes an event, the event triggers the webhook, and the webhook performs the HTTP POST. It's all built in, which reduces the amount of code significantly and adds security.
Some logic will live on the Core/Photon, in that the sensor data will require some formatting before it is published as event data.
The webhook needs some logic too, so it knows how to process the data in the event so that it matches what HANA requires.
have you got a link to the first tutorial you mentioned?
1.) CLI installed
2.) A .json file with all the information for your webhook
3.) For POST, you will need something like requestType: 'POST' in the .json file as shown in the example.
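To make the pieces above concrete, a minimal webhook definition file might look like the following (the event name and URL are placeholders, not real values from this thread):

```json
{
    "event": "hana_",
    "url": "https://example.hana.ondemand.com/table",
    "requestType": "POST",
    "mydevices": true
}
```

It would be registered with something like `particle webhook create hook.json` from the CLI.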
Thank you for the help. I have the CLI installed from the node.js installation and the .json file can be created in notepad (I assume?). I saw an example listed for librato on the Webhook guide webpage. The example uses a username and password. However, the tutorial I found for Postman uses “bearer” for authorization. Would I replace the username and password with “bearer”? I saw a small example of “bearer” used in the header section but it did not provide an example within an entire POST template. Additionally, the tutorial specifies a “Content-Type”. Where would this fit in?
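For the bearer-token question: rather than replacing the username/password fields, the token and the Content-Type would typically go in a custom `headers` block of the webhook definition. A sketch, with the URL and token as placeholders:

```json
{
    "event": "hana_",
    "url": "https://example.hana.ondemand.com/table",
    "requestType": "POST",
    "headers": {
        "Authorization": "Bearer YOUR_TOKEN_HERE",
        "Content-Type": "application/json"
    },
    "mydevices": true
}
```

The `username`/`password` fields in the Librato example drive HTTP Basic auth; a bearer token is just a different `Authorization` header value, so the two are alternatives, not things to mix.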
I removed the Postman link you shared since it contains all the credentials needed to access your account (that said, I was able to test and get it working with the credentials).
Thank you so much for your help!
This is great! I will address the timestamp challenge shortly. First, I will continue to play with the webhook to better understand how this all works. Much appreciated!!
@kennethlimcp
I am having some trouble putting in a variable from the Core. In the Web IDE I used Spark.publish("hana_") and it successfully called the webhook. Now I would like to include data in the Spark.publish() function. I looked at some other posts, and the data is sent as a string. This example is from another post:
I am assuming the sprintf is creating a string called publishString, with the hour, min, and sec variables replacing the three %u's. The string is then sent to the "Uptime" webhook. I do not understand what the JSON file looks like to receive a string with three variables. In the JSON file you shared with me, the values were hard-coded. If I wanted to send a temperature value of 75 from the Core, I assume the IDE code would be as follows:
Spark.publish("hana_", "{\"Temperature\":75}");
What would the JSON file look like for receiving the temperature value with the webhook call? Currently the JSON file has a numeric value and not a placeholder for a variable. i.e. "Temperature": "80.2"
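Assuming the webhook supports mustache-style templates (current Particle webhooks do), the hard-coded value can be replaced with a `{{...}}` placeholder that is filled in from the JSON published by the device. A sketch, with the URL as a placeholder:

```json
{
    "event": "hana_",
    "url": "https://example.hana.ondemand.com/table",
    "requestType": "POST",
    "json": {
        "Temperature": "{{Temperature}}"
    },
    "mydevices": true
}
```

Here `{{Temperature}}` is resolved against the event data, so publishing `{"Temperature":75}` from the Core would POST `{"Temperature": "75"}` to the target. The raw event string is also available as `{{SPARK_EVENT_VALUE}}` if no field extraction is needed.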
I also have a webhook with the material ready to talk to another cloud, "Artik.Cloud", and the webhook works. As of now, I have fixed values as placeholders in my webhook, but it is time to move to variables published by the Particle Core, and I have the same questions you asked above.
In contrast with your case, I am publishing an array of integers, so my string can grow in length.