Hey everyone. I am trying to decide what I am going to use for this project that I am working on. Spark.publish or http post. I was just wondering what the major differences are and what are the advantages and disadvantages of each. Help is greatly appreciated!
This kind of question might be too open-ended for people to want to answer.
If you provide some more detail of what you want, you might get the answers that help you most.
One major difference: the one needs a cloud (Particle or private/local) and the other doesn't.
Ok, well I was hoping this would help people in the future who want to look this up too. But here we go. I am currently using Spark.publish to send 19 different strings of data. It publishes a different string about every second (so it sends the first string one second, the second string the next second, the third string the third second, and so on, then repeats after publishing the 19th string). I was wondering if it would be better to use http post. Maybe I could send all the information in one go if I did that? I don't know.
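For reference, the round-robin publishing described above might look something like the sketch below. The event name and payloads are made-up placeholders, and Spark.publish is replaced with a stand-in so the cycling logic can run as plain C++ outside the firmware:

```cpp
#include <string>
#include <vector>

// Stand-in for Spark.publish(); on the device this would go to the cloud.
// Here it just records what would have been sent, so the logic is testable.
std::vector<std::string> sent;
void publishStub(const std::string& name, const std::string& data) {
    sent.push_back(name + ":" + data);
}

// Round-robin over 19 payloads, one per tick (one tick ~ one second on device).
const int kNumStrings = 19;
int nextIndex = 0;

void tick(const std::vector<std::string>& payloads) {
    publishStub("sensorData", payloads[nextIndex]);   // "sensorData" is a placeholder name
    nextIndex = (nextIndex + 1) % kNumStrings;        // wrap back to the first string
}
```

On the actual device, `tick()` would be driven from `loop()` with a one-second timer and `publishStub` replaced by the real publish call.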
If you are considering webhooks, there seem to be some issues with the publish/subscribe approach, as discussed in this post. Publish doesn't always seem to return. The issue shows up against many different servers, so it appears the problem is on Particle's side.
The benefits are great because you can parse your results much more succinctly… so there is a real upside if that gets "fixed" in future updates.
Right now, I am not using publish/subscribe for projects that are more time-critical. For example, I'm getting my current email count from Gmail… I just want it near real time, so I poll the server every 30 seconds for an update. If I use publish/subscribe, it irks me that it does not return reliably enough to keep my device current. Accomplishing this with an http request returns reliably nearly 100% of the time, in my observations. The downside is that the Spark Core doing this is constantly flooded with data from Gmail's Atom feed. If I were doing much more with the device, I might have concerns about that.
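The "poll every 30 seconds" approach above boils down to a non-blocking interval check. A minimal sketch of that pattern, with timestamps in milliseconds like the device's millis() counter (the function name here is made up):

```cpp
#include <cstdint>

// Poll interval from the post above: 30 seconds.
const uint32_t kPollIntervalMs = 30000;

// Returns true when it is time to poll again, and records the poll time.
// Unsigned subtraction keeps this correct even across millis() rollover.
bool shouldPoll(uint32_t nowMs, uint32_t& lastPollMs) {
    if (nowMs - lastPollMs >= kPollIntervalMs) {
        lastPollMs = nowMs;
        return true;
    }
    return false;
}
```

In `loop()` you would call this with the current millis() value and fire the http request whenever it returns true, instead of blocking with a delay.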
On the other hand, I am using Weather Underground for a device that needs sunrise and sunset times, calling publish on that webhook once an hour. That seems to be reliable enough that I capture the times at least once in 24 daily tries!
Depending on how big the data strings are, you could concatenate them into bigger strings and parse them on the receiving end. If I'm not mistaken, you can send 255 bytes of data using a Spark.publish(). What kind of data are you sending?
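The concatenate-and-parse idea can be sketched like this: join the strings with a delimiter on the device, check the result still fits the 255-byte publish limit, and split it back apart on the receiving end. The pipe delimiter is an arbitrary choice and assumes it never appears inside the data:

```cpp
#include <string>
#include <vector>

const size_t kMaxPublishBytes = 255;  // documented publish data limit

// Joins parts with '|'; returns false if the result would not fit
// in a single publish, so the caller can split it across two events.
bool joinForPublish(const std::vector<std::string>& parts, std::string& out) {
    out.clear();
    for (size_t i = 0; i < parts.size(); ++i) {
        if (i > 0) out += '|';
        out += parts[i];
    }
    return out.size() <= kMaxPublishBytes;
}

// Receiving end: split the published payload back into the original strings.
std::vector<std::string> splitOnReceive(const std::string& data) {
    std::vector<std::string> parts;
    size_t start = 0, pos;
    while ((pos = data.find('|', start)) != std::string::npos) {
        parts.push_back(data.substr(start, pos - start));
        start = pos + 1;
    }
    parts.push_back(data.substr(start));
    return parts;
}
```

With 19 short strings this may let you publish everything in one or two events instead of one per second.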
The Photon doesn't (yet?) support HTTPS, so if you need that to transfer your message, you may want to use webhooks, and thus Spark.publish().
I see - so you are struggling with the 63-byte limit for publish on the Core, but the Photon has got 255 (I think).
HTTP/TCP/UDP doesn't have this limit, and also no "speed limit".
Oh - I just left to make a cup-a-tea and got overtaken by two rather capable guys.
the 255-byte limit is fine for me. So should I be good with doing that? (note: I don't know why I wasn't just sending all of the data at one time before)
oh, is it 63 bytes? that might be a problem
The docs state they both support 255 bytes of data, with the event name being limited to 63 bytes. You should be good to go on either.
OK, so, just to recap. http post is more reliable but also floods the spark core with data? I did not really understand the whole thing about how the spark core gets flooded with data.
No, sorry; I was referring to a request for returned data, i.e. a webhook. Sorry for not being clearer.
ah so http post is more reliable when using webhooks then?
I'd say more reliable "than" webhooks… today.
Summary:
HTTP: Device -> [receiving end]
Publish: Device -> Particle cloud -> the internet <- [receiving end] picks that up
Webhooks: Device -> Particle cloud -> [receiving end]
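For the first route in the summary, the device assembles a raw HTTP request itself (on a Core/Photon you would write it to a TCPClient connection). A sketch of what that request text looks like; the host, path, and body here are made-up placeholders:

```cpp
#include <string>

// Builds a minimal HTTP/1.1 POST request as a string.
// On the device, the result would be written to an open TCP connection.
std::string buildPostRequest(const std::string& host, const std::string& path,
                             const std::string& body) {
    std::string req;
    req += "POST " + path + " HTTP/1.1\r\n";
    req += "Host: " + host + "\r\n";
    req += "Content-Type: application/x-www-form-urlencoded\r\n";
    req += "Content-Length: " + std::to_string(body.size()) + "\r\n";
    req += "Connection: close\r\n";
    req += "\r\n";           // blank line separates headers from body
    req += body;
    return req;
}
```

This is the extra code a direct connection needs compared with a one-line publish, which is the trade-off discussed below.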
When you say server, what server exactly are you talking about? Like the Particle server or something like that?
I meant the thing you're trying to send your data to. Updated accordingly.
haha, ok. So, if my spark core is reading data from another device, then it publishes the data. Then there is HTML code that reads this data as JSON and sends it to a website. What is the "thing you're trying to send your data to"?
nvm, I think I got you. But does this mean http is faster, since it just goes straight from the spark core to the server? Like, what are the advantages of it going through the internet (or even the cloud)? Or are there any advantages?
It kinda always goes through the internet, doesn't it ;)?
The Photon can't handle HTTPS connections (yet?). To get HTTPS, you can use webhooks: you send data to the cloud, which will in turn send that data over HTTPS to your destination.
A direct connection is the faster/less limited way, I think, but it might need more code than a simple publish. Depending on your needs, all three are valid options.
OK thank you, sorry I was being stupid with this stuff. I'm still a beginner I guess haha