I created a new tutorial for using Google Firebase and webhooks to store data from Photons, Electrons, etc…
There are three examples in the tutorial:
Saving data in a table
Saving data in a per-device table
Saving data from a device and retrieving it from another device
You don’t need your own server; it’s all done with webhooks and the regular Google Firebase API. And if you don’t have much data to exchange, you may be able to use the free tier forever.
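To give a sense of what "all done with webhooks" means, a Particle webhook definition targeting the Firebase REST API might look roughly like this. The project URL, auth secret, and template variable are placeholders, not values from the tutorial:

```json
{
  "event": "test3data",
  "url": "https://example-project.firebaseio.com/test3data/data.json",
  "requestType": "PUT",
  "query": { "auth": "YOUR_FIREBASE_DATABASE_SECRET" },
  "json": { "value": "{{PARTICLE_EVENT_VALUE}}" },
  "mydevices": true
}
```

The key point is that the webhook talks directly to Firebase's REST endpoint (the `.json` suffix on the path), so no intermediate server of your own is involved.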
Can’t wait to get this up and running to compare cost against Azure services that I have been testing. It looks like this may be much, much cheaper.
I was not able to get the new Google database integration that Particle announced a few weeks ago working for me, but your solution looks better because it does not require Node as part of the setup process.
How did you come across Google Firebase? Are you using it for something you're working on? I've never heard of it.
Used your tutorial, but I have a question. I was trying to store some variable data for use later. Using a PUT as in test3data, I stored some data. Then, using a read webhook, I accessed the data. The read was OK, but the data was cleared in Firebase after the read. Why?
That shouldn't happen. Make sure your read webhook is using a GET operation. You can also narrow down where the problem is occurring by doing the GET manually in a browser or with curl, then watching the Firebase database view to see whether the value gets cleared.
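A manual read against the Firebase REST API looks something like the following. The database URL and secret are placeholders for your own values; a plain GET like this should never delete data, so if the value still disappears, the problem is in the webhook definition rather than in Firebase:

```shell
# Hypothetical values — substitute your own database URL and secret
DB_URL="https://example-project.firebaseio.com"
AUTH="YOUR_DATABASE_SECRET"

# Build the REST URL for the test3data path
REQUEST_URL="${DB_URL}/test3data.json?auth=${AUTH}"
echo "${REQUEST_URL}"

# Uncomment to actually issue the read against your database:
# curl -s "${REQUEST_URL}"
```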
Very strange. What are your event names? The event names are a prefix, so if you were to, for example, name the events “firebase” (for write) and “firebaseRead” the write webhook would get called for both. Just guessing here, because I’ve never seen what you’ve described happen. I’d double check the event log at console.particle.io to see if there’s anything suspicious showing up there, as well.
The mutableCopy will look all weird after parseObject. That's why the code makes a mutableCopy in the first place: parseObject modifies the buffer as part of its own parsing process, so it wouldn't surprise me if it looked like that after parsing.
Not sure if this is the right spot for this, but I have most of your code from test2 verbatim in my program and it is working great, except I am not getting the device name to come out in my webhook and into a separate level in the database (I just get “”).
My electron is going into deep sleep between PIR triggered events and only connects to the cloud every hour to upload a count of the events. I am using SYSTEM_THREAD(ENABLED) and SYSTEM_MODE(SEMI_AUTOMATIC).
I have the subscribe and publish in setup(), which calls functions that turn on the cellular modem and connect to the cloud when the hourly publish happens.
I searched the forums a bit and it seemed like there might be some complications between subscribe, multithreading, and semi-automatic mode, but I had trouble finding a satisfactory explanation for my problem.
I added another example to the original tutorial that uses an Electron and sleep mode. It stores the device name in retained memory so it doesn’t need to be retrieved every time, even if you use deep sleep mode, saving data. It also better handles some common error conditions that you’ll run into when you do a combination of sleep and publish.
Yes rickkas7, I did use your code, and you are correct that I printed the mutableCopy after parseObject. I think I was looking at the wrong thing. The reason for the question is that when I parse the JSON {"zone2":"0","zone3":"1"}, I get root["zone2"] = "" and root["zone3"] = 'PU'. Seems to be an error in SparkJson. Thanks for all your help.
I think the problem is that since zone2 and zone3 are strings in the JSON sense, quoted values, you need to access them using the asString() method in the JSON library.