Architecture Advice for commercial use

Hi All,

I’m hoping for some advice on a commercial use of Particle devices. We already have a commercial architecture built around Particle: our current offering collects data from Particle devices via Google Pub/Sub, transfers it to AWS, and dashboards it for our customers via a web app. This product only collects and dashboards device data for our customers.

Our new commercial offering will let customers read and write variables on their devices, in addition to the dashboarding services we currently offer.

I’m just wondering what type of architectures people are using in commercial use cases for calling Particle functions and updating variables on customers’ devices, and whether they use AWS Lambdas or something else.

I would also be interested to know how people handle the setup and activation of Particle devices for customers. Currently we are handling this ourselves, and pushing it back to our customers would greatly improve our scalability.

Thanks in advance, hope everyone is keeping safe.

Hi Micheál,

I prefer to use the Two-Legged Authentication method.
My favorite platform today is Google Firebase, so I send all the data into Firebase. From there, a web app deployed on Google Cloud or a mobile app on iOS or Android can access all the data.

For calling functions or getting variables, you can have your customers log in with Firebase (Firebase Auth is great for this). Once authenticated, the customer can get a Particle token that grants access to their devices only (if they have claimed them to their account).
From there, your web or mobile apps make HTTPS requests to the Particle API directly and can interact with their devices.

Something like this:
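To make this concrete, here is a minimal sketch of the front-end call, assuming the customer already holds their scoped token after Firebase login; the device ID, function name, and token below are placeholders, and the endpoint is Particle's documented function-call route:

```javascript
// Build the Particle cloud function-call request for a device the
// customer has claimed. `token` is the customer-scoped token obtained
// after Firebase login; deviceId and fnName are example values.
function buildFunctionCall(deviceId, fnName, arg, token) {
  return {
    url: `https://api.particle.io/v1/devices/${deviceId}/${fnName}`,
    options: {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${token}`,
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: new URLSearchParams({ arg }).toString(),
    },
  };
}

// Fire the request from the browser (or any fetch-capable runtime).
async function callDeviceFunction(deviceId, fnName, arg, token) {
  const { url, options } = buildFunctionCall(deviceId, fnName, arg, token);
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`Particle API returned ${res.status}`);
  return res.json(); // contains id, return_value, connected, ...
}
```

Because the token is scoped to the customer, it only works for devices claimed to that customer, which is what keeps customers out of each other's devices.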

Make sure to check, double-check, and understand the full two-legged authorization flow. It’s long and perhaps dry, but needed. See here.

Your web and mobile apps can share the same source code if you write them in a hybrid framework (one example is Ionic Framework, among others like React or Flutter).

Your mobile app can do that for Wi-Fi and cellular devices. Your web app can do that for cellular devices, but it’s a bit more complicated to claim Wi-Fi devices from a web app (HTTPS => HTTP issue here).

NOTE: one use case for cellular devices that I hear is recommended by Particle is NOT TO CLAIM the devices under your customers’ accounts. Just keep the devices unclaimed, or claimed by your own Particle account.



Great explanation @gusgonnet. I have a similar setup but chose to use Particle webhooks to Azure Functions to digest the data and events coming from the devices. I use Azure AD B2C for user authentication to my web app, and through the web app the user can claim a device and then make API calls to Particle using the customer token assigned to them by Particle. The database, web app, Active Directory B2C, and functions are all in the Azure ecosystem. The only real reason I wanted to use Azure is that we use it quite a bit in my day job, and if I was going to learn anything new, I figured I might as well learn something that could benefit me there as well. I’m quite sure AWS and/or Google are just as good choices.

I wonder what the rationale for this is, and why someone wouldn’t want to have users claim the devices. In the next few months I’ll reach out to Particle directly and see if I should change my strategy for users claiming their devices.

Thanks for the info!


Azure is a good choice as well. I used and continue to use it in some projects, but for new projects I try to use Firebase as I feel more at home with it.


Perhaps @rickkas7 can give an explanation here? Thanks!


Thanks @gusgonnet @jgskarda for the thorough suggestions.

I like the idea of setting up a two-legged auth stack; the ability to capture data on when users log in, device-use habits, etc. seems very attractive and is definitely where we will want to go in the future.

But the simplicity of the simple auth approach seems ideal for quickly developing our MVP, getting some customers, and proving out its value. This is particularly attractive as we essentially have one developer building and maintaining the entire stack.

Not claiming a device simplifies the setup process, and is more common for cellular devices. There is one important caveat to this, however:

Unclaimed product devices cannot subscribe to events. This means they also cannot receive webhook responses, or find their own device name.

Other than that, unclaimed product devices can receive function calls, variable requests, and OTA updates.

The reason for the various claiming processes is that for Wi-Fi devices in particular, you often will need to set Wi-Fi credentials, and it’s common to do so with a mobile app. By claiming a device to a customer account, such as when using two-legged authentication, the mobile app can not only communicate with the device but also communicate with the Particle cloud directly on behalf of the device.

For cellular product devices, it may be more convenient to have your own web app, or if you do have a mobile app, use your own framework (React, Ionic, Flutter, etc.) that does not have Particle API-specific code in it. By not claiming the device, you basically just communicate with your own backend using non-Particle API-specific code, which may be easier to find developers for.
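For unclaimed product devices, the back-end call can be sketched along these lines, assuming Particle's product-scoped function endpoint; the product slug, device ID, and server-side token below are placeholders:

```javascript
// URL for Particle's product-scoped function-call endpoint. Product
// devices can be addressed this way even when unclaimed.
function productFunctionUrl(productIdOrSlug, deviceId, fnName) {
  return `https://api.particle.io/v1/products/${productIdOrSlug}/devices/${deviceId}/${fnName}`;
}

// Server-side call to a function on an unclaimed product device.
// `serverToken` is a token your backend holds with access to the
// product; it is never handed to the customer.
async function callProductDeviceFunction(productIdOrSlug, deviceId, fnName, arg, serverToken) {
  const res = await fetch(productFunctionUrl(productIdOrSlug, deviceId, fnName), {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${serverToken}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: new URLSearchParams({ arg }).toString(),
  });
  if (!res.ok) throw new Error(`Particle API returned ${res.status}`);
  return res.json();
}
```

Your own backend decides which customer may touch which device, so no Particle token ever reaches the browser in this model.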


In the pros and cons, would this be another con of not claiming cellular devices? As follows:

  • when a customer gets a scoped token, they can control not only their own devices but any device in the product.

Whereas when a customer has claimed a few devices, they can get a token that controls their devices, and their devices only.



Typically, when using unclaimed devices, the customer will never get a token. You’ll handle that entirely within your own back-end, so there is no worry about customers accessing each other’s devices.


Hi, it may not be all roses and sunflowers in the land of simple auth.
There may be a few things that can cost you even more than in two-legged auth, such as the password reset flow emails discussed some time ago here (to which Particle provided a solution):

If that is not an issue for your MVP, then all the better! I am not trying to stop you, just giving you the info. As a reference, I found that issue out only once I was deep into simple auth for one product, and in all subsequent products I used two-legged auth; I do not recall it being a lot more work.



Interesting… I was not aware of this. I send device settings back to the device using Pub/Sub to change sleep durations, charging configuration for solar, etc. I’ll have to test this out, as I start with devices being “unclaimed” but still publishing events. I never noticed that a device didn’t get a response, but I’m not sure I ever explicitly tested it, as most times I just claim it to my personal demo account before I start testing. I suppose the workaround for this is to just claim them all to a single admin-type account rather than leaving them unclaimed.

Yeah, that makes sense. I was debating this when I started. I didn’t put too much thought into it and just went with the web app making API calls directly to Particle. The only real reason was it’s the path I went down first; it worked, so I kept going. This makes sense though, as the web app could make an API call to my back-end and then have the back-end communicate with Particle. Thanks for the explanation.


yup, I suppose the same.


Oh, good stuff to know! So in other words, unclaimed devices will not be controlled directly from a web/mobile app; requests will go through one’s back end.
Thank you again, great explanation.
Thank you again, great explanation.

Thanks for your detailed responses so far @gusgonnet. I am convinced that I should go with full two-legged auth with a web-app implementation. Now the fun part begins :rofl:

So, from a high level, what I am planning is to use the Particle Pub/Sub integration to transfer event data from devices to Firestore. Users will be able to view the data from their devices after authenticating via our web app.

Here is where my ignorance and inexperience might shine. I know that I will have to utilise some sort of server, but I am at a loss as to where to start. Can a fully scalable solution be built using Firebase Functions as a server to query and display users’ data from Firestore, and also to call the Particle API to execute functions and update variables on their registered devices?

Also, are you aware of any quickstart examples that could bootstrap me towards having a functional backend that can query Particle devices per authenticated user? I’m a recent comp-sci graduate and so have no in-depth experience with any particular framework, so I’m hoping to get as much direction as I can from the community without being spoonfed, haha. Anyway, thanks everyone for the guidance so far.

This can be done with your web app. The web app can be hosted on Firebase Hosting and will have access to the Firestore database (if the user is authenticated). From the web app, you can send HTTPS requests to the Particle cloud to get variables and call functions on devices.
The web app gets executed in your customers’ browsers (once the customer hits your web app’s link, the code gets downloaded to the browser), and from there your app makes the requests.
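A variable read from the browser can be sketched like this, assuming the standard Particle variable endpoint; the device ID, variable name, and token are placeholders:

```javascript
// URL of the cloud-variable endpoint for a claimed device.
function variableUrl(deviceId, varName) {
  return `https://api.particle.io/v1/devices/${deviceId}/${varName}`;
}

// Read a cloud variable using the customer's scoped token and return
// its current value (the `result` field of the JSON response).
async function getDeviceVariable(deviceId, varName, token) {
  const res = await fetch(variableUrl(deviceId, varName), {
    headers: { "Authorization": `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Particle API returned ${res.status}`);
  const data = await res.json();
  return data.result;
}
```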



Hey @gusgonnet, thanks for your guidance earlier.

As you advised, I have now built the skeleton of a two-legged auth flow using Firebase and a React web app.

Currently I have a web app that allows customers to sign up, and their details are saved into a Firestore DB. At the same time as their details are saved to the DB, I create a shadow account for the user in the Particle product console.
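For reference, the shadow-customer creation call can be sketched as below, assuming Particle's documented two-legged flow, where the product's OAuth client credentials authenticate the request and `no_password` keeps the customer from ever handling a Particle password; the product slug, email, and credentials are placeholders:

```javascript
// Build the request that creates a "shadow" customer in a Particle
// product. Authentication is HTTP Basic with the product's OAuth
// client ID and secret; no_password=true means your backend manages
// the customer's Particle credentials behind the scenes.
function buildCreateCustomerRequest(productIdOrSlug, email, clientId, clientSecret) {
  return {
    url: `https://api.particle.io/v1/products/${productIdOrSlug}/customers`,
    options: {
      method: "POST",
      headers: {
        "Authorization": "Basic " + Buffer.from(`${clientId}:${clientSecret}`).toString("base64"),
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: new URLSearchParams({ email, no_password: "true" }).toString(),
    },
  };
}
```

This runs on the backend only (note the Node `Buffer`); the OAuth client secret must never reach the browser.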

I will probably have a similar setup flow to jgskarda as per this thread Initial device setup for products - 10-20 borons.

My initial idea is for the customer to sign up via our web app, which creates a shadow customer. Then, when the customer logs in to our app, they will manually claim the device by entering the device ID and submitting it to the Particle API. In your opinion, should my Express backend make this claim, or is it OK to claim from the web app? It seems reasonably standard practice to use the access_token on the front end; if so, I assume it’s probably OK to call the claiming API from the web app as well.

Would it be overkill to generate a new access_token for customers every time they log in to our web app? Or should I create a scheduled function to refresh access_tokens for all customers?

Again thank you and everyone else for your advice. I’m amazed that I have been able to mock up a skeleton that I could potentially use commercially in a couple of days. I couldn’t do this without this forum, so thanks all.

Do you also create an account for this particular customer using Firebase authentication? That’ll be ideal.

Technically it is OK.

However, there is a difficulty to overcome if your product is Wi-Fi based: HTTPS sites (your web app) CANNOT call HTTP sites (like the SoftAP running on the Photon). Hence, if your product is Photon based, that’s going to be a difficult one. I overcame this in the past with a mobile app.

If it’s cellular based, your customer or your backend can claim them.

Not overkill. No need for a scheduled function.

Hey, way to go, and keep it up!


I’m no expert in this area on what’s best but I’ll tell you what I did:

When a user logs in, I also make a call to the Particle API from my backend to generate a Particle token. This token is unique to that user and is saved in their local cache. It is then used to make API calls directly from the front-end web app to call functions on the device.
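That token-generation step can be sketched as follows, assuming Particle's client-credentials grant with a customer scope as documented for two-legged auth; the email, client ID, and secret are placeholders:

```javascript
// Backend: build the request that mints a customer-scoped Particle
// token at login. grant_type=client_credentials plus
// scope=customer=<email> yields a token valid only for that
// customer's claimed devices; expires_in bounds its lifetime.
function buildCustomerTokenRequest(email, clientId, clientSecret, expiresIn = 3600) {
  return {
    url: "https://api.particle.io/oauth/token",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: new URLSearchParams({
        grant_type: "client_credentials",
        scope: `customer=${email}`,
        expires_in: String(expiresIn),
        client_id: clientId,
        client_secret: clientSecret,
      }).toString(),
    },
  };
}
```

The response JSON carries the `access_token`, which the backend can then hand to the front end for that user's session.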

When a user claims a device, they enter a unique serial number on the device into a form and submit that form to my backend. My backend then reads the token from their cache, looks up the Particle device ID in my SQL database, and then makes the API call to Particle to claim the device to that user. I also update my own SQL to indicate that the device is claimed and by which user.
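The backend claim step might look roughly like this, assuming the standard Particle claim endpoint is used with the customer's token; the device ID and token values are placeholders:

```javascript
// Build the request that claims a device to the account behind
// `customerToken` (the user's customer-scoped token read from cache).
// `deviceId` would come from our own SQL lookup keyed by the serial
// number the user typed in.
function buildClaimRequest(deviceId, customerToken) {
  return {
    url: "https://api.particle.io/v1/devices",
    options: {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${customerToken}`,
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: new URLSearchParams({ id: deviceId }).toString(),
    },
  };
}
```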

That said, I would think it’s fine to make the API call from the front end. The main reason I did it from the backend is that I didn’t want to expose the Particle device ID to the user; instead, I create my own serial number for this. I may also have devices that are not Particle in the future, so doing it from my backend seemed easier, as I could look everything up in a SQL table.

I personally generate a new one each time they log in, although the “remember me” is set to every 30 days or so. Unless they log out, it’s not very often that new tokens are generated.

I assume this is very similar to Azure AD B2C (i.e. almost like Active Directory), or is this Google’s equivalent to Azure AD B2C? I.e. no storing hashed passwords or having to deal with reset emails, etc.? If so, yeah, definitely worth doing it with cloud-based authentication vs. having to deal with all that stuff yourself. I originally did it in my own SQL and sent reset emails through my backend. I migrated to Azure AD B2C authentication and wow… so simple, and it really cleaned things up.

Yeah, I think it is Google’s equivalent, so reset emails, security, and all that stuff are managed by them, not by us.
