Architecture insight

Have a working prototype of a remote valve operator: open, close, state, and flow detection….

Not a fan of the architecture I am using and looking for other ideas, especially with Ledger and Logic in beta. Today's architecture uses Particle variables and webhooks to execute commands and extract data from the Boron, Electron…. It all works great. To expose this to the end user, there is a Lightsail instance vending an index.html along with a CSS and a JavaScript file. That JS file has the code that runs on a button push if the passcode is correct…. It creates a new Particle object, then calls particle.callFunction(…). In this case it calls into the API and fires a relay on the Particle board.

Here are the issues with my architecture as I see them.

JS is client side, and accessing a data set for the "passcode" from there seems less secure.
Today the JS has a hardcoded passcode in it; to scale, these need to come from a DB or data store.
The files index.html, index.css, and index.js are all on a WordPress site; I just dropped a static page out there. It seems like I could use WP REST calls to reach the Particle cloud from the WP server instead of using client-side code. I am by no means a JS programmer, so there is a lot of trial and error in that, but it does work. The JS file has my auth token, my device ID…. Seems like a poor architecture to me.

Long term there are multiple users, and each user has access to one or more end devices.
As an example:
Users 1, 2, and 3 can have access to devices (Borons) 1-10, 11-20, and 21-30 respectively
Admin 1 has access to devices 1-20
Admin 2 has access to devices 21-30
A super user has access to all devices
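That access matrix could be modeled as a simple server-side lookup before any Particle call is made. A minimal sketch (the account names and the numeric device-ID scheme are placeholders for illustration, not real Particle device IDs):

```javascript
// Helper: inclusive range of numeric device IDs, e.g. range(1, 10) -> [1..10].
function range(lo, hi) {
  return Array.from({ length: hi - lo + 1 }, (_, i) => lo + i);
}

// Sketch of an access-control list: each account maps to the device IDs
// it may control. In production this would live in a DB or data store.
const acl = {
  user1:  { devices: range(1, 10) },
  user2:  { devices: range(11, 20) },
  user3:  { devices: range(21, 30) },
  admin1: { devices: range(1, 20) },
  admin2: { devices: range(21, 30) },
  super:  { devices: range(1, 30) },
};

// The server consults this before ever touching the Particle API.
function canAccess(user, deviceId) {
  const entry = acl[user];
  return !!entry && entry.devices.includes(deviceId);
}
```

Keeping this check on the server means the browser never learns which devices exist, let alone the credentials to drive them.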

Looking for insight from people who have built a multi device deployment:
Did you use Node as your intermediate code, the bridge to the Particle cloud?
WordPress, anyone?
Did you build a custom back-office system in Java, C…, that does all the business logic?

JS code….

// Cycle, name, ID, AUTH, and resetColor are defined in the elided code.
var particle = new Particle(); // from particle-api-js …

Cycle.addEventListener("click", function () {
    console.log("in test3");
    var tmpCode = document.getElementById("passcode").value;
    console.log("Code typed in %s", tmpCode);
    if (tmpCode == 1234) {
        name.style.color = "purple";
        setTimeout(resetColor, 9000);
        //console.log("Passcode matches. Issuing Relay to ID: %s, Auth %s", ID, AUTH);
        // Calls the 'Relay' function on the device via the Particle cloud
        particle.callFunction({ deviceId: ID, name: 'Relay', argument: 'cycle', auth: AUTH });
    } else {
        name.style.color = "red";
        setTimeout(resetColor, 2000);
    }
});

Hi, you can get some inspiration from the (no longer recommended) two-legged authentication.

The new guidelines are here:

Cloud and stack choices are personal; in my experience, the easiest to work with has been Firestore, compared to AWS and Azure.

There's an integration:

Your website can be hosted on Firebase as well. Your users can use Firebase Auth, and the data can all be stored in Firestore.
Your users can hit Firebase Cloud Functions, which in turn authenticate the user and hit the Particle API.
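The core of that flow is small. A minimal sketch of the trusted server-side piece, assuming Node 18+ (for the built-in fetch) and an access token in an environment variable; `PARTICLE_TOKEN` is a placeholder name, while `Relay`/`cycle` come from the original post, and the wrapper that makes this the body of a Firebase Cloud Function is omitted:

```javascript
// Pure helper: the Particle REST endpoint for calling a function on a device.
function particleCallUrl(deviceId, fnName) {
  return `https://api.particle.io/v1/devices/${encodeURIComponent(deviceId)}` +
         `/${encodeURIComponent(fnName)}`;
}

// Fire the relay on one device from server-side code, after the user has
// been authenticated and authorized. The access token never reaches the browser.
async function cycleValve(deviceId) {
  const res = await fetch(particleCallUrl(deviceId, 'Relay'), {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.PARTICLE_TOKEN}`,
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: new URLSearchParams({ arg: 'cycle' }).toString(),
  });
  if (!res.ok) throw new Error(`Particle API returned ${res.status}`);
  return res.json();
}
```

The browser only ever talks to your function endpoint; the Particle token and device IDs stay on the server side.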

They offer a generous free tier, so at first your projects will generate little or no cost.

Best.


Checking the passcode in the client-side JavaScript is not secure, and you should never do that. Among other things, anyone can use the browser's View Source to read the access token straight out of the client-side JavaScript and use it, no authentication required.

There are two solutions to this:

  • Use a server to manage authentication and make Particle API calls. Only the server has your Particle access token, which keeps it secure.
  • Use Particle customer accounts and have users log in using Particle authentication. This is secure in the browser because the checking and returning of the access token is done on the Particle cloud servers, not in the browser JavaScript.

I would use the first. While using customer accounts sounds easier, managing your multiple levels of admin users would be hard to implement, and you still need a server anyway to implement things like password reset for customer accounts.

For authentication, you can either implement your own on the server side, probably with some sort of user database, or use one of the "sign in with" OAuth providers such as Google, Facebook, Amazon, X, GitHub, etc., depending on your expected customers. These are robust and eliminate the need for your customers to remember another password.

Your server will combine this authentication with your own database of what each user has access to in order to determine which Particle devices to grant access to. For a small number of users, or for testing, you could just embed this logic in your server-side JavaScript code.

As for the back end, that's up to you, but there are advantages to using Node.js, mainly because the Particle cloud API documentation is JavaScript-centric and particle-api-js is the only first-class API library. Also, if you're mixing browser-side code and server code, using JavaScript for both can make sense.

Also, it's quite possible that you don't need an actual server instance. You just need an API server, and a serverless solution like AWS Lambda, Google Cloud Functions, or Azure Functions may make sense. All of those also let you implement secure server-side access control and make Particle API calls. This can save money because the server is active only long enough to service the request. You also don't need to maintain a server (even a shared virtual server) or worry about things like timely application of server patches.


Thank you for the insight.