Customer authentication options

Hi all

I’m working on a backend system to manage device data storage and server-side control of device functions.
We require the following:

  • Store historic device data
  • Control customer devices from our server (invoke Particle functions)

I’m currently using simple authentication; however, I understand the recommended method for this sort of setup is two-legged auth.
I’d really like to continue using simple authentication if possible, as I think in a lot of ways Particle does a great job of the login / authentication process while letting you store notes about customer devices etc. in the cloud. (Why re-invent this, especially when paying for the higher tiers?)

However, using simple authentication creates difficulty when my server needs to invoke functions on my customers’ devices. One option would be to send the customer access token to my server API when the customer logs in; however, this token will expire if the customer doesn’t log in again within a set period, at which point my server will no longer be able to control the device.

Is there any way to get around this? Perhaps I’m missing the point, but I don’t quite understand why Particle promotes customer management etc. as a value proposition for the scaled tiers, yet doesn’t seem to have set it up or recommend it for anything past basic prototyping. I could use Amazon Cognito, but again it seems that I’m reinventing the wheel a bit!

@jeiden do you have any thoughts on this one?

1 Like

Hi @G65434_2!

Glad to hear you are making progress, and thank you so much for helping others on the community with your wisdom. Sounds like what you are looking for is some kind of hybrid between simple and two-legged auth. That is, you want your customers to authenticate directly with the Particle API from your mobile/web app, but you still want the flexibility to control devices from both the client and the server.

Unless I am overlooking something, I don’t see why you can’t have the best of both worlds! I would suggest that you use OAuth client credentials to create customer-scoped access tokens on your server. Follow the instructions here to create an OAuth client for your server.

Then, your server could hit the API endpoint to create a customer-scoped access token, documented here.

Your server would need to know the specific customer’s email address in order to generate the token, so your client-side app would need to communicate that to the server in some way. One idea that comes to mind: when the customer logs in, send their email address to your server and use it to generate tokens throughout the session. However, depending on the specific use case you’re after, another approach might make more sense.
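In case it helps, here is a rough sketch of what that server-side token request could look like (Node-style; the client ID/secret and customer email are placeholders, so double-check the details against the docs linked above):

```ts
// Sketch: create a customer-scoped access token from your server using your
// product's OAuth client credentials. The client ID/secret env vars and the
// customer email are placeholders.
async function createCustomerToken(customerEmail: string): Promise<string> {
  const body = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: process.env.PARTICLE_CLIENT_ID ?? "",
    client_secret: process.env.PARTICLE_CLIENT_SECRET ?? "",
    // Scopes the resulting token to a single customer
    scope: `customer=${customerEmail}`,
  });

  const res = await fetch("https://api.particle.io/oauth/token", {
    method: "POST",
    body,
  });
  if (!res.ok) throw new Error(`Token request failed: ${res.status}`);

  const json = (await res.json()) as { access_token: string };
  return json.access_token; // usable for calls on that customer's devices
}
```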

Hope this helps! Happy Sunday :smile:

Jeff

3 Likes

Thanks very much for your reply @jeiden

Not sure about how much wisdom I’m contributing though!!

I think your idea sounds good; I’ll give it a try. My only question: if a customer changes their email address, will this create difficulty? It’s no problem storing customer email addresses and their associated devices on my server, but should/can I also store a unique customer ID?

I’m in this space pretty deeeeeeeply right now. It’s dark down here!

We are trying to key off of the user’s DEVICE_ID almost exclusively. I’ve been messing around with an initialization that occurs on the server, triggered by the device with an embedded hardware key. Once the device is connected to the cloud, it shoots the device ID along with the hardware creds; that action then creates the key in our NoSQL user document. Subsequent restarts/power-ups are ignored by the DB. Further user updates require only the hardware DEVICE_ID, which is easily presented.

Still wringing that out… that’s our strategy.
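Roughly, the server-side half of that init looks like the sketch below; table and field names are just placeholders for our own schema, and the webhook plumbing that delivers the device publish is left out:

```ts
// Rough sketch of the one-time device "init" write: the device publishes its
// DEVICE_ID plus a hardware key, a webhook forwards it here, and a conditional
// put means repeat restarts/power-ups are simply ignored.
import * as AWS from "aws-sdk";

const db = new AWS.DynamoDB.DocumentClient();

export async function registerDevice(deviceId: string, hardwareKey: string) {
  try {
    await db
      .put({
        TableName: "Devices", // placeholder table name
        Item: { deviceId, hardwareKey, createdAt: Date.now() },
        // Only write if we have never seen this device before
        ConditionExpression: "attribute_not_exists(deviceId)",
      })
      .promise();
  } catch (err: any) {
    if (err.code !== "ConditionalCheckFailedException") throw err;
    // Already initialized: subsequent power-ups fall through here
  }
}
```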

1 Like

@G65434_2,

Hmm, I’m not sure Particle even allows updating a customer’s email address via our API currently, so I’m not sure how this would happen realistically.

Jeff

2 Likes

Hi @jeiden

I’ve just started digging into this properly and I’m afraid I’m second-guessing myself again!
While your solution could still work, I’m now wondering how my server will authenticate requests for data…
I’m planning on having some sort of API for accessing stored data. Even though a caller would need to know someone else’s device ID, long term I probably won’t want just anyone who knows a device ID to be able to request data from my database, which suggests using OAuth the way Particle does.
Keeping this in mind, it seems a bit ungainly to give the customer two APIs (Particle’s and mine) and access tokens for each. I’m wondering if a better solution would be to have one API (my own) and pass device-related calls through internally to Particle’s servers: only my custom OAuth implementation would face the user, with customer-scoped access tokens generated on my server (sorry, I hope you’re still with me!).
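Something like the sketch below is what I have in mind for the wrapped device calls; the auth middleware and the two helpers are hypothetical placeholders for things I’d implement myself, not Particle APIs:

```ts
// Sketch of the "wrapped" approach: customers only ever call our API; after our
// own auth check we proxy the call to the Particle Cloud using a customer-scoped
// token generated server-side.
import express from "express";

// Hypothetical helpers implemented elsewhere in the backend
declare function requireAuth(req: any, res: any, next: () => void): void;
declare function userOwnsDevice(userId: string, deviceId: string): Promise<boolean>;
declare function getCustomerToken(email: string): Promise<string>;

const app = express();
app.use(express.json());

app.post("/api/devices/:deviceId/:fn", requireAuth, async (req: any, res: any) => {
  const { deviceId, fn } = req.params;

  // Make sure this customer actually owns the device they are addressing
  if (!(await userOwnsDevice(req.user.id, deviceId))) {
    return res.status(403).json({ error: "not your device" });
  }

  const token = await getCustomerToken(req.user.email); // customer-scoped token
  const upstream = await fetch(
    `https://api.particle.io/v1/devices/${deviceId}/${fn}`,
    {
      method: "POST",
      headers: { Authorization: `Bearer ${token}` },
      body: new URLSearchParams({ arg: String(req.body.arg ?? "") }),
    }
  );
  res.status(upstream.status).json(await upstream.json());
});
```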

I know there are many use cases for the backend and I’m not asking to be completely led through the design; I just thought I’d ask whether there is a recommended method for this scenario. I understand the docs recommend that the client (web app / smartphone app) communicate with Particle’s servers as it’s more direct:

However, I do see benefits in having just the one customer-facing API, and in being able to limit what can be invoked on the device. (Down the track I wouldn’t want users figuring out that the backend runs on Particle and then trying to mess around with their device via the Cloud API.)

Furthermore, I also thought it unusual that the docs had the following comment:

Surely there are plenty of applications that require the backend to autonomously invoke device functions?

@Dave perhaps you might have an answer. In short, is it recommended that we don’t ‘wrap’ the Particle API? If so, why? I realise there is another step in the process and there will be a small increase in latency, but it feels a bit clunky to offer our customers two APIs (ours for accessing historical data and Particle’s for changing device settings). Is this a common problem for your customers?

Have you looked into Amazon’s IAM to authenticate the methods (e.g. Lambda) you wish to use to post data? Yes, it becomes a kludge managing two APIs, but IMHO it’s better to wrap them together for a seamless customer experience.
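As a rough sketch (table name and shape are placeholders), the Lambda behind such an endpoint can stay pretty small; the IAM or Cognito authorization itself is configuration on the API Gateway method rather than code:

```ts
// Minimal sketch of a Lambda that stores posted device data. The API Gateway
// method in front of it is assumed to have auth enabled (AWS_IAM or a Cognito
// authorizer), so only authorized callers ever reach this handler.
import * as AWS from "aws-sdk";
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

const db = new AWS.DynamoDB.DocumentClient();

export async function handler(
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> {
  const { deviceId, data } = JSON.parse(event.body ?? "{}");

  await db
    .put({
      TableName: "DeviceData", // placeholder table name
      Item: { deviceId, ts: Date.now(), data },
    })
    .promise();

  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
}
```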

Are your customers coding/flashing/updating device firmware or are they only front-end facing?

1 Like

Thanks @BulldogLowell, I think we’re working in a similar space at the moment!
I’m looking into IAM and Cognito right now. My customers will only be front-end facing.

I’m torn between using as many of Particle’s services as possible and doing most of it myself.
Ideally I’d like to forget about auth and just use Particle’s customer management model, as I’m doing at the moment. But how will I validate access for my own API in that case?

At this stage I’m considering setting up my own auth, and then generating customer-scoped access tokens server side whenever my wrapped API needs to invoke a Particle call. Although this probably isn’t what Particle likes to hear, I feel this is safer from a business perspective: if for some reason Particle didn’t work for us later on (say, because we needed some special feature they couldn’t provide), we wouldn’t need all our customers to change API endpoints.

On the flip side, though, Particle does a lot of things very well and securely, so why re-do the whole auth side of things when it’s already done?

Decisions…

1 Like

Particle plus access to a simple NoSQL database and custom API calls… we may be barking up the same fig tree.

DynamoDB has the Global Secondary Index tools that work well for us (fast and cheap queries!), so I’m in kludgy-ville for the time being as well. It was Lambda and API Gateway that initially dragged us into AWS. Just too damned hard to reinvent those wheels…

That’s how they getcha, we say!
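For context, the kind of cheap lookup those GSIs buy us looks roughly like this (table, index, and attribute names are placeholders for our own schema):

```ts
// Sketch: pull one device's history for a time window via a Global Secondary
// Index instead of scanning the whole table.
import * as AWS from "aws-sdk";

const db = new AWS.DynamoDB.DocumentClient();

export function deviceHistory(deviceId: string, since: number) {
  return db
    .query({
      TableName: "DeviceData",
      IndexName: "byDevice", // hypothetical GSI: deviceId (hash) + ts (range)
      KeyConditionExpression: "deviceId = :d AND ts >= :since",
      ExpressionAttributeValues: { ":d": deviceId, ":since": since },
    })
    .promise();
}
```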

1 Like

Yea that’s pretty much what I’m after!
I wonder if a way around it could be:

  1. Login with Particle
  2. Send returned particle access token to our backend using our own API
  3. Use the Cloud API server side to verify that the access token is valid and has access to the device that customer is trying to use (rough sketch below).

Proceed to call Particle functions / our own database functions knowing that we’re now authenticated. (We’d probably still generate a customer-scoped access token to invoke Particle functions.)
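Here’s roughly what I mean for the verification in step 3; it leans on listing the devices visible to the token, which I’d still need to confirm behaves the way I expect:

```ts
// Sketch of step 3: take the Particle access token the app sent us, ask the
// Particle Cloud which devices that token can see, and check that the device
// the customer is trying to use is among them.
async function tokenOwnsDevice(particleToken: string, deviceId: string): Promise<boolean> {
  const res = await fetch("https://api.particle.io/v1/devices", {
    headers: { Authorization: `Bearer ${particleToken}` },
  });
  if (!res.ok) return false; // expired or otherwise invalid token

  const devices = (await res.json()) as Array<{ id: string }>;
  return devices.some((d) => d.id === deviceId);
}
```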

By the way, I’m not sure if it’s just me but AWS Cognito seems super clunky/complicated?

1 Like

Hi @G65434_2,

You can totally ‘wrap’ the calls to the Particle API by making them from within your backend application. That can be a good solution when you’re managing your own customer / user database, and want to just grab access tokens / create customer accounts programmatically, and / or you don’t want to have the Particle API visible to your customers directly, etc, etc.

Thanks!
David

1 Like

@G65434_2,

Sorry for the delay, have been on vacation and just got back :smile:.

I completely agree with you and @Dave. Having your server make function calls is totally valid, but would just add a bit of latency to each request (i.e. mobile app -> your server -> our server). If it simplifies things on your end, go for it! I definitely know of other product creators who use this approach for simplicity’s sake.

We should probably lighten up the language on the docs to not be so heavy-handed with our approach. I think we just wanted it to be clear that you can still manage your own DB of users / server while still interacting directly with the Particle API to control devices from a mobile or web application.

Sounds like the unified API on your end that wraps some of Particle’s API is the best approach for you. At some point, I’d love to see your progress and learn more about what you are building! You have been working through some of these challenges for a while now, and I’m sure you are well on your way!

Best,

Jeff

1 Like

Thanks @jeiden, no worries, hope you enjoyed your break!

I certainly hope I can share some progress and learnings with the community at some stage!

One question I have now, for you or perhaps Dave, concerns authentication and future IFTTT compatibility.
I’ve managed to set up a system I’m really happy with using Amazon Cognito. Using their JS SDK, I can log in a user via username / password and have a JSON Web Token returned. I can then use this JWT to authenticate with API Gateway and call a function via a server-side generated Particle access token (customer scope).
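For reference, the login side of that flow looks roughly like this with the Cognito User Pools JS SDK; the pool/client IDs and the API Gateway URL are placeholders:

```ts
// Sketch of the client-side login: authenticate against a Cognito User Pool
// and hand the resulting ID token (a JWT) to API Gateway as the Authorization
// header. Pool and client IDs are placeholders.
import {
  CognitoUserPool,
  CognitoUser,
  AuthenticationDetails,
} from "amazon-cognito-identity-js";

const pool = new CognitoUserPool({
  UserPoolId: "us-east-1_XXXXXXXXX", // placeholder
  ClientId: "YOUR_APP_CLIENT_ID",    // placeholder
});

export function login(username: string, password: string): Promise<string> {
  const user = new CognitoUser({ Username: username, Pool: pool });
  const creds = new AuthenticationDetails({ Username: username, Password: password });

  return new Promise((resolve, reject) => {
    user.authenticateUser(creds, {
      // The ID token is the JWT that the API Gateway authorizer checks
      onSuccess: (session) => resolve(session.getIdToken().getJwtToken()),
      onFailure: reject,
    });
  });
}

// Then call the backend with it, e.g.:
//   fetch("https://<api-id>.execute-api.us-east-1.amazonaws.com/prod/devices", {
//     headers: { Authorization: jwt },
//   });
```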

This is all good; however, I have a question about its implementation which I realise is slightly outside of Particle’s scope, but may be something that some of your other customers will deal with…

How will I go about potentially becoming an IFTTT partner in the future? Will I need to implement OAuth2? If so, can I use this existing system using the JWT tokens (Amazon User Pools)? Or, am I better off setting up OAuth2 now and using it for all API use like Particle does? I don’t want to build a system now and find it was a bad decision later on when customers are already using my API!

@Dave is definitely the IFTTT expert :smile:. I will defer to him on this one.

Thanks @jeiden

@Dave I realise that this is quite a broad question and probably delves into some pretty complex areas in terms of different implementations. I understand that OAuth2 and JWT are quite different things; in fact, JSON Web Tokens are often used within OAuth2 implementations. The difficulty, I think, comes down to understanding what requirements IFTTT places on the design of the authentication process and making sure that we build things right the first time. From reading the IFTTT documentation I understand that one must be able to generate a non-expiring refresh token; I don’t believe this is currently possible with Cognito User Pools, so this may be a problem already. By the looks of it, Particle has implemented its own OAuth2 server, although I understand this can also be outsourced via Auth0, Stormpath, etc.?

Hi @jeiden

Sorry about the relentless questions; I’ve come across a bit of a wall of problems recently!
I’d like to know whether there is any value in using Particle ‘customers’ if I’m using two-legged auth and wrapping all Particle APIs…
It seems that creating a Particle shadow customer and generating customer-scoped access tokens every time I want to call a function is a bit unnecessary now. Are you able to give me an idea of any disadvantages I might face if I just leave all devices unclaimed? Can we even do this for a product? Furthermore, will this affect other services such as SIM management?

It seems that there is a use case for a more stripped-down version of Particle products that still allows for all the good things like OTA updates etc., but is only accessible server to server, i.e. the end user never interfaces with Particle. I think for some customers like myself it will be a long time (understandably) before Particle can do everything we want, e.g. data storage etc. I realise you can’t be all things to all companies, so it would be nice if there were an option like this.

I really appreciate your help recently thanks!

Hi @G65434_2,

Good question! Last time I was working on the IFTTT bits, OAuth2 was a heavy requirement, and I suspect that’s still the case. It’s very achievable to find or implement your own OAuth server; there are a variety of modules and libraries out there that can help you. Mostly you just need to keep it current and write the bits that connect to your user model.

When I built our IFTTT integration, I did it with the hopes that people building with Particle would be able to re-use the work, and piggyback their channels off our integration with IFTTT. That’s one benefit of using the customer accounts, but it’s not released yet, and would require some custom work to get right.

IFTTT has introduced a generic ‘webhook’ style integration, so you might be able to leapfrog your way into IFTTT without paying them for a channel, or setting up OAuth2, or going through that process in the meantime. :slight_smile:

I hope that helps!

Thanks,
David

1 Like

@G65434_2 yes, this is emerging as a valid option for architecting your Particle product. My only hesitation is that I’m not sure that access tokens generated with an OAuth client will allow you to successfully call cloud functions / check cloud variables on an end user’s device.

Is this the case for your situation? If all that’s required is publishing / integrations, without functions and variables, you may be able to get away with not having customers. If not, you may run into problems.

You should try this for yourself, however. I remember testing this approach somewhat recently and running into issues with using these types of access tokens for device interaction.

Jeff

Thanks @jeiden

If I understand you correctly, the idea would be that any function call, variable request, etc. would be made via our wrapped API and then invoked using a single non-expiring access token on our backend. Or are you suggesting that a ‘master’ access token with access to all product devices is potentially not possible?

One more question I had, which has been asked before but not formally answered: is it OK to ‘create’ a shadow customer using a UUID instead of an email address?