MQTT and new AWS IoT service

Just for info, in case anyone is interested in the topic: I attended AWS re:Invent last week, where AWS announced their new AWS IoT service (beta), and the features look quite promising.

Because of the scalability and the easy interfacing with many services (storage, visualisation, notifications, etc.), I plan to test how MQTT works with the AWS broker on my Photons.

I'll keep this thread updated with the results of my experiments. If anyone has already played with Photons (or Cores) and the AWS IoT service, feel free to share.

AWS IoT Service

9 Likes

Any progress? I just started poking at it tonight. The main problem I see is that AWS IoT requires MQTT over SSL, and the bundled MQTT client in the IDE doesn’t support that at all. Depending on how much free time I have over the next few weeks I may try hacking on it, but if anyone else has a library that works already, I’d love to hear about it.

I started looking at the AWS IoT side first, to understand how things work. I eventually got some tests working using mosquitto from my Mac, without any Photon involved, which helped clarify things.
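
For reference, a minimal sketch of that kind of standalone test in Node.js, using the "mqtt" package instead of mosquitto (the endpoint, certificate file names and topic below are placeholders, not the exact ones I used):

var fs = require('fs');
var mqtt = require('mqtt');

// AWS IoT speaks MQTT over TLS on port 8883, authenticated with client certificates.
var client = mqtt.connect('mqtts://YOUR-ENDPOINT.iot.us-east-1.amazonaws.com:8883', {
    key: fs.readFileSync('private.pem.key'),       // device private key
    cert: fs.readFileSync('certificate.pem.crt'),  // device certificate
    ca: fs.readFileSync('root-CA.pem'),            // AWS IoT root CA
    rejectUnauthorized: true
});

client.on('connect', function () {
    // Publish a test message and disconnect once it has been sent.
    client.publish('test/topic', JSON.stringify({ hello: 'aws-iot' }), function () {
        client.end();
    });
});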

I just forked the SDK for the Arduino Yun and I'll start adapting from there, instead of using the MQTT port, since there may be some significant differences (the Arduino Yun MQTT is based on Paho, but I believe the Particle MQTT lib was derived from a mosquitto port on Arduino). In any case, I want to use something close to the Amazon SDK, to avoid future trouble.

I’ll keep this thread posted.

1 Like

My bad… I just realized that SDK uses Linux stuff from the Yun, including calling a Python MQTT client…
I'll see what I can extract from that code in terms of logic, and I'll look for the Paho client plus possibly a TLS library (apparently the move is more toward TLS than SSL).


Update 1

I dug into the Arduino Yun SDK for AWS and looked at the Paho client. The thing is, it relies on the TCP layer, tries to connect using TLS, and falls back to unencrypted if that fails. I believe the main difference with Particle's MQTT lib is that the latter doesn't consider TLS at all, although it may not be difficult to add it in. It might be more judicious to go with Paho, since it is officially supported by IBM, whereas the current MQTT lib is a port of Nicholas O'Leary's client (he happens to also work at IBM) and he's the only really active contributor to that project. Paho is developed under an Eclipse BSD license, vs. the MIT license Nicholas O'Leary uses.

I'm inclined to go with Paho, which seems ready for TLS use, and to look into the TLS library @zach mentioned here and @mdma confirmed here. It would seriously bother me to have to wait until the end of the year to be able to connect using MQTT/TLS, so I guess I'll have to take care of that on my own… I saw a reference to TropicSSL in the firmware repo on GitHub, which is a fork of PolarSSL that hasn't been updated in years, so I may start with mbed TLS 2.1.12, the successor of PolarSSL that ARM took over and rebranded.

Anyways… MUCH more work than I initially planned :worried: But do I have a choice? Even if I used AWS's HTTPS access, I would have to use the HTTPS lib, which is GPL, and that worries me a bit…

2 Likes

You could use an intermediate step that converts data to/from the Particle Photon for the Amazon service: webhooks, for example, or an encrypted CoAP gateway.

Direct MQTT would be nice, but keeping the devices to sensor collection and output triggering is cleaner, and Particle already provides a cloud with pub/sub for communication.

1 Like

It would definitely be an easier path to take, but the thing is, Particle's cloud will never compete with all the features AWS can offer, in particular in terms of storage, analysis, and visualization, even though they use AWS themselves (they'd have to build dashboards on top of Amazon's, and consequently would spend more time reinventing the wheel than creating new stuff).

In the meantime, I can't wait for things to happen on the Particle cloud front, and if the plan is to use a Particle board (which so far looks like a great choice, despite other valid alternatives out there), I want to be able to handle higher-level features as soon as possible, which the current dashboard is far from offering. If I can avoid interfacing with different vendors (cloud, storage, visualization, etc.) to get the whole package, that's the path I'll take. Plus, AWS prices are hard to beat for the service…

1 Like

Check this out; it should help you get something set up in the near term. It's from AWS's previous "we do IoT" push at an earlier re:Invent. I will say that it seems as though you want Particle to give you every possible piece of the puzzle, so all you have to do is brand an app and call it a day. That seems highly unlikely given the breadth of what IoT can be. Anyways… it seems clear to me that Particle isn't attempting to compete in this layer of the stack, considering they haven't done anything in that space.

I think it is always a good idea to separate business decisions from perceived shortcomings. Nothing in the AWS IoT announcement addresses anything that is in the Dashboard today; they're different use cases: fleet management versus data ingest. In fact, nothing that was announced is a new capability, from what I can tell. Follow the link and you will see what I mean.

http://hack-day.s3-website-us-east-1.amazonaws.com/spark-arduino.html

@all, since Particle TLS support is scheduled (hopefully) for the end of the year, another option is to use a Node.js proxy to publish to AWS IoT. We can use webhooks to invoke Node.js scripts hosted somewhere and do the AWS publish there. I am also working on such a demo and will post it here once completed.
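
As a rough sketch of what such a proxy could look like (assuming Express, body-parser, and the "mqtt" package, with the same placeholder endpoint and certificates as earlier in the thread), the webhook would POST its JSON payload to a small service that republishes it to AWS IoT:

var express = require('express');
var bodyParser = require('body-parser');
var fs = require('fs');
var mqtt = require('mqtt');

// One persistent MQTT/TLS connection to the AWS IoT broker.
var broker = mqtt.connect('mqtts://YOUR-ENDPOINT.iot.us-east-1.amazonaws.com:8883', {
    key: fs.readFileSync('private.pem.key'),
    cert: fs.readFileSync('certificate.pem.crt'),
    ca: fs.readFileSync('root-CA.pem')
});

var app = express();
app.use(bodyParser.json());

// The Particle webhook POSTs its "json" payload here; forward it to an AWS IoT topic.
app.post('/publish', function (req, res) {
    broker.publish('photon/events', JSON.stringify(req.body));
    res.sendStatus(200);
});

app.listen(3000);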

1 Like

@LukeUSMC @krvarma Thanks for the link and suggestions, guys.

Using an intermediate relay is indeed an option, but that would require me to install another device at the user's place (which may not even be a bad idea, given the features my app requires). Unfortunately, that also means putting something else there, like an embedded Linux board (probably a BeagleBone), and then I start wondering why not let the BeagleBone do it all instead of BeagleBone + Photon…

I'm in fact not trying to have Particle do everything while I just put a brand on it; I rely on the Photon to do most of what I need locally, which is to collect, control, and send data on for processing/notification, hosted on Amazon for obvious reasons. So what's the point in having a webhook, or even Particle, deal with any cloud part?

I will still have to 1) code the local work on the Photon(s), 2) do something with the data, 3) code the app on the mobile phone, and 4) entertain the user… Not really just branding :wink:

1 Like

Just wanted to throw out the official Particle stance on this. We're totally supportive of connecting devices to other web services, because no one company will be able to provide all necessary services for all devices. The IoT is too big. Right now, I think there are some things we do that AWS probably will never do (over the air firmware updates), some things that we do better than AWS (firmware libraries, device management tools, well-integrated hardware), some things that AWS does better than us (device shadowing), and some things that AWS does that we don't do (data storage, visualization, analysis, etc.). I think it's fine and natural to use both services for different things.

Although one point that I'll be a stickler on - I totally disagree that "AWS prices are hard to beat for the service." They charge $5/million messages; we don't charge at all for messages. If you were to send one message per second (the maximum with our rate limit) for the whole month, you'd send about 2.6 million messages per device per month, which would mean that each device would cost $13/mo. That doesn't sound cheap to me... :wink:
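
For reference, the arithmetic behind that estimate, using only the numbers quoted above:

// One message per second, for a 30-day month, at $5 per million messages.
var messagesPerMonth = 1 * 60 * 60 * 24 * 30;             // 2,592,000, i.e. about 2.6 million
var costPerDevice = (messagesPerMonth / 1e6) * 5;         // about $12.96, so roughly $13/mo
console.log(messagesPerMonth, costPerDevice.toFixed(2));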

3 Likes

Thanks for the background, @zach. That’s super helpful.

I can’t speak for @peergum, but the storage is the big thing for me (and regardless of whether Particle offered storage, I need my stuff stored in AWS). I hope to have a lot of devices all over the place collecting data and dumping them into DynamoDB. AWS IoT claims to offer a fairly nice way to do that, with some additional functionality and flexibility beyond what I’m currently doing (API Gateway+Lambda). Not necessarily game-changing for my particular application, but enough to make me curious to play with it. Clearly not curious enough to actually start hacking on it yet, though :wink:

Not to go too far off-topic, but I’m really new to the Particle ecosystem, so I’m still getting the lay of the land. I’d previously been using Arduinos for the most part, so I’m used to having to roll my own for just about everything. On the one hand I love the additional functionality that the Particle cloud services offer. For example (as some of you may have seen from my other posts), I’ve been running into some trouble talking directly to the AWS API Gateway over HTTPS. As a stopgap, I am using Particle webhooks to talk to the API Gateway. Works great, and arguably is the right solution here.

But as someone with a lot of battle scars, I’m always hesitant to put more systems in the critical path than necessary. And for something as relatively simple as sending a small JSON blob to AWS every minute or two, an additional cloud layer on top makes me a little bit uneasy. That’s not a dig against Particle or the quality of your services, just an added single point of failure that seems extraneous for this particular use case.

2 Likes

Thanks for the feedback, @zach, and I understand your point(s). In my case, I have so far hit some limitations with the Photon (which is still a wonderful platform for the price), and I will probably need a BeagleBone or another Linux board in my solution (I need to store and forward images captured from a local webcam, and will require HTTPS among other things). The thing is, I may need to mix several devices, and most of them won't be able to talk to the Particle Cloud. My preference goes to AWS for the same reasons @sharding mentioned, as opposed to Microsoft (never been a fan). So having all of them talk to AWS would be a serious plus. I can still use webhooks, and I may do that in the meantime, but I also don't want too many intermediaries in the solution, so that would probably just be a temporary measure. I also really like the "shadow" capability offered by AWS.

Anyways, if Particle plans to have a proper TLS implementation very soon, that will definitely be a factor I'll take into consideration. I'm not sure I'll have the time to develop a library on my own (and maybe not even the technical skills), so whatever is in the pipe is good news at this point. I also wish (hope) that in the future Particle will work on a more complete solution embedding Linux…

Thanks to @sharding also for his valuable comments and feedback.

Sean,

Would you mind sharing details of the webhook you are using with the API Gateway?
It would greatly help me start using some AWS IoT services.

Thank you,
Gustavo.

2 Likes

Sure, it’s pretty simple. The webhook is like this:

{
    "eventName": "my-event-name",
    "url": "https://amazon-api-gateway-url",
    "headers": {
        "x-api-key": "my-key"
    },
    "requestType": "POST",
    "json": {
        "key1": "{{val1}}",
        "key2": "{{val2}}"
    },
    "mydevices": true
}

And then the API Gateway calls a Lambda function that currently just dumps the data into DynamoDB, in basically the same format it receives it. I'm not using any of the IoT stuff right now. My main goal at the moment is just to get the data into DynamoDB, and once I've gone as far as getting it to the API Gateway, I might as well just write directly to DynamoDB from Lambda… I'll probably play with the IoT stuff some more soon, but this is working for the moment.
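
For anyone wiring this up, here is one way to trigger the event from a laptop for testing, a sketch using the "request" package against the Particle REST API; the webhook's {{val1}}/{{val2}} placeholders get filled from the JSON string carried in the event's data field (on the device itself, Particle.publish() with the same JSON string does the equivalent):

var request = require('request');

// Publish a private event named "my-event-name"; the webhook above listens for it.
request.post('https://api.particle.io/v1/devices/events', {
    form: {
        name: 'my-event-name',
        data: JSON.stringify({ val1: 21.5, val2: 'ok' }),   // fills {{val1}} and {{val2}}
        private: true,
        access_token: 'YOUR-ACCESS-TOKEN'
    }
}, function (err, res, body) {
    console.log(err || body);
});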

4 Likes

BTW, the Lambda code is ultra simple, basically this (with a little bit of data validation specific to my application):

var doc = require('dynamodb-doc');   // DynamoDB document client bundled with the Lambda Node.js runtime
var dynamo = new doc.DynamoDB();
var tableName = 'my_table';

exports.handler = function(event, context) {
    // "event" is the JSON body forwarded by API Gateway; val1/val2 stand in for
    // whatever fields your request body actually carries.
    var write_params = {TableName: tableName};
    write_params.Item = {
        'key1': event.val1,
        'key2': event.val2
    };
    // context.done is passed as the putItem callback, so the invocation ends
    // when the write succeeds or fails.
    dynamo.putItem(write_params, context.done);
};

4 Likes

Awesome, thank you for the information!

Hi,

I had the same problem, but with the IBM IoT Foundation service. I wanted a secure connection and MQTT, and since I have plenty of resources on Bluemix, I simply wrote a message relay from the Particle Cloud to IoT Foundation (around 100 lines of Node.js code).

I guess a simple Node.js compute instance (around 256 MB) should scale fairly well to relay several hundred devices.
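
The rough shape of such a relay, as a sketch assuming the "particle-api-js" and "mqtt" packages; the IoT Foundation host, client id, credentials and topic below are placeholders to be replaced with your own organization's values:

var Particle = require('particle-api-js');
var mqtt = require('mqtt');

var particle = new Particle();

// MQTT/TLS connection to the IoT Foundation broker, authenticated as an application.
var iotf = mqtt.connect('mqtts://YOUR-ORG.messaging.internetofthings.ibmcloud.com:8883', {
    clientId: 'a:YOUR-ORG:particle-relay',
    username: 'YOUR-API-KEY',
    password: 'YOUR-AUTH-TOKEN'
});

// Subscribe to the account's device events on the Particle Cloud and republish each one.
particle.getEventStream({ deviceId: 'mine', auth: 'PARTICLE-ACCESS-TOKEN' })
    .then(function (stream) {
        stream.on('event', function (evt) {
            var topic = 'iot-2/type/photon/id/' + evt.coreid + '/evt/' + evt.name + '/fmt/json';
            iotf.publish(topic, JSON.stringify({ d: { data: evt.data } }));
        });
    });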

Any updates on this? I'm deciding right now whether I'm going to use my Intel Edison or a Photon for a project. I'd like to use the Photon, but I need to be able to send to and receive from AWS IoT for the project I'm doing.

EDIT: Actually, I'm not going to use AWS, but I'd still be interested in knowing whether you can use AWS IoT with the Photon yet.

1 Like

@KeithM: I have documented a way of using AWS in this article:


@sharding, thank you again for sharing your webhook.

2 Likes

Oh, thanks for the link! I can’t believe I didn’t see that :slight_smile:

1 Like