MQTT and new AWS IoT service


#9

@LukeUSMC @krvarma Thanks for the link and suggestions, guys.

Using an intermediate relay is indeed an option, but it would require installing another device at the user’s place - which may not even be a bad idea given the features my app requires. Unfortunately, it also means adding something else, like an embedded Linux board (probably a BeagleBone), and then I start wondering why not let the BeagleBone do it all instead of BB+Photon…

I’m not, in fact, trying to have Particle do everything while I just put a brand on it; I rely on the Photon to do most of what I need locally - collect, control, and send data on to the processing/notification side, which will be hosted on Amazon for obvious reasons. So what’s the point of having a webhook, or Particle at all, handle any of the cloud part?

I will still have to 1) code the local work on the Photon(s), 2) do something with the data, 3) code the app on the mobile phone, and 4) keep the user entertained… Not really just branding :wink:


#10

Just wanted to throw out the official Particle stance on this. We’re totally supportive of connecting devices to other web services, because no one company will be able to provide all necessary services for all devices. The IoT is too big. Right now, I think there are some things we do that AWS probably will never do (over the air firmware updates), some things that we do better than AWS (firmware libraries, device management tools, well-integrated hardware), some things that AWS does better than us (device shadowing), and some things that AWS does that we don’t do (data storage, visualization, analysis, etc.). I think it’s fine and natural to use both services for different things.

One point that I’ll be a stickler on, though - I totally disagree that “AWS prices are hard to beat for the service.” They charge $5 per million messages; we don’t charge for messages at all. If you were to send one message per second (the maximum with our rate limit) for a whole month, you’d send about 2.5 million messages per device per month, which would mean each device costs around $13/mo. That doesn’t sound cheap to me… :wink:
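To spell out the arithmetic (rough numbers, assuming one message per second over a 30-day month):

// Rough cost math for the comparison above (illustrative only).
var messagesPerMonth = 1 * 60 * 60 * 24 * 30;   // ~2.6 million messages
var awsCostPerMillionMsgs = 5;                  // $5 per million messages
var costPerDevicePerMonth = (messagesPerMonth / 1e6) * awsCostPerMillionMsgs;
console.log(messagesPerMonth, costPerDevicePerMonth);  // 2592000, ~$12.96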


#11

Thanks for the background, @zach. That’s super helpful.

I can’t speak for @peergum, but the storage is the big thing for me (and regardless of whether Particle offered storage, I need my stuff stored in AWS). I hope to have a lot of devices all over the place collecting data and dumping them into DynamoDB. AWS IoT claims to offer a fairly nice way to do that, with some additional functionality and flexibility beyond what I’m currently doing (API Gateway+Lambda). Not necessarily game-changing for my particular application, but enough to make me curious to play with it. Clearly not curious enough to actually start hacking on it yet, though :wink:

Not to go too far off-topic, but I’m really new to the Particle ecosystem, so I’m still getting the lay of the land. I’d previously been using Arduinos for the most part, so I’m used to rolling my own for just about everything. On the one hand, I love the additional functionality that the Particle cloud services offer. For example (as some of you may have seen from my other posts), I’ve been running into some trouble talking directly to the AWS API Gateway over HTTPS. As a stopgap, I’m using Particle webhooks to talk to the API Gateway. It works great, and is arguably the right solution here.

But as someone with a lot of battle scars, I’m always hesitant to put more systems in the critical path than necessary. And for something as relatively simple as sending a small JSON blob to AWS every minute or two, an additional cloud layer on top makes me a little bit uneasy. That’s not a dig against Particle or the quality of your services, just an added single point of failure that seems extraneous for this particular use case.


#12

Thanks for the feedback, @zach, and I understand your point(s). In my case, I have so far run into some limitations with the Photon (which is still a wonderful platform for the price), and I will probably need a BeagleBone or another Linux board in my solution (I need to store and forward images captured from a local webcam and will require HTTPS, among other things). The thing is, I may need to mix several devices, and most of them won’t be able to talk to the Particle cloud. My preference goes to AWS for the same reasons @sharding mentioned, as opposed to Microsoft - never been a fan. So having all of them talk to AWS would be a serious plus. I can still use webhooks, and I may do that in the meantime, but I also don’t want too many intermediaries in the solution, so that would probably just be a temporary measure. I also really like the “shadow” ability offered by AWS.

Anyway, if Particle plans to have a proper TLS implementation very soon, that will definitely be a factor I’ll take into consideration. I’m not sure I’ll have the time to develop a library on my own - and maybe not even the technical skills - so whatever is in the pipeline is good news at this point. I also hope that in the future Particle will work on a more complete solution embedding Linux…

Thanks also to @sharding for his valuable comments and feedback.


#13

Sean,

Would you mind sharing the details of the webhook you’re using to reach the API Gateway?
It would greatly help me get started with some AWS IoT services.

Thank you,
Gustavo.


#14

Sure, it’s pretty simple. The webhook is like this:

{
    "eventName": "my-event-name",
    "url": "https://amazon-api-gateway-url",
    "headers": {
        "x-api-key": "my-key"
    },
    "requestType": "POST",
    "json": {
        "key1": "{{val1}}",
        "key2": "{{val2}}"
    },
    "mydevices": true
}

And then the API Gateway calls a Lambda function that currently just dumps the data into DynamoDB in basically the same format it arrives in. I’m not using any of the IoT stuff right now. My main goal at the moment is just to get the data into DynamoDB, and once I’ve gone as far as getting it to the API Gateway, I might as well just write directly to DynamoDB from Lambda… I’ll probably play with the IoT stuff some more soon, but this is working for the moment.
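If it helps, you can also exercise the webhook without a device by publishing the event from the JavaScript API. A quick sketch (it assumes the particle-api-js package and a valid access token; the event name and values just mirror the template above):

// Sketch: trigger the webhook from a laptop instead of a Photon.
// Assumes the particle-api-js npm package and a valid access token.
var Particle = require('particle-api-js');
var particle = new Particle();

var token = 'YOUR_ACCESS_TOKEN';  // placeholder

// The event data is JSON, so the webhook's {{val1}}/{{val2}} template
// fields get filled from these values.
particle.publishEvent({
    name: 'my-event-name',
    data: JSON.stringify({ val1: 42, val2: 17 }),
    isPrivate: true,
    auth: token
}).then(
    function(result) { console.log('published:', result.body.ok); },
    function(err) { console.error('publish failed:', err); }
);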


#15

BTW, the Lambda code is ultra simple, basically this (with a little bit of data validation specific to my application):

// 'dynamodb-doc' is the document-client helper available in the Lambda
// Node.js runtime; it maps plain JS values to DynamoDB attribute types.
var doc = require('dynamodb-doc');
var dynamo = new doc.DynamoDB();
var tableName = 'my_table';

exports.handler = function(event, context) {
    // Copy the relevant fields from the incoming event into the item.
    var write_params = {TableName: tableName};
    write_params.Item = {'key1': event.val1,
                         'key2': event.val2,
    };
    // Write the item; context.done ends the invocation when the call returns.
    dynamo.putItem(write_params, context.done);
};

#16

Awesome, thank you for the information!


#17

Hi,

I had the same problem, but with the IBM IoT Foundation service. I wanted a secure connection and MQTT, and since I have plenty of resources on Bluemix I simply wrote a message relay from the Particle Cloud to IoT Foundation (roughly 100 lines of Node.js code).

I’d guess a simple Node.js compute instance (around 256 MB) should scale fairly well to relay several hundred devices.
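The core of such a relay is just subscribing to the Particle event stream and republishing each event over MQTT. A stripped-down sketch of the idea (the broker URL, topic, and credentials are placeholders, and it assumes the particle-api-js and mqtt npm packages rather than whatever is in my actual code):

// Minimal relay sketch: Particle Cloud event stream -> MQTT broker.
// Broker URL, topic, and credentials are placeholders.
var Particle = require('particle-api-js');
var mqtt = require('mqtt');

var particle = new Particle();
var particleToken = 'YOUR_PARTICLE_TOKEN';

// Secure MQTT connection to the target broker (IoT Foundation, AWS IoT, ...).
var client = mqtt.connect('mqtts://your-broker.example.com:8883', {
    username: 'your-user',
    password: 'your-password'
});

client.on('connect', function() {
    // Stream all events from my devices and republish each one.
    particle.getEventStream({ deviceId: 'mine', auth: particleToken })
        .then(function(stream) {
            stream.on('event', function(event) {
                // event contains name, data, published_at, coreid
                var topic = 'relay/' + event.coreid + '/' + event.name;
                client.publish(topic, JSON.stringify(event));
            });
        });
});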


#18

Any updates on this? I’m deciding right now whether to use my Intel Edison or a Photon for a project. I’d like to use the Photon, but I need to be able to send to and receive from AWS IoT for this project.

EDIT: Actually, I’m not going to use AWS, but I’d still be interested in knowing whether you can use AWS IoT with the Photon yet.


#19

@KeithM: I have documented a way of using AWS in this article:


@sharding thank you again for sharing your webhook.


#20

Oh, thanks for the link! I can’t believe I didn’t see that :slight_smile:


#21

Any news, good or bad, on getting support for either MQTT-TLS or HTTPS in the Photon?
Thanks :smile:


#22

Anyone get this working with AWS IoT?


#24

Hello @sharding,
I want to update a device shadow through a Lambda function. I’m new to Lambda functions and not very familiar with Node.js, so could you please help me with how to update a device shadow from a Lambda function?
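From what I’ve read so far, I think it would look roughly like the sketch below, but I haven’t gotten it working - the endpoint, thing name, and field names are just placeholders:

// Rough, untested sketch: update a device shadow from Lambda using the
// AWS SDK's IotData client. Endpoint, thing name, and fields are placeholders.
var AWS = require('aws-sdk');

// Account-specific AWS IoT endpoint.
var iotData = new AWS.IotData({ endpoint: 'xxxxxxxxxx.iot.us-east-1.amazonaws.com' });

exports.handler = function(event, context) {
    var params = {
        thingName: 'my-thing',
        payload: JSON.stringify({
            state: {
                desired: {
                    someField: event.someValue
                }
            }
        })
    };
    // Push the desired state into the shadow and finish the invocation.
    iotData.updateThingShadow(params, context.done);
};

Thank you.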


#25

Anyone get this working with AWS IoT?

Presumably it won’t happen until there is legit TLS support on Particle. Here’s an effort/discussion: Any mbed TLS efforts yet?

The “what about us???” attitude from Particle seems to miss the point: many of us have large stacks on AWS already, which makes service integration much faster (especially given the Photon’s limited HTTP/HTTPS support, which means REST can’t be used with existing solutions).


#27

I used the programmer shield to re-flash the Photon so it has a direct connection to AWS IoT. By doing that you will lose all the features of the original firmware.

I successfully connected to AWS IoT from the Photon.

I found this article on Broadcom’s forum. (I posted my modification on that forum too.)

Download and unzip WICED-SDK-3.5.2.7z.zip (you need to register).

By following the article you will create a new profile. If you follow the step-by-step at the end of the post, you will modify an existing profile instead. This will let you run Broadcom’s example, which connects to AWS IoT over MQTT.


#29

+1 for AWS IoT SDK support. I know it’s not easy, but it would be great to have it officially supported in Particle (or at least MQTT over TLS with client-side certificates for device authentication).


#30

+1 for AWS IoT support

The Adafruit WICED WiFi Feather now supports AWS IoT: https://blog.adafruit.com/2016/07/26/adafruit-wiced-now-supports-amazons-aws/


#31

The Adafruit WICED WiFi Feather now supports AWS IoT

Whoa, that’s huge.

…anyone want to buy my assorted Particles?