Ingest data into AWS Kinesis directly from Particle Cloud

Hey,
Does anyone have experience with, or any ideas about, sending a JSON payload to ingest data into AWS Kinesis?
I use Ubidots right now to visualize data, but I want to switch to AWS QuickSight for analyzing and visualizing it. Is it necessary to send the data to AWS IoT Core and then forward it to Kinesis from there, or can I send it directly from a webhook to Kinesis? Once I have the data in Kinesis, I can load it into a QuickSight dashboard.
Thanks,
Hannan

Hi @hannanmustajab, thanks for posting, and welcome to the Particle community! I’m not personally familiar with AWS Kinesis, but I would expect you’d need to stream data into AWS IoT Core first and forward it to Kinesis from there. This is similar to the Azure IoT Hub approach. Do you happen to have a good doc I could look at on ingesting data from external sources into Kinesis?

Hi @bsatrom, thanks for responding. I was able to get that done.
First I created a stream in Kinesis, then made an API in AWS API Gateway and connected it to the Kinesis stream. Finally, I set up Firehose to read data from the Kinesis stream. The only issue I’m still facing is sending the data from webhooks to the Kinesis stream. I followed a guide (https://medium.com/a-tale-of-2-from-data-to-information/how-to-build-an-event-pipeline-within-1-hour-and-minimum-lines-of-code-in-aws-eb1bd0bb6cd2) in which the author created this mapping template to feed data in. I’m not sure if I have to create a similar JSON object.

{
    ## The {stream-name} template parameter, taken from the URL:
    ## <api>/Stream/{stream-name}
    "StreamName": "$input.params('stream-name')",
    ## The data we received, base64-encoded, plus a base64-encoded newline (Cg==)
    "Data": "$util.base64Encode($input.json('$.Data'))Cg==",
    "PartitionKey": "$input.path('$.PartitionKey')"
}
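
If I’m reading the template right, after API Gateway substitutes the variables it produces a Kinesis PutRecord request body like this (made-up example; here the Data value is the base64 of {"Alerts":3} followed by the appended newline):

{
    "StreamName": "my-stream",
    "Data": "eyJBbGVydHMiOjN9Cg==",
    "PartitionKey": "1"
}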

Thanks,
Hannan

Yes, when you set up your webhook, you’ll need to create a custom JSON payload that includes whatever Kinesis expects (StreamName and PartitionKey, I expect) and pass the event data from your devices into the Data field. Docs on this are here: https://docs.particle.io/tutorials/device-cloud/webhooks/#custom-template

@bsatrom
This is how my webhook looks right now.

{
  "Data": {
    "Alerts": "{{Alerts}}",
    "Resets": "{{Resets}}",
    "Battery": "{{Battery}}",
    "Distance": "{{Distance}}",
    "Temperature": "{{Temperature}}"
  },
  "PartitionKey": 1
}

On testing it, I get this error.

HTTP/1.1 200 OK
Date: Wed, 15 Apr 2020 08:39:54 GMT
Content-Type: application/json
Content-Length: 35
Connection: keep-alive
x-amzn-RequestId: 5ed636fb-fa66-4f23-a143-e8a3df42cf8c
x-amz-apigw-id: LBQ-MGSLiYcFgBw=
X-Amzn-Trace-Id: Root=1-5e96c85a-9df4e2c0d48fc800ad1a8f80

{"__type":"SerializationException"}

@hannanmustajab I assume that Alerts et al. are well-formed JSON attributes inside the Particle publish data payload, correct? If so, you’ll want to use the triple-curly-brace format for extracting those into your custom payload; unlike double braces, triple braces keep Mustache from HTML-escaping the values. Like so…

{
  "Data": {  
    "Alerts": "{{{Alerts}}}",
    "Resets": "{{{Resets}}}",
    "Battery": "{{{Battery}}}",
    "Distance": "{{{Distance}}}",
    "Temperature": "{{{Temperature}}}"
  },
  "PartitionKey": 1
}
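
For instance, if your device publishes event data like this (hypothetical values), the triple braces will pass each value through unescaped:

{
  "Alerts": 2,
  "Resets": 0,
  "Battery": 3.92,
  "Distance": 120,
  "Temperature": 22.5
}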

I am still getting the same error. I think it has something to do with the JSON mapping template I added in API Gateway.

Can you post a screenshot of your custom payload?

I’m not sure what you mean by custom payload. I’ll attach all the screenshots.

These are all the screenshots. You can refer to this document to get an idea. I think it accepts binary data, but I saw that some people have managed to convert it.

@bsatrom


This looks similar to my setup; I tried doing this, but that doesn’t work either.

Hi @hannanmustajab,

I found your problem, because I had the same thing happen to me.

Update your mapping template as shown below.

The issue you are seeing is due to the “new line” character that the tutorial’s template appends to the data.
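
In other words, drop the base64-encoded newline (Cg==) that the tutorial appends to the Data field. Your template should end up looking something like this (the tutorial’s template, minus the Cg==):

{
    "StreamName": "$input.params('stream-name')",
    "Data": "$util.base64Encode($input.json('$.Data'))",
    "PartitionKey": "$input.path('$.PartitionKey')"
}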

AND…

you must re-deploy your API and use the new URL; after that, your data should be saved as expected.

Hi @Adam42,
I figured it out and it works now. Thanks for your response. Have you worked with Kinesis to send data to S3?
If yes, then there’s something else that’s causing a problem. Can you help with that?
Thanks,
Hannan

Hi @hannanmustajab

Yes, I was able to send it to S3 as well.
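
In case it helps, this is roughly how I wire the delivery stream up (a boto3 sketch; the stream name, bucket, and ARNs are placeholders):

import boto3

firehose = boto3.client("firehose")

# Create a Firehose delivery stream that reads from the Kinesis
# stream and writes to an S3 bucket (placeholder names/ARNs).
firehose.create_delivery_stream(
    DeliveryStreamName="iot-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": "arn:aws:kinesis:us-east-1:123456789012:stream/my-stream",
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
    },
    S3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
        "BucketARN": "arn:aws:s3:::my-iot-data-bucket",
        "Prefix": "xyzIoTProduct/",
    },
)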

What else are you having trouble with?

Keep messaging and I’ll do my best.

Hi,
Thanks for replying. I am working with a Kinesis stream that sends data to S3 using Firehose. Right now I just prefix all the data with something like “xyzIoTProduct”. It was working fine with QuickSight until I started sending data from another product. The data shows up in the S3 bucket, but it won’t show up in the QuickSight dataset. After doing a bit of research, I found out that all the records should have the same number of columns, but in my case one product has 3 fields while the other has 21.
Please refer to this link. Can you suggest how to fix it?
What I thought was to create a Lambda function that checks each record at ingestion time, looks up its device ID in a dictionary, and maps it to the right project. Depending on that, it would send the record to a particular directory in S3, and I would add that directory to the manifest.json file. Can you suggest some other way to get this done, as I am not at all familiar with Lambda functions?
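
Roughly, what I have in mind is something like this (an untested sketch; the device IDs, project names, bucket, and the coreid field in the payload are all made up):

import base64
import json

import boto3

s3 = boto3.client("s3")

# Placeholder map of Particle device IDs to project names
DEVICE_PROJECTS = {
    "e00fce68xxxxxxxxxxxxxxxx": "product-a",
    "e00fce68yyyyyyyyyyyyyyyy": "product-b",
}

BUCKET = "my-iot-data-bucket"  # placeholder bucket name

def handler(event, context):
    # Triggered by the Kinesis stream: route each record into a
    # per-project S3 directory so every directory has one schema.
    for record in event["Records"]:
        # Kinesis delivers the payload base64-encoded
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # Assumes the webhook payload carries the device ID as "coreid"
        device_id = payload.get("coreid", "unknown")
        project = DEVICE_PROJECTS.get(device_id, "unmapped")
        key = "{}/{}.json".format(project, record["kinesis"]["sequenceNumber"])
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload))
    return {"records": len(event["Records"])}

Each project would then get its own directory, and each QuickSight manifest would only point at records with the same set of columns.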
Thanks
Hannan

Hi @hannanmustajab,

Hmm, that does seem like a tricky problem.

Unfortunately I don’t know how to solve that.

But it seems you are on the right track with breaking the problem up into smaller pieces.

Have you used Lambda to add data to S3?
@Adam42

You know, I haven’t done that before.

I’m typically just going from Kinesis directly to S3.