Sending data to AWS Lambda (API Gateway)

Hi guys,
I'm collecting data with my Boron and I want to store it in a DynamoDB table, but the docs don't really show how to configure the integration. If someone has a solution, I would be really grateful.

Thank you,

Hi @hi06
Here are some links to create a DynamoDB table, create an IAM user and configure the DynamoDB permissions.

I've summarized those steps below.

You'll first need a table so you can get its ARN:

  1. Go to DynamoDB > Tables > Create Table
  2. Name your table
  3. You might choose device_id for your "Partition key" and published_at for your "Sort key"
  4. Adjust your settings as needed; the default settings will work
  5. Click on your new table and go to General Information > Additional Info
  6. Copy the ARN of the database
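If you'd rather script this step, here is a minimal sketch of the same table as AWS CLI input JSON (for example, passed to `aws dynamodb create-table --cli-input-json file://table.json`). The table name is a placeholder, and the key names assume the choices suggested above:

{
  "TableName": "<YOUR_TABLE_NAME>",
  "AttributeDefinitions": [
    { "AttributeName": "device_id", "AttributeType": "S" },
    { "AttributeName": "published_at", "AttributeType": "S" }
  ],
  "KeySchema": [
    { "AttributeName": "device_id", "KeyType": "HASH" },
    { "AttributeName": "published_at", "KeyType": "RANGE" }
  ],
  "BillingMode": "PAY_PER_REQUEST"
}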

Next you'll need to create an IAM user with the proper permissions:

  1. Go to IAM > Users > Create User
  2. Name your user
  3. Keep "Provide user access to the AWS Management Console" unchecked
  4. Choose "Attach policies directly"
  5. Click "Create Policy" (this will open a new tab)
  6. Choose "JSON" in the policy editor
  7. Paste the following (replace the Resource field with the table ARN you copied earlier):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem"
      ],
      "Resource": "<DYNAMO TABLE ARN>"
    }
  ]
}
  8. Choose "Next"
  9. Name your policy & save
  10. Go back to your IAM "Create User" tab, click the reload button, and select the new policy
  11. Click "Next" & "Create User"
  12. Click on the newly created user and choose "Security Credentials"
  13. Scroll down to "Create Access Key"
  14. Select "Third-party service"
  15. Check "I understand"
  16. Choose "Create access key"
  17. Note the "Access key" field and the "Secret access key" field
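Optionally, before wiring up Particle, you can sanity-check the new user's permissions: configure the AWS CLI with the new access key and try a PutItem. A minimal sketch of the input JSON (for example, passed to `aws dynamodb put-item --cli-input-json file://item.json`); the table name and attribute values are placeholders:

{
  "TableName": "<YOUR_TABLE_NAME>",
  "Item": {
    "device_id": { "S": "test-device" },
    "published_at": { "S": "2024-01-01T00:00:00Z" }
  }
}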

Now go to your Particle Console and choose Integrations > AWS DynamoDB.

  1. Name your service
  2. Enter your DynamoDB table name
  3. Fill out your AWS region, Access key ID, and Secret access key
  4. Adjust your JSON data as needed (see the sketch just below)
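I can't promise the template below matches exactly what the configurator shows, but as a sketch, assuming the integration accepts a DynamoDB-style item template with Particle's standard mustache variables, a mapping like this would store the device ID, publish timestamp, and event payload:

{
  "device_id": { "S": "{{{PARTICLE_DEVICE_ID}}}" },
  "published_at": { "S": "{{{PARTICLE_PUBLISHED_AT}}}" },
  "data": { "S": "{{{PARTICLE_EVENT_VALUE}}}" }
}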

Try out some sample code:

// The event name ("dynamo-data") must match the event your
// integration is configured to listen for.
void setup() {
}

void loop() {
  // Get some data (a fixed value here, just for testing)
  String data = String(10);
  // Trigger the integration
  Particle.publish("dynamo-data", data, PRIVATE);
  // Wait 60 seconds before the next publish
  delay(60000);
}

I followed all the steps, but when I run the test I get an error; see the screenshots below.
[screenshots of the error]

I would check your user permissions in AWS. It's possible they have been misconfigured and are blocking traffic from Particle to your AWS resource (DynamoDB in this case).

Also confirm that you've correctly input the region, access key ID, and secret access key in the Particle integration configurator.


I tested with Google Sheets to compare, but I also get an error there.
[screenshot of the error]

Hi, a 401 HTTP error means:
a request was not successful because it lacks valid authentication credentials for the requested resource

So I would look into how you are entering the auth credentials (the Bearer header, if there is one, etc.).
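As a concrete example: if you end up using a custom webhook rather than the built-in integration, the headers block of the webhook definition is where a Bearer token would go. A minimal sketch; the URL, token, and event name here are placeholders:

{
  "event": "dynamo-data",
  "url": "https://example.com/ingest",
  "requestType": "POST",
  "headers": {
    "Authorization": "Bearer <YOUR_TOKEN>"
  },
  "json": {
    "data": "{{{PARTICLE_EVENT_VALUE}}}"
  }
}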

Also, please note that this is very hard to troubleshoot without looking at the request being sent, so I suggest you use an online tool (or whatever means) to receive the request yourself and look at it.

I describe how I do it in this post:

Can you take a look and double-check that your requests arrive with the shape and form that's needed and that you intended?

Best,


Thank you for the tip. I used it and the data was sent successfully to Pipedream, but it didn't change anything for Google Sheets or AWS DynamoDB.

Ok, data will always be sent successfully to Pipedream, since it accepts any traffic.

Now your task is to check the headers and body of the request and see whether they match your expectations, and what AWS and Google Sheets expect.
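For comparison, a default Particle webhook POSTs a JSON body roughly like this (the values here are placeholders); check that what lands in Pipedream looks like what the receiving service expects:

{
  "event": "dynamo-data",
  "data": "10",
  "published_at": "<TIMESTAMP>",
  "coreid": "<DEVICE_ID>"
}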

1 Like