Hi guys,
I'm collecting data with my Boron and I want to store it in a DynamoDB table, but the docs section doesn't really show how to configure the integration. If someone has a solution I'd be really grateful.
Thank you,
Hi @hi06
Here are some links to create a DynamoDB table, create an IAM user and configure the DynamoDB permissions.
I've summarized those steps below.
You'll first need to create a table so you can get its ARN. Use `device_id` for your "Partition key" and `published_at` for your "Sort key".

Next you'll need to create an IAM user with the proper permissions:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem"
      ],
      "Resource": "<DYNAMO TABLE ARN>"
    }
  ]
}
Now go to your Particle Console and choose Integrations > AWS DynamoDB.
Try out some sample code:
void setup() {
}

void loop() {
  // Get some data
  String data = String(10);
  // Trigger the integration by publishing the event the webhook listens for
  Particle.publish("dynamo-data", data, PRIVATE);
  // Wait 60 seconds between publishes
  delay(60000);
}
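To see how the publish above maps onto the table, here's a sketch in Python of the kind of JSON payload a Particle webhook delivers and how its fields line up with the table's composite primary key. The payload shape and field names (`event`, `data`, `coreid`, `published_at`) are assumptions for illustration, not the exact wire format.

```python
import json

# Hypothetical webhook payload for the "dynamo-data" publish above
# (values are placeholders, not real device data).
payload = json.loads("""
{
  "event": "dynamo-data",
  "data": "10",
  "coreid": "0123456789abcdef01234567",
  "published_at": "2023-01-01T12:00:00.000Z"
}
""")

# Map the payload onto the table's composite primary key:
# device_id is the partition key, published_at is the sort key,
# so each device can store many timestamped readings.
item = {
    "device_id": payload["coreid"],
    "published_at": payload["published_at"],
    "data": payload["data"],
}

print(item)
```

Because `published_at` is the sort key, repeated publishes from the same device land as separate items rather than overwriting each other.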
I followed all the steps, but when I run the test I get an error; see the screenshot below.
I would check your user permissions in AWS. It's possible they have been misconfigured and are blocking traffic from Particle to your AWS resource (DynamoDB in this case).
Also confirm that you've correctly input the region, access key ID, and secret access key in the Particle integration configurator.
I have tested with Google Sheets to compare, but I also get an error.
Hi, a 401 HTTP error means:
a request was not successful because it lacks valid authentication credentials for the requested resource
So I would look into how you are entering the auth credentials (Bearer header if there is one, etc).
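To make the 401 behavior concrete, here's a minimal self-contained sketch (not AWS's or Google's actual auth scheme) of a server that rejects any request lacking the expected Bearer token. The token value and URL are placeholders.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen
from urllib.error import HTTPError

EXPECTED = "Bearer my-secret-token"  # placeholder credential

class AuthHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # 401 when the Authorization header is missing or wrong
        if self.headers.get("Authorization") != EXPECTED:
            self.send_response(401)
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), AuthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

# Without the header: the request is rejected, just as described above.
try:
    urlopen(Request(url, data=b"{}", method="POST")).close()
    status_without = 200
except HTTPError as e:
    status_without = e.code

# With the correct header: the request is accepted.
resp = urlopen(Request(url, data=b"{}", method="POST",
                       headers={"Authorization": EXPECTED}))
status_with = resp.getcode()
resp.close()
server.shutdown()

print(status_without, status_with)
```

The same request body succeeds or fails purely on the credentials, which is why a 401 points at the auth configuration rather than the data.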
Also, please note that this is very hard to troubleshoot without looking at the request being sent, so I suggest you use an online tool (or whatever means) to receive the request yourself and look at it.
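A local stand-in for such an online inspector can be sketched in a few lines of Python: a server that captures whatever arrives so you can examine the headers and body. The request it receives here is faked to mimic the publish above; the event name and body are assumptions for illustration.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

captured = {}

class InspectHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Record exactly what the client sent: headers and raw body.
        length = int(self.headers.get("Content-Length", 0))
        captured["headers"] = self.headers
        captured["body"] = self.rfile.read(length).decode()
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), InspectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# In practice you'd point the integration at this URL; here we send
# one fake request ourselves so the example is self-contained.
url = f"http://127.0.0.1:{server.server_port}/"
req = Request(url, data=b'{"event": "dynamo-data", "data": "10"}',
              headers={"Content-Type": "application/json"}, method="POST")
urlopen(req).close()
server.shutdown()

# Inspect what actually arrived - this is where mismatched or missing
# headers become visible.
print(captured["headers"].get("Content-Type"))
print(captured["body"])
```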
I describe how I do it in this post:
Can you double-check that your requests arrive with the shape and form that is needed and that you intended?
Best,
Thank you for the tip. I used it and data was sent successfully to Pipedream, but it didn't change anything for Google Sheets or AWS DynamoDB.
Ok, data will always be sent successfully to Pipedream since it accepts any traffic.
Now your task is to check the headers in the request and see whether they match your expectations - and what AWS and Google Sheets expect.