Best way to upload thousands of telemetry packets?

I have a Photon that is set up to collect and store sensor data. It reads from roughly 10 sensors every so often, and the resulting 10 data points are referred to as a packet. It keeps collecting and storing packets until its memory is full. Once memory is full, I want to post these packets (potentially thousands of them) online to an API. Right now I am using webhooks to do that, but with webhooks I am stuck at a rate of 1 packet/second, which is nowhere near fast enough.
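For concreteness, here is a minimal sketch of the publish-per-packet loop described above; `readPacketAsJson()` and the event name are placeholders, not my actual firmware:

```cpp
#include "Particle.h"

// Placeholder: in the real firmware this would read one stored packet
// (10 sensor values) out of memory and format it as JSON.
String readPacketAsJson(int index) {
    return String::format("{\"packet\":%d}", index);
}

void uploadAllPackets(int packetCount) {
    for (int i = 0; i < packetCount; i++) {
        // Each publish triggers one webhook; the cloud's publish rate limit
        // of roughly one event per second is what caps this at ~1 packet/s.
        Particle.publish("telemetry", readPacketAsJson(i), PRIVATE);
        delay(1000); // stay under the publish rate limit
    }
}
```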

What are my other options for posting this data? What might be some other avenues/libraries I should look into?

Any insight would be appreciated.

There are a few threads on this forum that show examples of pushing huge amounts of data from the Photon to an FTP server. In one test they transferred 1 gigabyte and didn't miss any data.

I think it was @rickkas7 who did it, so search for his FTP post and see what you find :smiley:

Currently the main drawback of using the on-board TCP features to upload that amount of data is that the communication is unencrypted, while Particle.publish() data is encrypted by default.
If you are happy with that, then TCPClient is your friend - if not, you might have to wait for one of the future system updates, which might (at some point) bring encryption for user code too.
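A minimal sketch of what that raw-TCP upload could look like; the server host, port, and framing below are assumptions, not anything from an existing library or thread:

```cpp
#include "Particle.h"

TCPClient client;
const char* SERVER_HOST = "my-ingest-server.example.com"; // assumption
const int   SERVER_PORT = 9000;                           // assumption

// Push one buffer of packed packets over a plain (unencrypted) TCP connection.
bool uploadBuffer(const uint8_t* data, size_t len) {
    if (!client.connect(SERVER_HOST, SERVER_PORT)) {
        return false; // connection failed
    }
    size_t sent = 0;
    while (sent < len) {
        // write() returns how many bytes were accepted; loop until done.
        size_t n = client.write(data + sent, len - sent);
        if (n == 0) {
            break; // assumption: treat a stalled write as failure
        }
        sent += n;
        Particle.process(); // keep the cloud connection serviced during long uploads
    }
    client.stop();
    return sent == len;
}
```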

There are some libraries and threads that deal with encryption, if you want to learn more.


I used HTTP POST to upload the data. You can get rates exceeding 400 Kbytes/second that way, but as ScruffR said, encryption can be a problem, though there are workarounds for that too. A little more information would help narrow down the potential solutions: how big the data is (in bytes), how often you need to upload, whether the upload needs to go over https (TLS/SSL encrypted), and whether you have the ability to run a separate server, even if it's just a computer running a node.js program or a Raspberry Pi on your home or office network.
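As a rough illustration (not the code from that test), an HTTP POST to a server you control can be hand-rolled over TCPClient; the host, port, and path below are assumptions, and this is plain http with no TLS:

```cpp
#include "Particle.h"

TCPClient http;
const char* POST_HOST = "192.168.1.50"; // assumption: e.g. a node.js server on the LAN
const int   POST_PORT = 8080;           // assumption
const char* POST_PATH = "/telemetry";   // assumption

// Send one batch of packets as the body of a plain-HTTP POST.
bool postBatch(const uint8_t* body, size_t bodyLen) {
    if (!http.connect(POST_HOST, POST_PORT)) {
        return false; // couldn't reach the server
    }
    http.print(String::format("POST %s HTTP/1.1\r\n", POST_PATH));
    http.print(String::format("Host: %s\r\n", POST_HOST));
    http.print("Content-Type: application/octet-stream\r\n");
    http.print(String::format("Content-Length: %u\r\n", (unsigned)bodyLen));
    http.print("Connection: close\r\n\r\n");
    http.write(body, bodyLen);

    // Drain (and ignore) the response; a real client would check the status line.
    unsigned long start = millis();
    while (http.connected() && millis() - start < 5000) {
        while (http.available()) {
            http.read();
        }
        Particle.process();
    }
    http.stop();
    return true;
}
```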
