Where is the Snowflake integration for Particle Cloud / how to export Particle data to Snowflake?

Hi everyone,

I was reviewing the Particle Cloud platform page and saw references to routing device data to third-party analytics systems and external data platforms.

I’m specifically trying to understand how to integrate Particle device data with Snowflake for analytics and reporting, but I haven’t been able to find a clear, step-by-step guide or native connector in the docs yet.

Could someone clarify:

  1. Does Particle currently offer a built-in / native Snowflake integration?

  2. If not, what is the recommended architecture to move Particle event streams into Snowflake?

    • Webhooks to a custom API?

    • Streaming pipeline?

    • Third-party ETL tools?

  3. Are there any best practices for near-real-time ingestion and schema design for IoT telemetry in Snowflake?

  4. Any official docs, examples, or reference architectures you can point me to?

Our use case is ingesting sensor telemetry from Particle devices and making it available in Snowflake for dashboards, anomaly detection, and long-term analytics.

Thanks in advance for the help!

Hi @srirammamidi, while we don't have a native integration for Snowflake, you can easily configure your own using webhooks.

I have an example of how to do this with InfluxDB that would be a good starting point.
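The same pattern applies here. As a rough sketch, a custom webhook definition in the Particle Console looks something like the following (the `url` is a placeholder for whatever ingestion endpoint you stand up; the template variables are Particle's standard webhook variables):

```json
{
  "event": "sensor-data",
  "url": "https://example.com/ingest",
  "requestType": "POST",
  "noDefaults": true,
  "json": {
    "event": "{{{PARTICLE_EVENT_NAME}}}",
    "data": "{{{PARTICLE_EVENT_VALUE}}}",
    "device_id": "{{{PARTICLE_DEVICE_ID}}}",
    "published_at": "{{{PARTICLE_PUBLISHED_AT}}}"
  }
}
```

Whatever service receives that POST is then responsible for landing the rows in Snowflake.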


There isn't a built-in solution, but another way to do it is to use a Particle Cloud webhook pointed at Stitch (or another similar intermediate service) that connects to Snowflake.


What about the REST API functionality mentioned on the company website - can that be used for this type of integration as well? If so, what would the implementation steps look like, and are there any tradeoffs compared to using webhooks or an intermediate streaming/ETL service?

Yes, a custom webhook will ultimately make a REST API request against the endpoint of your choosing.

Following Rick's recommendation, you'd configure your webhook to make an API request to Stitch, which is integrated with your Snowflake instance.

The flow is generally:

  1. Your device calls Particle.publish() to send data to an event stream
  2. A webhook you configured in the Particle Console to listen to that event stream fires
  3. The webhook makes a request to the endpoint you configured
  4. The receiving service does something with that data (in your case, stores it in Snowflake)
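To make step 4 concrete, here is a minimal sketch of the receiving side: turning a Particle webhook body into a parameterized Snowflake INSERT. The table name and column layout are assumptions for illustration, and the payload field names (`event`, `data`, `coreid`, `published_at`) assume the webhook's default JSON body; actual loading would go through a connector such as snowflake-connector-python.

```python
import json

# Hypothetical target table; adjust to match your Snowflake schema.
TABLE = "PARTICLE_TELEMETRY"

def webhook_to_insert(body: str):
    """Turn a Particle webhook JSON body into a parameterized INSERT
    statement plus bind values, in the %s style that
    snowflake-connector-python's cursor.execute() accepts."""
    payload = json.loads(body)
    sql = (
        f"INSERT INTO {TABLE} (EVENT_NAME, DEVICE_ID, PUBLISHED_AT, DATA) "
        "VALUES (%s, %s, %s, %s)"
    )
    params = (
        payload["event"],
        payload["coreid"],
        payload["published_at"],
        payload["data"],
    )
    return sql, params

# In a real service this runs inside an HTTP handler, roughly:
#   conn = snowflake.connector.connect(...)   # credentials omitted
#   conn.cursor().execute(*webhook_to_insert(request_body))
```

For dashboards and anomaly detection, batching rows (or landing them via a stage and COPY INTO) will scale better than row-by-row INSERTs, but the shape of the transformation is the same.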