Recommendation for downloading binary files from my server

I have a project I'm working on where I want to download some binary files from my server and cache them to an SD card or local SPI flash.

The server is Ubuntu 20.04 running Laravel and PostgreSQL.

The devices that need the new binary files are Electrons and Borons.

The options as far as I can see are:

Embed files within device firmware releases

Embed binary files within the Particle device firmware that is flashed to the Electron or Boron via cellular network.


Pros:

  • Easiest
  • Doesn’t use Particle Events
  • Whole OTA update mechanism is pre-built and “just works”


Cons:

  • Takes up device flash memory
  • Requires a device OTA cellular firmware update to update local files (interrupts device functionality temporarily)
  • Requires multiple device firmware releases to transfer/cache large amounts of data

Particle Cloud API Device Function Calls


Pros:

  • Easy
  • Secure
  • Uses least cellular data?


Cons:

  • Consumes Particle Events
    • approximately 50 per 32 kB binary file
    • this would equate to 150 extra Particle Events per month per device in my case (2% of my monthly allowance)

TCP Client


Pros:

  • Fast
  • Doesn’t use Particle Events


Cons:

  • Uses more cellular data
  • Insecure (will it even work with our server then?)

UDP Client


Pros:

  • Fast (compared to TCP)
  • Doesn’t use Particle Events


Cons:

  • Uses more cellular data
  • Insecure (will it even work with our server then?)

TLS TCP Client - Particle Library


Pros:

  • Doesn’t use Particle Events


Cons:

  • Uses even more cellular data (several kB of additional encryption overhead)
  • Uses a community library… not verified


Actually, in writing this, the choice seems clear:

I should use Particle function calls and Particle webhooks to sync with the server:

  • The Electron publishes an event containing the status/CRC/version metadata for the files it currently has stored locally (the event is forwarded to my server via a Particle webhook integration)
  • The server checks the device's file metadata against the latest available versions stored in its database
  • If a difference exists, the server begins sending chunks of the file to the device via Particle Cloud API function calls
  • My app caches the data in RAM and returns the total file bytes written so far
  • A few ms later, my app asynchronously caches the chunk to local non-volatile memory
  • The server continues to pepper the app with Particle function calls until all of the files have been transferred
  • The server can check on the status of the transfer at any time by reading a Particle variable that I maintain, which contains the same JSON metadata published in the original event (file statuses, CRCs, bytes written, etc.)

I love this brainstorming forum!

Would love to hear anyone's thoughts on this.

There is a partial option not listed: Asset OTA. This isn't a solution for the Electron, but it does work on the Boron. Assets are shipped by the OTA system but do not count toward the user application size (256 KB on the Boron). Binary files are fine, and you can store up to around 1 MB of additional asset data on the Boron. If your data only needs to be streamed sequentially (not random-access), you can leave the assets in system asset storage and don't need to copy them to external or internal flash, though you can do that if you want to or if you need random access.


Once again Beldar, you have pulled me from the fire! (at least for my Boron and Argon devices)


Let me see if I understand "Asset OTA" properly, because I'm a bit unclear on how it works with respect to the Boron memory map:

  • When I compile a firmware binary, I can also specify certain assets to be delivered during the OTA update. Particle Workbench will then create a .zip file that I can upload to the Product Dashboard Firmware section instead of the typical .bin file
  • A normal device firmware update happens on the Boron; the OTA stream is written to the OTA section of the external SPI flash
  • Once the OTA transfer is complete and the app has been written to external SPI flash, the bootloader burns the Boron's internal flash from the binary image that was just cached
  • On boot, Device OS sees that there are assets to be delivered, goes into safe mode, and continues the OTA process, now downloading the assets to the OTA section of external SPI flash
  • Once all assets have been synced, the user app is launched

Later during app runtime:

  • I can check whether new assets have been delivered, and stream them to coprocessors if so
  • I can load an asset as a sequential stream from external SPI flash into RAM and stream it to a co-processor

Is that about the size of it?

Boron Memory map

nRF52840 flash layout overview

  • Bootloader (48KB, @0xF4000)
  • User Application
    • 256KB @ 0xB4000 (Device OS 3.1 and later)
    • 128KB @ 0xD4000 (Device OS 3.0 and earlier)
  • System (656KB, @0x30000)
  • SoftDevice (192KB, @0x00000)

External SPI flash layout overview (dfu offset: 0x80000000)

  • OTA (1500KB, @0x00289000)
  • Reserved (420KB, @0x00220000)
  • FAC (128KB, @0x00200000)
  • LittleFS (2MB, @0x00000000)

Yes, that is correct.

Additionally, assets are only delivered to the device if they have changed (name + hash), so if they don't change, you won't use any more cellular data than delivering the user firmware binary alone.
