Measuring cellular data usage from DeviceOS

Hey @Dan-Kouba , @rickkas7 et al.,

When I see posts asking why a particular device is consuming so much data, it makes me wonder whether there is a way, from Device OS, to find out exactly what is happening at the cellular connection, on the device.

For instance, can Device OS detect when the case below is happening, so we can increment a counter to watch in our user code and then take a particular action?

PS: I can imagine that many factors affect the cellular connection, and particular solutions (like antenna positioning) may depend on the deployment type, the enclosure used, etc. What I wonder in this post is whether there is any means for user firmware to detect the situation and potentially act accordingly (who knows, sleep for an hour if the last 5 sessions consumed a lot of data, or whatever).

Thank you!

There is no good way to measure data usage on-device for most current devices.

If you are looking to characterize your baseline data usage as opposed to detecting a problem, the poorly-named electron-cloud-manipulator can be used to monitor the exact data usage by a single device in real time. It can also do things like introduce data loss so you can test your bad reception logic.


@gusgonet, it's my strong belief that the core of the issue, in most cases, is that RF is very difficult to do right.

Nearly every time we are working with a customer who notices high data consumption, the root cause boils down to one or more of:

  1. RF: antenna selection, design, or placement relative to other electronics or metal
  2. Operating environment: physical locations and placement of devices relative to RF absorbers or emitters, or cellular infrastructure
  3. Power: insufficient current, noise, dropouts, unintentional radiation, etc.

While more firmware-side instrumentation certainly could provide details on connectivity at the modem level, the fact that most mitigations are at the hardware level limits the value of that investment. If anything, those tools would simply confirm the existence of a hardware issue.

Additionally, our commercial models at deployment scale generally account for uneven usage across fleets. Our philosophy around connectivity is "it just works", and as a result our intention as a business is to make these sorts of connectivity details something our users do not need to spend significant energy optimizing and analyzing. There are always going to be extreme cases that require specific attention, but having a distribution of devices consuming different amounts of data across a fleet is normal and expected. More importantly, it means Device OS is doing what we designed it to do!

All that said, I think most developer energy should initially be focused on evaluating RF performance and power supply design. Only once those things are proven solid is digging into firmware the right next step.

Understood, thank you both for your insights.