Could air quality affect signal quality and strength?

We currently have devices in NYC. Due to the recent wildfires, New York City is experiencing extremely poor air quality ratings, and our devices there have been seeing a higher-than-average number of errors. Could there be a relationship?

In my experience, fire (as in actual flames) does interfere with radio signals, since it can produce plasma that adds broadband interference across a wide range of frequencies.

Smoke in the air is generally less of a problem, since it is refractive and has a measurable but relatively small effect on radio propagation. I'm sure there are exceptions for specific frequencies, and I would worry about line-of-sight links at, say, 40 GHz a lot more than about typical WiFi or cellular signals. As RF approaches the frequency bands of light, the problems would likely get worse.
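If you want a rough feel for why the higher frequencies are the ones to watch, here is a minimal back-of-the-envelope sketch in Python. The free-space path loss formula is standard; the per-kilometre smoke attenuation numbers are placeholder assumptions for illustration only, not measured values.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (standard FSPL formula)."""
    c = 3.0e8  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Hypothetical excess attenuation from smoke/particulates, in dB per km.
# These figures are illustrative assumptions, not measurements.
smoke_db_per_km = {
    2.4e9: 0.01,   # typical WiFi band: essentially negligible
    40e9:  0.5,    # millimetre-wave line-of-sight link: noticeably larger
}

link_km = 1.0
for freq, smoke_loss in smoke_db_per_km.items():
    base = fspl_db(link_km * 1000, freq)
    total = base + smoke_loss * link_km
    print(f"{freq/1e9:>5.1f} GHz: FSPL {base:.1f} dB "
          f"+ smoke {smoke_loss * link_km:.2f} dB = {total:.1f} dB")
```

The point of the sketch is just the scaling: the frequency-dependent free-space loss and any particulate attenuation both grow with frequency, so a 40 GHz line-of-sight link has far less margin to give up than a 2.4 GHz WiFi or sub-6 GHz cellular link over the same path.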

Thank you for the insights.
