WiFi.RSSI never better than -44 dB

Hi all,

Is it correct that the RSSI the Spark reports through WiFi.RSSI cannot get any better than -44 dB? I have connected my (antenna version) Core directly to the antenna plug of my router and am still getting “only” -44 dB, where I would expect something in the low tens, or ideally 0 dB.

I am using the RSSI value for a scientific experiment (more in a “project share” post up soon), so it is quite important to me to understand what the limitations of the Core are in this regard.

thanks in advance,


Hi @RolfHut

Do you have a lot of experience in RF work? Sometimes these concepts are not easy to explain in a post like this. I am not sure your described methodology of connecting the router’s antenna connection to the core’s antenna connection is a good one since you are no longer terminating the output impedance of the transmitter correctly. The value you get could be higher or lower depending on a lot of factors, but it pretty much can never be correct.

The TI CC3000 returns its RSSI measurement in dBm, that is, dB relative to 1 mW, so 0 dBm would represent 1 mW of RF power. That would be a lot of power for a receiver input to see, and I am sure the TI part’s RF input is not designed for 0 dBm inputs.
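The dBm scale is just a log conversion relative to 1 mW. A quick sketch in Python (illustrative only, not Spark firmware; the function names are mine):

```python
import math

def mw_to_dbm(p_mw):
    """Convert power in milliwatts to dBm (dB relative to 1 mW)."""
    return 10 * math.log10(p_mw)

def dbm_to_mw(p_dbm):
    """Convert dBm back to milliwatts."""
    return 10 ** (p_dbm / 10)

print(mw_to_dbm(1.0))    # 0.0  (1 mW is the 0 dBm reference)
print(dbm_to_mw(-44))    # ~3.98e-05 mW, i.e. about 40 nW
```

So the -44 dBm the Core reports corresponds to roughly 40 nanowatts at the receiver input.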

Checking the TI part datasheet, on page 6 it lists the maximum input signal level as -10 dBm in 802.11b mode and -20 dBm in 802.11g mode. It also shows the receiver sensitivity in dBm for the various modulation schemes ranging from -75 to -97.5 dBm which represents the minimum signal value needed on the input to properly demodulate the incoming signal.

On the same page you can see the typical transmit power levels ranging from +18.3 to +14 dBm so if you were to reattempt your setup with the router and you did properly terminate the output impedance, you would surely be swamping the RF input on the TI part by several orders of magnitude. Usually in this kind of testing you use either a fixed or variable attenuator to reduce the signal level.

A quick look at the free space path loss for 2.4 GHz shows that you are looking at around 66 dB of path loss at 20 meters (about 60 dB at 10 meters), so it is not unreasonable for the receiver to expect signals in the -40 dBm range.
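The free-space path loss above follows directly from the standard formula FSPL = 20·log10(4·pi·d·f/c). A quick Python check (illustrative only):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

print(round(fspl_db(20, 2.4e9), 1))  # 66.1 dB at 20 m
print(round(fspl_db(10, 2.4e9), 1))  # 60.1 dB at 10 m
```

With +18.3 dBm transmit power and ~60-66 dB of path loss, the receiver indeed sees something in the -40s of dBm.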

I think if you had a good test rig with an RF signal generator and proper power levels, you might be able to get the core to show -10 dBm or -20 dBm RSSI levels that are the specified maximum allowable input, but even then I would not bet on it. I think the RSSI measurement is designed for over-the-air attenuated signals and while it is not spec’ed that I could find, they might have designed it for the normally low levels seen at such inputs.

I am also certain that the absolute accuracy of the RSSI value returned is not guaranteed, since I know that measuring received power in chips is hard. The ratiometric differences between return values are likely to be fairly accurate, so you can say that a value of -50 dBm is about 10 dB stronger than -60 dBm.

I think that if you really want to know the limitations of the core’s RSSI values, you will have to borrow some RF test equipment and test it since no one else is counting on using this data in a precise way.


hi @bko,

thanks for your elaborate answer. I have a solid (MSc) background in physics, but no specialty in RF electronics.

thanks for finding (or already knowing) the reference power for the dB scale. My (somewhat naive) assumption was that there would be a standard power for all routers and that this would be the 0 dB level…

What sparked my interest was that I could not, under any circumstance, get the signal better than -44 dB. I made both a transmitting and a receiving cantenna, pointed straight at each other with 2 meters between them: -44 dB. Reducing the distance by half resulted in -44 dB, where I would expect something like -41 dB… Connecting directly and basically swamping the receiver: -44 dB.

Since I needed the Spark to react to subtle changes in RSSI, I figured (before your reply) that I needed to be in a lower power range. I now have a small (symmetric, high-frequency OK) voltage divider that reduces my signal by about 20 dB (we have some excellent electronics support at Delft University, where I work).
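For anyone following along: a 20 dB pad corresponds to a factor of 100 in power but only a factor of 10 in voltage, because power goes as voltage squared. A quick Python check (illustrative arithmetic only, not a divider design):

```python
# dB-to-ratio conversions for a 20 dB attenuator
atten_db = 20

power_ratio = 10 ** (atten_db / 10)    # 100.0: power is reduced 100x
voltage_ratio = 10 ** (atten_db / 20)  # 10.0: voltage is reduced 10x

print(power_ratio, voltage_ratio)  # 100.0 10.0
```

This is why a resistive divider that drops the voltage 10:1 gives a 20 dB attenuation, not 10 dB.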

I will post any results from my setup in “project share” as soon as available.


Hi @RolfHut

It sounds like you have already discovered part of your answer: The RSSI measurement is saturated at -44 dBm. Given that is 31 dB better than the minimum signal for the most challenging demodulation mode, I can see where the designers decided that was “good enough”. This is not spec’ed to the best of my knowledge but seems completely reasonable for the stated purpose of the RSSI indication.

I just wanted to correct one other small thing you said above. As you vary the distance by factors of 2, the power changes by 6 dB, not 3 dB, because of the 1/r^2 factor. The received power in dBm is the transmitted power (dBm) plus the two antenna gains/losses (dBi) plus 20·log10(lambda/(4·pi·r)). Here the factor of 20 in front of the log10() comes from the square that used to be around the free-space path loss term inside the log10().
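This is the Friis transmission equation; a quick Python sketch showing the 6 dB per halving (illustrative only, 0 dBi antennas assumed):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def friis_rx_dbm(tx_dbm, gain_tx_dbi, gain_rx_dbi, distance_m, freq_hz):
    """Received power in free space: P_rx = P_tx + G_tx + G_rx + 20*log10(lambda/(4*pi*r))."""
    lam = C / freq_hz
    return tx_dbm + gain_tx_dbi + gain_rx_dbi + 20 * math.log10(lam / (4 * math.pi * distance_m))

p_2m = friis_rx_dbm(14.0, 0, 0, 2.0, 2.4e9)
p_1m = friis_rx_dbm(14.0, 0, 0, 1.0, 2.4e9)
print(round(p_1m - p_2m, 2))  # 6.02 dB gained by halving the distance
```

With 3 dB per halving you would be assuming power falls off as 1/r, not 1/r^2.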

You might also be in the transition region for your antennas at 2 m. Even though the wavelength is 1/8 m and you are in theory 16 wavelengths away, your waveguide antennas may look electrically longer, increasing the region between near-field and far-field behavior. My experience is that waveguides are funny and it is hard to say. At 10 m you could be sure you are in the far field.

It seems like you want to measure distance using the Core’s RSSI. This is tricky since the transmitter and receiver negotiate a lower data rate, higher-power connection and modulation scheme when there are transmission errors. For instance, the TI part emits +14.0 dBm in 54 Mb/s mode but +18.3 dBm in 1 Mb/s mode. I am not sure if the Core is reporting the data-packet RSSI, which would vary depending on the throughput selected, or the beacon-packet RSSI, which I think is always transmitted at high power.

Let’s say your Core is listening to an AP for RSSI, but is not associated with that AP. Now a new host associates with the AP at the limit of its range, and the AP and the host negotiate down to 1 Mb/s at higher power. Your listening Core sees the 4.3 dB power change and decides that it represents roughly a 40% reduction in distance (since received power falls as 1/r^2) instead of a change in transmit power. I don’t know how you can work around this if the Core is measuring power during the data packets and not the beacon packets.
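The apparent distance error follows from inverting the 1/r^2 law: a step of delta dB in received power looks like the range changed by a factor of 10^(-delta/20). A quick Python check using the datasheet power step (illustrative only):

```python
# TX power step between 54 Mb/s and 1 Mb/s modes (from the CC3000 datasheet)
delta_db = 18.3 - 14.0  # 4.3 dB

# Under a 1/r^2 model, the same RSSI increase could be read as the
# transmitter moving closer by this factor:
apparent_distance_ratio = 10 ** (-delta_db / 20)

print(round(apparent_distance_ratio, 2))  # 0.61, i.e. ~39% "closer"
```

So a rate renegotiation alone can masquerade as the transmitter moving almost 40% closer, with no actual motion at all.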

Sorry to be raining on your parade but I wanted to mention some of the problems that you may have so you can look out for them.