I’ve just added some more to this comment
Given that the original value of 0.0048 is a “shorthand” for 5.0 / 1023.0 (which is more like 0.00488759), that the 20.5128 looks “arbitrary”, and the fact that your actual voltage might not be exactly 3.3 V, precision should not be the prime concern.
For my part I often swing between two extremes: very verbose maths in my programs, to remember why and how a value needs to be calculated a certain way, or making my code as short, fast and direct as possible.
So it’s mainly up to you which of these extremes, or anywhere in between, you want your code to be. But as you’ve seen with the original: too short doesn’t always help.
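To make the contrast concrete, here is a minimal sketch of the two styles, using the numbers from this thread (the function names are hypothetical, and I’m assuming a 10-bit ADC with a 5 V reference):

```cpp
// Verbose version: every number documents where it comes from.
double rawToCelsiusVerbose(int raw) {
    const double referenceVoltage = 5.0;    // assumed ADC reference (V)
    const double adcSteps = 1023.0;         // full scale of a 10-bit ADC
    const double voltsPerStep = referenceVoltage / adcSteps;  // ~0.00488759 V
    const double degreesPerVolt = 20.5128;  // sensor gain used in the original code
    return raw * voltsPerStep * degreesPerVolt;
}

// Terse version: the same maths collapsed into one precomputed "magic" constant.
// Faster to type, but six months later nobody remembers what 0.1002587 means.
double rawToCelsiusTerse(int raw) {
    return raw * 0.1002587;  // 5.0 / 1023.0 * 20.5128
}
```

Both give the same result; the difference is only in how much of the reasoning survives in the code.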
But when you say “wild swings”: what kind of magnitude are we talking about here?
There are factors that do influence your measurement:
- time between measurements (faster often means less precise)
- self-heating of your sensor (should be no issue with power on/off and little current flowing)
- human perception (digital numbers seem to change a lot, while the value actually only changes in a minor decimal place)
To counteract these things, slow down your repetitions, maybe incorporate some averaging (e.g. a moving average), and draw a graph rather than staring at digital numbers.
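A moving average can be as simple as a small ring buffer over the last N readings. This is just a sketch of the idea, not code from your project (class and names are made up):

```cpp
#include <cstddef>

// Keeps the last N samples and returns their mean after each new sample.
template <std::size_t N>
class MovingAverage {
    double buf[N] = {};
    std::size_t count = 0;  // samples seen so far, capped at N
    std::size_t head = 0;   // next slot to overwrite
    double sum = 0.0;       // running sum of the buffered samples
public:
    double add(double sample) {
        if (count == N) sum -= buf[head];  // buffer full: drop the oldest sample
        else ++count;
        buf[head] = sample;
        sum += sample;
        head = (head + 1) % N;
        return sum / count;
    }
};
```

Feed it one ADC reading per loop iteration; a larger N gives a smoother display but reacts more slowly to real temperature changes.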
In this forum there are some threads that deal with ADC precision; have a look there for more hints.
I’ve just had a look at the datasheet of the sensor, and given this:
• Wide temperature range: -20 °C to 133 °C
• Accuracy: ± 1°C
• Fast reading time: <1 ms
• Ultra low operating current: 6 μA
• Wide operating voltage range: 3.1V to 5.5V
• Maximum Vout = 3V
you shouldn’t need to worry too much about precision.
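Just to put a number on that, here is the resolution maths, assuming the 10-bit / 5 V setup and the 20.5128 gain from the original code (the helper name is made up):

```cpp
// How many °C one ADC count is worth, using the numbers from this thread.
double degreesPerAdcStep() {
    const double voltsPerStep = 5.0 / 1023.0;  // ~0.00489 V per count (10-bit, 5 V ref)
    const double sensorGain = 20.5128;         // °C per volt, from the original code
    return voltsPerStep * sensorGain;          // ~0.1 °C per count
}
```

One ADC count is only about 0.1 °C, roughly ten times finer than the sensor’s ±1 °C accuracy, so the ADC is not the limiting factor here.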