I’m currently using analogRead to read the value of the A3 pin, but unfortunately the values are coming in ~20 counts lower than I would expect. With a multimeter, I read 2.023V on the input, which I would expect translates to 2.023/3.317*4095 = 2497, but I am seeing ~2470.
However, when I set a longer ADC sample time and then sample the ADC several times, I eventually get the correct reading. I performed this test with the following code:
void setup() {
    Serial.begin(9600);
}

void loop() {
    setADCSampleTime(ADC_SampleTime_3Cycles);
    Serial.print("ADC Reading with 3 Cycles: ");
    Serial.println(analogRead(A3));
    Serial.print("ADC Reading with 3 Cycles: ");
    Serial.println(analogRead(A3));
    Serial.print("ADC Reading with 3 Cycles: ");
    Serial.println(analogRead(A3));
    delay(5000);

    setADCSampleTime(ADC_SampleTime_480Cycles);
    Serial.print("ADC Reading with 480 Cycles: ");
    Serial.println(analogRead(A3));
    Serial.print("ADC Reading with 480 Cycles: ");
    Serial.println(analogRead(A3));
    Serial.print("ADC Reading with 480 Cycles: ");
    Serial.println(analogRead(A3));
    delay(5000);
}
And got the following result
However, if I comment out the lines for setADCSampleTime, I get no improvement with successive samples.
Can anybody explain what’s going on here, or recommend a way to get consistent ADC readings? I’ve read several other similar threads, like Interesting results from various ADC sample times, so I suspect this is either an issue with ADC sampling and input impedance or a bug in the Particle ADC HAL. What I don’t really get is that the sensor I’m reading is a voltage divider buffered by an op amp wired directly to the input, so it should have pretty low output impedance. I did also scope the input and am seeing 30mV of noise, but I think my test proves there’s more going on here than a noisy input.
Here are a couple of additional questions I can’t find the answers to:
What is the default ADC sample time?
Am I correct in saying that increasing the sample time (e.g., to 480 cycles) allows a higher output impedance from my sensor to the ADC?
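On that second question, here is a back-of-the-envelope sketch of the usual RC-settling model: the source has to charge the ADC's internal sample capacitor through its own resistance plus the ADC's switch resistance within the sample window. The internal capacitance, switch resistance, and ADC clock below are my assumptions for the Electron's STM32F205, not values from this thread, so double-check them against the datasheet before relying on the numbers.

// Host-side back-of-the-envelope calc (plain C++, not a device sketch).
// A negative result means the sample window is too short for any
// appreciable source impedance under this model.
#include <cstdio>
#include <cmath>

int main() {
    const double f_adc  = 30e6;   // assumed ADC clock (Hz)
    const double c_adc  = 4e-12;  // assumed internal sampling capacitor (F)
    const double r_adc  = 6e3;    // assumed internal switch resistance (ohm)
    const int    n_bits = 12;

    // Settling factor so the cap charges to well under 1 LSB of the target.
    const double k = c_adc * std::log(std::pow(2.0, n_bits + 2));

    const int cycles[] = {3, 15, 84, 480};
    for (int c : cycles) {
        double t_sample  = c / f_adc;            // sample window (s)
        double r_src_max = t_sample / k - r_adc; // max external source resistance
        std::printf("%3d cycles: t_sample = %5.2f us, R_src_max ~ %7.1f kohm\n",
                    c, t_sample * 1e6, r_src_max / 1e3);
    }
    return 0;
}

With those assumed values, 3 cycles only settles properly from a near-zero source impedance, while 480 cycles tolerates source impedances in the hundreds of kilohms, which is the intuition behind the question.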
This 30mV of noise is somewhere around 37 steps in the ADC depending on the reference voltage, so that could completely explain your problems.
The ADC works by essentially comparing the charge on an internal capacitor to the voltage applied to the input and then tweaking the internal capacitor voltage up or down to make the internal cap closer to the external input voltage. Each tweak (up or down) is used to determine one bit of the result.
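Purely as an illustration of that description, here is a software model of the successive-approximation loop; the real hardware does this with a capacitor DAC and a comparator, and the function below is my own sketch, not anything from the firmware.

// Software model of a 12-bit SAR conversion: try each bit from MSB down,
// keep the "tweak up" only if the trial voltage stays at or below Vin.
uint16_t sarConvert(float vin, float vref) {
    uint16_t result = 0;
    float trial = 0.0f;
    for (int bit = 11; bit >= 0; bit--) {
        float step = vref * (1 << bit) / 4096.0f;  // weight of this bit
        if (trial + step <= vin) {                 // tweak sticks: bit = 1
            trial += step;
            result |= (1 << bit);
        }                                          // otherwise bit = 0
    }
    return result;  // one comparison per output bit
}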
If the input is noisy, the longer the conversion takes, the more noise is averaged into the reading. If the noise is zero mean, like a 50/60Hz sine wave or thermal noise, then this averaging tends to remove noise and give a better reading.
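The same averaging idea can also be applied in firmware by taking several conversions and averaging them; a minimal sketch (my own, the sample count is arbitrary):

// Oversample-and-average to knock down zero-mean noise.
int analogReadAveraged(int pin) {
    long sum = 0;
    for (int i = 0; i < 16; i++) {   // 16 samples keeps the sum well inside a long
        sum += analogRead(pin);
    }
    return (int)(sum / 16);
}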
It is also possible for noise on the 3.3V supply to the ADC to show up in the results.
Can you retry your experiment with a low noise voltage source, like a 1.2V coin cell?
How about shorted to GND as a low noise option? I tried that, and got these results
That 205 counts on the first sample could not possibly be electrical noise. I really think noise is a red herring here: if it were noise, I would get randomly high and low readings, not consistently low ones. Successive ADC reads should not always trend higher if that were the case.
I’d say that the consistently different first reading represents a higher state of capacitance on the pin during the first read.
Try reading three times at a known interval (fill an array with a delay between reads), but dump the first reading:
setADCSampleTime(ADC_SampleTime_3Cycles);
int analogBuffer[3] = {0};
analogRead(A3);                        // throw away the first reading
for (int i = 0; i < 3; i++)
{
    delay(10);
    analogBuffer[i] = analogRead(A3);
}
// print the array
for (int i = 0; i < 3; i++)
{
    Serial.println(analogBuffer[i]);
}
I would try it with the cellular radio turned off and only battery power. It could be a software problem, but then I would think we would see it on the Photon too.
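For reference, the radio-off test could look something like this; just a rough sketch assuming the standard Particle SYSTEM_MODE() and Cellular API, adjust to taste:

// Run the ADC test with the modem powered down.
SYSTEM_MODE(MANUAL);            // don't auto-connect to the cloud

void setup() {
    Cellular.off();             // power the modem down for the test
    Serial.begin(9600);
}

void loop() {
    Serial.println(analogRead(A3));
    delay(1000);
}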
Now that's weird. With cellular off and battery power + USB, here's what I get (A2 shorted to GND):
I also replicated it with another Electron. This seems only to be a problem when setADCSampleTime() is used; it doesn't occur when analogRead() is called on its own.
Well now I feel a little silly: when I was finding my ADC readings too low, I was using my board's 3.3V rail as the reference in the math, when I should have been using the Electron's. With the correct 3.3V (3.34V) instead of the 3.317V from the first post, everything checks out.
Still, though, there is something buggy about setADCSampleTime(). It's not only the first call to analogRead() after setADCSampleTime() that has trouble; if you put in a 1 s delay between calls, the readings also don't come out correctly.
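To be clear about what I mean, a sketch along these lines still shows the problem for me (illustrative, not my exact test code):

void setup() {
    Serial.begin(9600);
}

void loop() {
    setADCSampleTime(ADC_SampleTime_3Cycles);
    for (int i = 0; i < 3; i++) {
        Serial.println(analogRead(A3));
        delay(1000);            // even with 1 s between reads, the early readings are off
    }
    delay(5000);
}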
Speaking of which, any idea how consistent I can expect the Electron's 3V3 reference to be? We're deploying a lot of these, so I don't want to have to measure all of the 3.3V references before sending them to the field. I looked at the regulator datasheet but didn't see any specs for output accuracy. I measured the two Electrons I have, and they both read 3.34V, but I'm just wondering if you all have seen similar.