This may already have been discussed (in which case, moderators, feel free to close it!), but I have been getting some interesting and repeatable results with the A/D converter on my Particle Core. Basically, I'm a little confused by what happens when I change the sample time with setADCSampleTime().
From my reading on the topic of ADCs, the longer the sample time, the higher the source impedance the ADC can tolerate, because the internal sample-and-hold capacitor gets more time to charge through it.
My test rig:
- A4 shorted to 3.3V (the regulator output)
- A5 connected to 3.3V through a 22K resistor, with 0.1µF to GND
- ADC_SampleTime_1Cycles5 -> A4: 4095 (no noticeable change from the default)
- ADC_SampleTime_7Cycles5 (default) -> A4: 4095; A5: 3870
- ADC_SampleTime_28Cycles5 -> A4: 4094; A5: 3978 (better, more what I expected)
- ADC_SampleTime_41Cycles5 -> A4: 4094; A5: 3978 (same)
- ADC_SampleTime_55Cycles5 -> A4: 3889; A5: 3815 (WORSE!)
- ADC_SampleTime_71Cycles5 -> A4: 3890 (didn't check A5)
- ADC_SampleTime_239Cycles5 -> A4: 3870!! (ditto)
If I understand the ADC concept correctly, the tolerable source impedance should go up with a longer sample time, but here it appears to have fallen below zero, since you really can't get less impedance than a 2" length of wire.
Interestingly, if I remove the 0.1µF capacitor, the A5 readings get closer to the A4 readings. That could be explained by the Particle firmware reading the pin repeatedly and averaging for a smoother value (if I understood some other threads correctly); each read would pull a little charge out of the capacitor.
I’m aware that some other users have really dug deep into the chip, writing system registers, etc. I’m not looking to do any of that (good luck porting your code to the Photon!), but found this interesting. Any comments? Or am I a little close to upside-down on this topic?