NDIR Instrument Development

Hi all. I am new to this forum and looking for some help on a project I have been working on. I am using a RedBear Duo processor board and communicating over SPI with an ADC (LTC1859). I need to synchronize the ADC reads with an infrared lamp modulation routine. I am using an existing library, SparkIntervalTimer (Copyright © 2014 Paul Kourany, based on work by Daniel Gilbert), with a timer that fires every 100 us. In this timer routine I increment a counter; based on this counter I turn an infrared light source on and off, and I also take readings from the ADC. All seems to be working as planned, however, there is some inconsistency in the timing. If I graph the data overnight, it appears to ride on a large sine wave, meaning that over time this routine picks up readings at a slightly different point. It is my understanding that interrupt routines should be as simple as possible, but I can't seem to simplify mine any further. The ADC reads are just bit-banging pins, so the timing involved there should be governed by the timing of the read sequence within the IC. I am looking for suggestions as to the best way to synchronize the ADC reads with the lamp modulation and still be able to run calculations on the data without interfering with the timing.

Thanks,
Andrew

If you could show us some code, we’d know what we’re working with :wink: Then, the master himself, @peekay123, might be able to shed some light on this.

@rickkas7 is also a never-ending source of insight, especially when it comes to using HW timers.
Until you get a response from either of them, you could have a look at this repo where Rick shows how to use the onboard ADCs driven by a HW timer.

Ok, thanks for the reply. The ADC code is below. I need to read from a 4 channel detector so I basically read one channel after the next as fast as possible.

    void adcDriver() {

      for (int i = 0; i < 4; i++) {
        // Send the channel/range command word; the two bytes clocked back hold
        // the 16-bit result from the ADC
        x = SPI.transfer(commands[i]);
        y = SPI.transfer(B00000000);

        System.ticksDelay(2 * System.ticksPerMicrosecond());

        // Pulse CONVST to start the next conversion
        pinSetFast(CONVST);
        pinResetFast(CONVST);

        System.ticksDelay(4 * System.ticksPerMicrosecond());

        data = word(x, y);
        word adc_code = data;
        int voltage = 0;
        int sign = 1;
        int _NUMBER_BITS = 16;

        if ((adc_code & 0x8000) == 0x8000) {  // ADC code is < 0
          adc_code = (adc_code ^ 0xFFFF) + (1 << (16 - _NUMBER_BITS));  // convert ADC code from two's complement to binary
          sign = -1;
        }
        adc_code = adc_code >> (16 - _NUMBER_BITS);  // shift out zero bits (2 for 14-bit, 4 for 12-bit)
        voltage = sign * (int)adc_code;

        // Accumulate positive and negative readings separately per channel so an
        // average can be taken later
        if (i == 0) {
          channel3Loop = voltage;
          if (channel3Loop >= 0) {
            channel3HighTest += channel3Loop;
            channel3CountHigh += 1;
          } else {
            channel3LowTest += channel3Loop;
            channel3CountLow += 1;
          }
        } else if (i == 1) {
          channelRefLoop = voltage;
          if (channelRefLoop >= 0) {
            channelRefHighTest += channelRefLoop;
            channelRefCountHigh += 1;
          } else {
            channelRefLowTest += channelRefLoop;
            channelRefCountLow += 1;
          }
        } else if (i == 2) {
          channel1Loop = voltage;
          if (channel1Loop >= 0) {
            channel1HighTest += channel1Loop;
            channel1CountHigh += 1;
          } else {
            channel1LowTest += channel1Loop;
            channel1CountLow += 1;
          }
        } else if (i == 3) {
          channel2Loop = voltage;
          if (channel2Loop >= 0) {
            channel2HighTest += channel2Loop;
            channel2CountHigh += 1;
          } else {
            channel2LowTest += channel2Loop;
            channel2CountLow += 1;
          }
        }
      }
    }

My lamp timer is initialised like this:

    lampTimer.begin(lampDriver2, 50, uSec, TIMER7); // Used for testing with the signal generator

And my Lamp Driver routine looks like this…

    void lampDriver2() {

      lampCounter += 1;

      if (lampCounter == 1) {
        pinSetFast(lampPin);                   // lamp on at the start of the cycle
      } else if ((lampCounter > 450) && (lampCounter <= 650)) {
        adcDriver();                           // sample window while the lamp is on
      } else if (lampCounter == 1650) {
        pinResetFast(lampPin);                 // lamp off at mid-cycle
      } else if ((lampCounter > 2150) && (lampCounter <= 2350)) {
        adcDriver();                           // sample window while the lamp is off
      } else if (lampCounter == 2351) {
        readNow = true;                        // tell loop() a full set of readings is ready
      } else if (lampCounter == 3300) {
        lampCounter = 0;                       // end of one modulation cycle
      }
    }

I receive a sine wave out of the detector and I want to measure the highest 50-100 points as well as the lowest 50-100 points. Initially I tried placing these values into an array and running a bubble sort on them to make sure I always got the highest and lowest points, but this was very expensive because it involves multiple arrays. The way it is set up now, I graphed out the data and determined the best window to capture the points, which is why I only sample the ADC at specific lampCounter values. I now add the readings to a variable and keep track of how many points were added so I can get an average. The void loop() routine constantly checks the readNow variable and, when it is true, performs the calculations. The calculations are pretty involved, so I wanted to make sure they weren't in the interrupt routine. But even when a signal generator is used, I seem to get what looks like drift; basically I think I am picking up a different point of the sine wave each time through.

Thanks,
Andrew
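
A minimal sketch of how loop() might consume the readNow flag safely, assuming readNow and the accumulators are declared volatile (types assumed to be long here; the snapshot variables are hypothetical):

    // Sketch only: take a consistent snapshot of the ISR's accumulators before using them.
    void loop() {
      if (readNow) {
        noInterrupts();                        // keep the timer ISR from updating mid-copy
        long highSum   = channel3HighTest;
        long highCount = channel3CountHigh;
        long lowSum    = channel3LowTest;
        long lowCount  = channel3CountLow;
        channel3HighTest  = 0;                 // reset for the next modulation cycle
        channel3LowTest   = 0;
        channel3CountHigh = 0;
        channel3CountLow  = 0;
        readNow = false;
        interrupts();

        // The involved calculations then run here on the snapshot values
        // (highSum/highCount etc.), outside the interrupt routine
      }
    }

Masking interrupts for a few microseconds once per modulation cycle should not noticeably disturb the lamp timing, but it does guarantee the averages come from one coherent set of counts.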

@amertz03, due to the HAL implementation, there is an overhead of 5 to 10us on timer interrupts. From what you show, you are running the ISR at 50us intervals, not 100us. Have you measured the execution time of adcDriver() with an oscilloscope? Assuming an ADC conversion time of about 5us, plus the delays and the extra code, the ISR could easily push past the 50us window you currently have set.

As it stands, your timing is hard coded and assumes zero latency differences between interrupts. You also assume no other interrupts will pre-empt your ISR, which may not be the case. In the latest 0.8.0-rc1+ DeviceOS firmware a non-HAL attachInterrupt() function is available, which reduces latency considerably. However, I haven't yet adapted SparkIntervalTimer to use this feature.

Thanks for the response. I made a mistake in the code I posted: the majority of the testing has been done with that timer set to 100 us, but I also tried it at 50 us, thinking that I would get more reads in more quickly and therefore come closer to measuring the same point on the wave from each channel. If I understand your response correctly, you are saying that the overhead associated with the interrupts could be in the realm of 5 to 10 us due to the HAL implementation. I normally run the timer at 100 us to be sure that my additional code and processing never comes close to that period; I have confirmed this by measuring the time, and it generally lasts around 50 us, which confirms your assumption. Could you provide an example of a non-HAL attachInterrupt()? Also, is it safe to assume that if you measure the timing of all code run during the interrupt routine and confirm that it is less than the interrupt period, it is safe to run that way? I always read that it is important to minimize the code in the interrupt routine, but I assumed that if the timing of that code was well within the timing of the interrupt, it was safe. Is this assumption incorrect?

Thanks

@amertz03, for the non-HAL attachInterrupt(), I will be updating SparkIntervalTimer to do this automatically by detecting the DeviceOS version being compiled. Hopefully I can get to it this weekend.

It is always good to understand the worst-case timing of an ISR, which I usually do by setting a GPIO pin at the start of the ISR, resetting it at the end, and measuring the pulse with an oscilloscope. Regardless of timing, DeviceOS interrupts are set to a higher priority than the SparkIntervalTimer interrupts, by design, so it is possible that your ISR will be preempted by a system interrupt. Can you explain the nature and characteristics of the signal you are sampling?
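
A minimal sketch of that measurement, assuming a spare GPIO (scopePin is a hypothetical name) wired to the oscilloscope probe:

    // Sketch only: bracket the ISR with a pin toggle and measure the pulse on a scope.
    const int scopePin = D6;        // any free GPIO will do

    void setup() {
      pinMode(scopePin, OUTPUT);
      // ... existing timer, SPI and lamp pin setup ...
    }

    void lampDriver2() {
      pinSetFast(scopePin);         // ISR entry: pin goes high
      // ... existing lampCounter / adcDriver() code ...
      pinResetFast(scopePin);       // ISR exit: pin goes low
    }

The pulse width is the ISR execution time, and jitter in the spacing between pulses shows how much the interrupt latency varies from one tick to the next.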

@peekay123 Thank you for the prompt response. The signal we are measuring comes from a pyroelectric detector with a built-in amplifier, so the output of each of the 4 channels is ±5V. We modulate the light at a specific frequency (3.3 Hz in this example) and the detector essentially outputs a sine wave centered around zero. The detector output goes negative when the light source is on and positive when the light source is off. We then take the high readings minus the low readings and use this value for further calculations. The detector reaches maximum output voltage at some time delay after the lamp is turned on, which is why I wait until the counter reaches 450 before I start reading. Please let me know if you need any additional information.

@amertz03, you are sampling a 3.3Hz sine wave at 10kHz, so you are well beyond Nyquist requirements and way beyond oversampling IMO. Even 660Hz would represent 100x oversampling! To remove the timing sensitivity, would it not be better to write a simple peak or inflection-point detector to give you the high/low values?

Is it safe to assume that you have a filter on the receiver to filter out 60Hz and other noise?
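
A minimal sketch of the kind of peak detector being suggested here, with hypothetical names: instead of averaging a fixed counter window, track the running maximum and minimum of each modulation cycle as the samples arrive.

    // Sketch only: 'voltage' is the signed reading already computed in adcDriver();
    // cycleMax/cycleMin are hypothetical per-channel variables.
    int cycleMax = -32768;
    int cycleMin =  32767;

    void trackPeaks(int voltage) {
      if (voltage > cycleMax) cycleMax = voltage;   // highest point seen this cycle
      if (voltage < cycleMin) cycleMin = voltage;   // lowest point seen this cycle
    }

    // Where readNow is set (end of a modulation cycle), use cycleMax - cycleMin as the
    // peak-to-peak amplitude for that cycle, then reset the trackers:
    //   int amplitude = cycleMax - cycleMin;
    //   cycleMax = -32768;
    //   cycleMin =  32767;

Because every sample is inspected, it no longer matters which exact counter value the peak lands on, which is what removes the timing sensitivity.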


Yes, we have a high-pass filter at about 1 Hz to remove the DC offset from the detector as well as a low-pass filter at about 7 Hz to remove any high-frequency noise. I also agree we are way beyond oversampling. The main problem is that one of the 4 channels is a reference channel, so the other 3 channels' measurements are taken with respect to the reference channel, and when we pick up different points of the peak over time we get what looks like drift, which ultimately circles back around and gives us what looks like a large sine wave. One option we have considered is a sample-and-hold ADC, but we believe we have two problems. Problem 1 is that there is approximately a 25 us delay between sampling channel 1 and channel 4, which causes us to pick up different points in the peak; we confirmed this by using a signal generator at 3 Hz, and the measurements on all channels produced different results. The second problem is that over time we seem to sample a different point in the wave, so the average values of each channel also change at different rates.
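
For illustration, a hedged sketch of the reference-channel relationship described above, with hypothetical names and assuming the comparison to the reference is a simple ratio of the high-minus-low amplitudes:

    // Sketch only: each channel's demodulated amplitude is the average of its high
    // readings minus the average of its low readings; gas channels are then taken
    // with respect to the reference channel (assumed here to be a ratio).
    float amplitude(long highSum, long highCount, long lowSum, long lowCount) {
      return (float)highSum / highCount - (float)lowSum / lowCount;
    }

    void calculate() {
      float refAmp = amplitude(channelRefHighTest, channelRefCountHigh,
                               channelRefLowTest,  channelRefCountLow);
      float ch1Amp = amplitude(channel1HighTest, channel1CountHigh,
                               channel1LowTest,  channel1CountLow);
      float ch1Rel = ch1Amp / refAmp;   // gas channel taken with respect to the reference
      (void)ch1Rel;                     // further calculations would use this value
    }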

@peekay123, I think this issue hasn't been resolved yet
https://github.com/particle-iot/firmware/issues/1523

Do you think the non-HAL timer would bring more consistency to the measurements? Would it eliminate that 5-10 us latency?

@amertz03, you will never get perfect timing on your ADC channels due to the switching and reading overhead. However, at 3.3Hz the period is about 303ms, with each half-cycle taking roughly 150ms. It's hard for me to believe you are missing anything over a 25us interval. Perhaps your definition of "peak" needs to be reassessed.

This is most likely due to the fixed timing in your ISR, which is not anchored to the signal itself. You should be looking at points of inflection after you start or stop the emitter, since response times will always vary.
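
A minimal sketch of what anchoring to the signal could look like, with hypothetical names: rather than opening the sampling window at a fixed lampCounter value, wait for the detector output to cross zero after a lamp edge and count the window from that crossing.

    // Sketch only: hypothetical helper called once per ISR tick with the latest
    // reference-channel reading. lampJustSwitched would be set to true at the points
    // in lampDriver2() where the lamp pin is toggled.
    bool lampJustSwitched = false;
    bool windowOpen       = false;
    int  windowSamples    = 0;
    int  prevVoltage      = 0;

    void anchorWindow(int voltage) {
      // After a lamp edge, wait for the signal to cross zero, then open a
      // fixed-length window measured from that crossing rather than from the counter.
      if (lampJustSwitched && !windowOpen &&
          ((prevVoltage < 0 && voltage >= 0) || (prevVoltage > 0 && voltage <= 0))) {
        windowOpen       = true;
        windowSamples    = 0;
        lampJustSwitched = false;
      }

      if (windowOpen) {
        // Accumulate high/low sums and counts here exactly as adcDriver() does now;
        // only the start of the window has changed.
        if (++windowSamples >= 200) {   // 200 samples, matching the original window size
          windowOpen = false;
        }
      }
      prevVoltage = voltage;
    }

That way a slow shift in the lamp-to-detector delay, or jitter in when the ISR actually fires, moves the window with the signal instead of letting the window slide across the peak.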

@peekay123, by points of inflection, are you suggesting I write a routine that looks at the points and only accepts the top point, or perhaps the top 10 points? Also, I am not sure I understand why the timing is not anchored to the signal itself. By taking reads exactly 40 ms after I turn the emitter on, shouldn't I basically be reading the exact same point in the wave every time? Or are you saying that just because I have the timer set to 100 us intervals, it may not be firing at exactly 100 us intervals? Is there a better way to sync the emitter modulation and the ADC reads?

@peekay123, any luck with updating the SparkIntervalTimer library this weekend?

@amertz03, not yet :sweat:

@peekay123, do you mind providing a private email or another way to communicate with you off of this forum? We may have some interest in utilizing your services if you are interested.

Thanks

@amertz03, PM me with the details.

@peekay123, Sent.