Neopixel library causes a delay in millis()

Hello,

I am writing a program that measures time while iterating over some NeoPixel LEDs attached to a Particle Photon, reading millis() to measure that time. After some testing and debugging, I realized that the problem is in the strip.show() function, as described in other threads here such as millis-and-micros-lose-time-when-interrupts-are-disabled.

This is because strip.show() disables interrupts while it executes, and that affects millis(), since millis() relies on an interrupt to count elapsed time. This causes a delay that makes it very hard for me to measure time.

Which approaches could I take to solve this problem? Can I still use millis(), or do I need another library to work easily with NeoPixel LEDs?

I’ve read the documentation and the thread I linked above, but I think I’m too much of a rookie to understand how to solve it.

So any advice will be appreciated!

Thanks!

You can use System.ticks(), which is not affected by interrupts being disabled, because the ticks are counted by the controller in hardware, not software.

While the docs state that you can use this for timing up to one second, the 32-bit tick counter only rolls over after about 35.79 seconds on the Photon. If you need to measure longer periods, you can still use System.ticks() for the precision and millis() for the rough timing.

BTW, how often are you calling strip.show()?
For smooth animations you normally don’t need to call it more often than every 40 ms. In any case, you don’t want to call it each time you set a single pixel while you still intend to modify others. You’d normally update the entire strip data and then call strip.show() once you are done updating all of them.

Showing your code often helps to get targeted advice.


I previously wrote some code to convert the time from millis() into readable time, like a stopwatch, showing it in the format mm:ss.ms; the readings should reach 5 or 10 minutes at most.

While the time is running, I am iterating over the NeoPixel LEDs, going from LED 0 to LED 250. I’ve run it several times and noticed that the LEDs are advancing more slowly than they should. After debugging, I found that strip.show() is causing that problem.

As a user, I need to select the time that I want to achieve, and I calculate the time to wait before calling strip.show() with: float delayvalmilis = (user_min * 60) + user_sec;. Then I compare two times (the current time and the last time) to delay the update task:

/*
    for example: 
    user_min = 1
    user_sec = 30
    delayvalmilis = (1 * 60) + 30
    
    // delay to update LEDs
    delayvalmilis = 90
*/
led_index = 0;
Ptime = millis();

// iterate over the total number of LEDs
for (led_index = LEDS_0 ; led_index < NUM_LEDS;) { 
    // read current time 
    Ntime = millis();
       
    // update the LED light 
    if (Update_Lights) {
           
        // update lights commands
        if (led_index + 1 < NUM_LEDS)  // guard against writing past the last pixel
            strip.setPixelColor(led_index + 1, PIXEL_TRAINING);  // turn on next light
        strip.setPixelColor(led_index, PIXEL_TRAINING);          // turn on current light
        if (led_index != LEDS_0)
            strip.setPixelColor(led_index - 1, PIXEL_CLEAR);     // turn off previous light

        // reset lights
        Update_Lights = false;
    }
        
    // compare time to update lights 
    if (Ntime - Ptime >= (delayvalmilis)) { 
        // update last time 
        Ptime = Ntime;

        //show lights
        strip.show();
        
        led_index++;
        Update_Lights = true;
    }
}

I’ve tried to convert System.ticks() into readable time like this:

uint32_t now_micros = (double)System.ticks() / System.ticksPerMicrosecond();
unsigned long allSeconds = now_micros / 1000000;
int secsRemaining = allSeconds % 3600;
int runMinutes = secsRemaining / 60;
int runSeconds = secsRemaining % 60;
int runMillis = (now_micros / 100000) % 10;

However, I could not find a mathematical way to convert that into readable minutes, seconds and milliseconds, because the result is very inaccurate, not even on the first readings. Do you know a way to do it? Thanks!

That’s probably due to the fact stated above.

Anything over 35.79 seconds and the ticks counter will roll over, so you need to keep track of the number of rollovers that happened in between and correct for that.

A uint32_t can count up to 4,294,967,295. At one tick per clock cycle (120 MHz), that means you can count ~35.79 seconds; to count beyond that, you’d just use a lower-precision counter to track the roll-overs.
