I am trying to use a delay in a thread but have found that the delay() function and os_thread_delay_until() both do not seem to actually delay the thread. The thread just keeps running as if nothing happened. I created my own delay function that does actually delay the thread but also causes the main application thread to hang. Any ideas for a better solution?
This is my delay function:
void Thread_delay_func( system_tick_t ms )
{
    system_tick_t start = millis();
    while ( millis() - start < ms ) {
        os_thread_yield();  // busy-wait, yielding to other threads on each pass
    }
}
@RKreinin, take a look at @rickkas7's coverage of Particle threads for answers. He uses os_thread_delay_until(), so I'm not sure why it wouldn't work for you.
I should mention that os_thread_delay_until() is working fine in my thread's main loop for making the whole thread run at a certain rate, but if I try using it in the delay function like this:
it doesn't work. The start variable is updated with the correct value, but if I check millis() before and after the os_thread_delay_until() call, there won't be a difference of, say, 100 ms when I'm using a 100 ms delay; the difference will be 0 or 1 ms instead.
@RKreinin, it most likely doesn’t work due to the local vs global declaration of the start variable. Looking at @rickkas7’s examples, he declares the system_tick_t var globally.
@RKreinin, looking at the Particle code for os_thread_delay_until() and the underlying FreeRTOS call to vTaskDelayUntil(), the &start pointer parameter is updated automatically! So don't set start to any value prior to calling os_thread_delay_until().
Right, but then the value of start will be the time at which the last delay was called, so if I want to delay by 1000 ms, it will only delay until 1000 ms have passed since the last time the delay was called, rather than 1000 ms since the current delay was called. That's why I set start to millis(), so that it delays for ms milliseconds from the moment Thread_delay_func is called.
I just spun-up a thread and added the following to the thread func.
while (true)
{
    Log.info("delay in the thread");
    delay(1000);
}
See the following output snippet showing the expected 1000 ms delays:
0000015987 [app] INFO: delay in the thread
0000016987 [app] INFO: delay in the thread
0000017987 [app] INFO: delay in the thread
0000018987 [app] INFO: delay in the thread
0000019987 [app] INFO: delay in the thread
Perhaps make sure you are on a current version of Device OS, and provide more details on how you are instantiating your thread and how you are using the delay inline.
My thread is essentially used to read sensor values, which requires specific timing for writing to and reading from the sensor. I start the thread with Thread("Sensor Thread", sensorThreadMain),
with
In Read_Sensors() there are several calls to a delay function with delays ranging from 1ms to 1000ms. Oddly enough, if I use the delay() function, it will sometimes work, and sometimes just skip over it. Here is an example log:
02:49:06:0158 []: 100ms delay start
02:49:06:0258 []: 100ms delay end
02:49:06:0267 []: 1ms delay start
02:49:06:0267 []: 1ms delay end
02:49:06:0277 []: 10ms delay start
02:49:06:0277 []: 10ms delay end
02:49:06:0292 []: 1000ms delay start
02:49:06:0292 []: 1000ms delay end
02:49:06:0302 []: 1ms delay start
02:49:06:0302 []: 1ms delay end
02:49:06:0312 []: 10ms delay start
02:49:06:0312 []: 10ms delay end
02:49:06:0318 []: 100ms delay start
02:49:06:0418 []: 100ms delay end
It seems that only the 100ms delays are actually working? I’m not sure why this is happening.
So it looks like the issue is that the delays that weren't working were being called from inside a SINGLE_THREADED_BLOCK, which seems to prevent them from actually blocking. Any workarounds for this?