It sounds like there are some miscellaneous bugs in DeviceOS's handling of Serial1. I am experiencing some intermittent problems and am still sorting them out.
I’m asking as a general question…
Suppose a packet is arriving on Serial1 and it is, say, 10 or 30 bytes long. I poll Serial1.available(), which returns a count when only 5 bytes have arrived so far, and I start reading those into a local array by looping Serial1.read() while the remaining bytes are still arriving. Does this cause a problem?
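For reference, the pattern I'm describing is roughly this (a simplified sketch; PACKET_MAX, the baud rate, and the processing step are placeholders, not my actual code):

```cpp
#include "Particle.h"

// Placeholder size; my real packets vary but stay under 64 bytes.
const size_t PACKET_MAX = 64;
uint8_t packet[PACKET_MAX];

void setup() {
    Serial1.begin(9600);   // example baud rate
}

void loop() {
    size_t count = 0;
    // Start reading as soon as anything is available, even though the
    // rest of the packet may still be arriving on the wire.
    while (Serial1.available() > 0 && count < PACKET_MAX) {
        packet[count++] = (uint8_t)Serial1.read();
    }
    if (count > 0) {
        // ... process packet[0..count-1] ...
    }
}
```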
Generally I would expect this to be safe, but if there is any chance the packet could be corrupted, I'll need another strategy that makes sure the entire packet has arrived before reading any of it. The packets will always be shorter than the 64-byte buffer maximum.
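If it isn't safe, the fallback I have in mind would look something like this (EXPECTED_LEN is purely hypothetical; in practice I'd take the length from the protocol itself):

```cpp
#include "Particle.h"

// Hypothetical fixed packet length, just to illustrate the idea of
// waiting for the whole packet before touching the buffer.
const int EXPECTED_LEN = 30;
uint8_t packet[EXPECTED_LEN];

void setup() {
    Serial1.begin(9600);   // example baud rate
}

void loop() {
    // Don't read anything until the entire packet is buffered.
    if (Serial1.available() >= EXPECTED_LEN) {
        for (int i = 0; i < EXPECTED_LEN; i++) {
            packet[i] = (uint8_t)Serial1.read();
        }
        // ... process the complete packet ...
    }
}
```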
I will not be sending anything out on the Serial1 transmit side during this time, in case that plays into it. (I think I've seen notes suggesting the TX and RX buffers can corrupt each other.)
A second side question: if loop() is blocked by a delay(), will received Serial1 bytes still arrive in the internal buffer in the background? I tested this briefly and it looks like they did, but the function was still somewhat buggy for other reasons, so I wasn't sure whether this could be contributing.
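Roughly how I tested it (simplified; the 2-second delay and baud rate are arbitrary):

```cpp
#include "Particle.h"

SerialLogHandler logHandler;

void setup() {
    Serial1.begin(9600);   // example baud rate
}

void loop() {
    delay(2000);   // loop() blocked here while the other device transmits

    // Check what accumulated in the RX buffer during the delay.
    int buffered = Serial1.available();
    Log.info("bytes buffered during delay: %d", buffered);

    // Drain the buffer for the next pass.
    while (Serial1.available() > 0) {
        Serial1.read();
    }
}
```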
Thanks.