Asynchronous send

I have a project where I need to count incoming pulses, ranging from several per second to one per minute.
I figure I will use an interrupt that fires whenever a pulse comes in.
Every 5 minutes, it would then upload the pulse count to a server.
Is there a way to have the Spark upload the data while still continuing to count, so that no pulses are missed during the upload?

1 Like

Interrupts should override any loops that are running, but I don’t know whether they have higher priority than the Spark functions.

Edit: I could not find any additional information in the docs. Perhaps @Dave can help (or somebody else, but Dave comes to mind :wink: )


@Awake, user interrupts are not higher priority than “core” interrupts, but if you keep your ISR code short you won’t have any problems :smile:


Yep, interrupts should work fine! :slight_smile: Just declare your counter volatile and increment the counter from the interrupt. Then in the main loop, send the counter value every few minutes.
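A minimal sketch of that pattern for the Spark Core, in Wiring/Arduino style. The input pin `D2`, the pull-up configuration, and the event name `"pulseCount"` are assumptions for illustration, not from the original posts:

```cpp
// Pulse counter shared between the ISR and loop().
// volatile tells the compiler it may change at any time.
volatile unsigned long pulseCount = 0;

const unsigned long REPORT_INTERVAL_MS = 5UL * 60UL * 1000UL; // 5 minutes
unsigned long lastReport = 0;

void onPulse()
{
    // Keep the ISR as short as possible: just count.
    pulseCount++;
}

void setup()
{
    pinMode(D2, INPUT_PULLUP);             // assumed wiring: pulse pulls D2 low
    attachInterrupt(D2, onPulse, FALLING);
}

void loop()
{
    if (millis() - lastReport >= REPORT_INTERVAL_MS) {
        lastReport = millis();

        // Copy and reset the counter with interrupts briefly disabled,
        // so a pulse arriving mid-read is neither lost nor double-counted.
        noInterrupts();
        unsigned long count = pulseCount;
        pulseCount = 0;
        interrupts();

        // Interrupts keep firing while this (possibly slow) upload
        // runs, so counting continues during the send.
        char buf[12];
        snprintf(buf, sizeof(buf), "%lu", count);
        Spark.publish("pulseCount", buf);
    }
}
```

The brief `noInterrupts()`/`interrupts()` window only protects the copy-and-reset; the upload itself runs with interrupts enabled, which is why no pulses are missed.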

1 Like