How to put spark core to sleep and wakeup on interrupt signal on a pin?

@Moors7, that is a superb idea and I think it is doable. Of course, @satishgn has the final say on that!

3 Likes

@Moors7, @peekay123, Great Idea! I will add one more overloaded method:
Spark.sleep(uint16_t wakeUpPin, uint16_t edgeTriggerMode, long seconds);
Thanks for the cool use-case

4 Likes

OK, we now have two methods:

Spark.sleep(uint16_t wakeUpPin, uint16_t edgeTriggerMode);
Spark.sleep(uint16_t wakeUpPin, uint16_t edgeTriggerMode, long seconds);
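For illustration, a minimal sketch exercising both overloads might look like this (a sketch only; the pin choice and 60-second timeout are arbitrary example values, and the pull-up assumes an active-low wake signal):

```cpp
#include "application.h"

const uint16_t wakeUpPin = D0;  // example wake-up pin

void setup()
{
  // Hold the wake pin high so a FALLING edge is a deliberate event.
  pinMode(wakeUpPin, INPUT_PULLUP);
}

void loop()
{
  // First overload: sleep until a FALLING edge on D0 (no timeout):
  //   Spark.sleep(wakeUpPin, FALLING);

  // Second overload: sleep until a FALLING edge on D0 OR 60 seconds
  // elapse, whichever comes first:
  Spark.sleep(wakeUpPin, FALLING, 60);
}
```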

Also @peekay123, I have removed the second reset after wake-up from sleep and instead just re-enabled the system clocks (HSE and PLL).

The latest commit is over here:

8 Likes

Does the stop mode retain all the system variables when woken up? Or does it function in the same way as the deep sleep mode where the system restarts?

Thanks

I have tried out a minimal test of this new sleep code:

#include "application.h"

int led2 = D1;
int intPin = D0;
SYSTEM_MODE(MANUAL);

void setup()
{
  pinMode(led2, OUTPUT);
  Spark.sleep(intPin, FALLING, 60);
}
void loop()
{
  digitalWrite(led2, HIGH);
  delay(1000);
  digitalWrite(led2, LOW);
}

I locally compiled and flashed it with the new firmware, and my Core simply flashes green three times fast, then white once long, and immediately repeats. What am I doing wrong?

1 Like

Hi @bpr,

I guess you are seeing this problem because of the IWDG reset, which was enabled in the bootloader on newer versions of the Core (Run 2 and Run 3).

I made an update to the bootloader today so that it does NOT enable IWDG when the Stop Mode flag is set.
Here is the commit: spark/bootloader@e5792b1

I also created a bootloader-patch firmware which will easily enable anyone in the field to update to the latest bootloader without using a JTAG programmer. Here is the core-firmware branch for this:


However, take care not to disturb the patch update process by accidentally removing USB power or resetting the Core.
The procedure to apply the bootloader patch is:
1) cd core-firmware
2) git pull
3) git checkout bootloader-patch-update
4) cd build (no need to run make)
5) Enter USB DFU/bootloader mode => yellow flashes
6) dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin
7) The bootloader should be updated within a second or two, and the normal core-firmware should start executing.
8) Congrats! Your Core is now loaded with the latest bootloader without the hassle of going through ST-Link JTAG programming.

For those with Run 2 and Run 3 production Cores, the above-mentioned process should get "wakeup from sleep with interrupt" working, i.e. the IWDG should no longer reset the Core after 5 seconds.

Be brave and try it out on one of your Cores :smile:

I tried the code you posted above, and it worked fine after the bootloader fix.

Thanks!
Satish

6 Likes

Wonderful!!! That got it working. Thanks! But I don't totally understand the behavior. When Spark.sleep(intPin, FALLING, 60); is in setup(), loop() never runs. If I put it in loop(), the loop code doesn't run unless the sleep call is placed after all other code. I don't understand why.

The core will always restart after waking up and the first section of code to run is setup() before loop(). :wink:
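To make that concrete, here is a sketch of the pattern that follows from the restart behavior: do the per-wake work first, then sleep as the last statement of loop() (this reuses the same D0/D1 wiring as the earlier example; the pins and timing are illustrative):

```cpp
#include "application.h"

int led2 = D1;
int intPin = D0;
SYSTEM_MODE(MANUAL);

void setup()
{
  pinMode(led2, OUTPUT);
}

void loop()
{
  // This work runs once per wake-up, because the Core restarts
  // (setup() then loop()) after waking from sleep.
  digitalWrite(led2, HIGH);
  delay(1000);
  digitalWrite(led2, LOW);

  // Sleep last: wake on a FALLING edge on D0, or after 60 seconds.
  Spark.sleep(intPin, FALLING, 60);
}
```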

2 Likes

We now reset before entering Stop mode (triggered by Spark.sleep) because IWDG is enabled by default. The watchdog and sleep can't work together, and there is no way to disable IWDG except through a reset.

Probably we should now do this (proposal):
1) For an always-on Core (i.e. no Spark.sleep in application code), enable IWDG by default.
2) For a Core that sleeps based on the user application (typically battery-powered applications), do not enable IWDG. This makes full use of stop mode: no reset when entering stop mode, state variables are retained, and execution resumes from the same place where it last went to sleep.

5 Likes

Hey, great topic guys. I am new to the Spark and was just wondering, which pins are used for the interrupts? I know on Arduinos they are D2 and D3. Is it the same for these units? Thanks!

@mrmills129, the answer to your question is already found in the documentation here:

http://docs.spark.io/firmware/#spark-sleep :wink:

2 Likes

@kennethlimcp Thanks! I looked through those docs but was expecting it to be detailed in the hardware section on the pinout diagram. Thanks for taking the time!

This is a newly added feature, about a month old, so it's not shown in the pin-out diagrams yet :wink:

I am very interested in this feature - but having some issues getting through all the steps to get the modified bootloader on my Spark Core. Is there a prebuilt bootloader that I can transfer, rather than completely rebuilding?

Look for the branch; the .bin file is in the build folder.

Sorry, but I'm on the go, so I'll drop the links here later if you can't find it. :slight_smile:

I think I simply used https://github.com/spark/bootloader/tree/master/build and the instructions above (satishgn 21 Aug)

Thanks for the help everybody; I was able to get the firmware loaded on the Spark Core. It appears that both the RISING and FALLING modes work with sleep mode; however, the CHANGE mode doesn't (it just reboots). Has anybody used the CHANGE mode?

I have just tried to use the new sleep interrupt modes and got the same results: RISING and FALLING modes work fine but the CHANGE mode keeps rebooting the core. I am using the sleep/interrupt mode for a Home Automation magnetic contact detector (battery powered, for intrusion detection) and actually need the CHANGE mode, since it is required to detect both “door open” (FALLING) and “door closed” (RISING) states.

Hello guys!

I missed this thread and haven't worked on this, but my guess is that something is going on with attachInterrupt(pin, function, mode);

http://docs.spark.io/firmware/#interrupts-attachinterrupt

Can you try some simple code with this function in your application and observe the behavior?
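A minimal CHANGE-mode test along those lines might look like this (a sketch only; the pin choices are illustrative, with D7 assumed to be the on-board LED):

```cpp
#include "application.h"

int intPin = D0;
int led = D7;            // on-board LED, for visible feedback
volatile int edges = 0;

void onChange()
{
  edges++;                        // fires on both rising and falling edges
  digitalWrite(led, edges & 1);   // toggle the LED on each edge
}

void setup()
{
  pinMode(led, OUTPUT);
  pinMode(intPin, INPUT_PULLUP);
  attachInterrupt(intPin, onChange, CHANGE);
}

void loop()
{
}
```

If the LED toggles reliably on both edges here but the Core still reboots when CHANGE is passed to Spark.sleep, that would point at the sleep/wake path rather than the interrupt configuration itself.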

I can probably hook up a scope and see how this goes in two hours :smiley:

I have a theory on what might be happening: I used the internal pull-up resistor on my input. Is it possible that it is floating and causing a false trigger? I will test it later this week; I just haven't found time so far…
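If the internal pull-up is the culprit, one thing to try (purely a guess, not a confirmed fix) is to explicitly re-enable the pull-up and let the line settle just before sleeping:

```cpp
#include "application.h"

int intPin = D0;  // example wake pin

void setup()
{
  pinMode(intPin, INPUT_PULLUP);  // hold the line high via internal pull-up
  delay(10);                      // give the pin a moment to settle
  Spark.sleep(intPin, FALLING, 60);
}

void loop()
{
}
```

An external pull-up resistor (e.g. 10k to 3V3) would also rule out the internal pull-up being disabled during stop mode.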