Testing IRQs on the Spark core

I’m posting some code based on a conversation in the #spark channel on freenode. I’d been playing around with interrupts and ran into some trouble with a button; it turns out the problem was the buttons themselves (most likely contact bounce), not the Core. The following code shows that the interrupt system works just fine.

// irqCount is volatile because it is modified inside the interrupt handler.
volatile int irqCount;
int setCount;
void irq (void);

void setup() {

    // The C-style cast drops the volatile qualifier for the void * parameter.
    Spark.variable ("irqCount", (void *)&irqCount, INT);
    Spark.variable ("setCount", &setCount, INT);

    pinMode (D0, OUTPUT);   // pulse source; jumper D0 to D3 externally
    pinMode (D1, INPUT);    // pull high to run the test
    pinMode (D2, INPUT);    // pull high to reset both counters
    pinMode (D7, OUTPUT);   // onboard LED, shows test activity
    
    attachInterrupt (D3, irq, FALLING);     // D3 counts falling edges from D0
    
    digitalWrite (D0, HIGH);
    
    irqCount = 0;
    setCount = 0;
}

void loop() {

    if (HIGH == digitalRead (D1))
    {
        digitalWrite (D7, HIGH);
        digitalWrite (D0, LOW);     // one falling edge on D3 via the jumper
        setCount++;
        digitalWrite (D0, HIGH);    // re-arm for the next pulse
    }
    else
    {
        digitalWrite (D7, LOW);
    }
    
    if (HIGH == digitalRead (D2))
    {
        digitalWrite (D7, HIGH);
        irqCount = 0;
        setCount = 0;
    }
}

void irq (void)
{
    irqCount++;
}

Process:

Jumper D0 to D3, so each pulse generated in loop() produces one falling edge on the interrupt pin.
Pull D1 high for some period of time.
Read irqCount and setCount. They should be the same, since every pulse increments setCount in the loop and fires the ISR exactly once.
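
To read the counters, hit the cloud variable endpoint; something like GET https://api.spark.io/v1/devices/<device-id>/irqCount?access_token=<token> should do it (same for setCount).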

In my tests, they were the same, every time.
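
And for anyone hitting the same button problem I did: a bouncing contact can fire the interrupt several times per press. A minimal software-debounce sketch; the 50 ms window and the INPUT_PULLUP wiring are just assumptions for illustration:

volatile int presses = 0;
volatile unsigned long lastEdge = 0;

void buttonIrq (void)
{
    unsigned long now = millis ();

    // Ignore edges that arrive within 50 ms of the previous one;
    // a bouncing contact produces bursts of edges much closer together.
    if (now - lastEdge > 50)
        presses++;

    lastEdge = now;
}

void setup()
{
    pinMode (D3, INPUT_PULLUP);     // button from D3 to GND
    attachInterrupt (D3, buttonIrq, FALLING);
}

void loop()
{
    // read 'presses' however you like, e.g. expose it as a cloud variable
}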

Just thought that might help.


Thanks @mgssnr for reporting back!

Ditto, I have been able to successfully use interrupts on the :spark:. I have a DHT11 working in non-blocking mode using interrupts.
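
In case it helps anyone, the approach looks roughly like this. This is a simplified, untested sketch rather than my actual code: the pin choice, the 100 µs threshold, and the edge bookkeeping are assumptions based on the DHT11 datasheet timing (a 0 bit is roughly a 78 µs falling-to-falling gap, a 1 bit roughly 120 µs):

#define DHT_PIN D2

volatile uint8_t dhtData[5];
volatile int dhtBit;                // -2/-1 skip the sensor's response pulse
volatile unsigned long dhtLastEdge;
volatile bool dhtDone;

void dhtEdge (void)
{
    unsigned long now = micros ();
    unsigned long gap = now - dhtLastEdge;
    dhtLastEdge = now;

    if (dhtBit >= 0 && dhtBit < 40)
    {
        // Falling-to-falling gap: ~78 us means a 0 bit, ~120 us means a 1.
        if (gap > 100)
            dhtData[dhtBit / 8] |= (0x80 >> (dhtBit % 8));

        if (++dhtBit == 40)
            dhtDone = true;
    }
    else
    {
        dhtBit++;                   // first two edges are the response pulse
    }
}

void dhtStart (void)
{
    memset ((void *)dhtData, 0, sizeof (dhtData));
    dhtBit = -2;
    dhtDone = false;

    pinMode (DHT_PIN, OUTPUT);      // host start signal: hold the line low
    digitalWrite (DHT_PIN, LOW);
    delay (20);                     // >18 ms per the DHT11 datasheet
    pinMode (DHT_PIN, INPUT);       // release the bus to the sensor
    dhtLastEdge = micros ();
    attachInterrupt (DHT_PIN, dhtEdge, FALLING);
}

void setup()
{
    Serial.begin (9600);
    dhtStart ();
}

void loop()
{
    if (dhtDone)
    {
        detachInterrupt (DHT_PIN);
        dhtDone = false;

        // dhtData[4] is a checksum: the low byte of the sum of the first four.
        if ((uint8_t)(dhtData[0] + dhtData[1] + dhtData[2] + dhtData[3]) == dhtData[4])
        {
            Serial.print ("Humidity: ");
            Serial.print (dhtData[0]);
            Serial.print ("%  Temp: ");
            Serial.print (dhtData[2]);
            Serial.println ("C");
        }

        delay (2000);               // DHT11 wants ~2 s between reads
        dhtStart ();
    }
}

Since the bit timing happens entirely in the ISR, loop() only has to notice the finished flag, so nothing blocks while a read is in flight.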