Oh right, interrupt priority; I missed that bit in the original post.
It doesn't look like there's any function to do that in the Spark library. Here's the relevant code from spark_wiring_interrupts.cpp; the interrupt priority is hardcoded based on the pin number:
if (GPIO_PinSource < 10) //Should not try changing the priority of EXTI15_10_IRQn
{
    //select NVIC channel to configure
    NVIC_InitStructure.NVIC_IRQChannel = GPIO_IRQn[GPIO_PinSource];

    if (GPIO_PinSource > 4)
        NVIC_InitStructure.NVIC_IRQChannelPreemptionPriority = 14;
    else
        NVIC_InitStructure.NVIC_IRQChannelPreemptionPriority = 13;

    NVIC_InitStructure.NVIC_IRQChannelSubPriority = 0;

    //enable IRQ channel
    NVIC_InitStructure.NVIC_IRQChannelCmd = ENABLE;

    //update NVIC registers
    NVIC_Init(&NVIC_InitStructure);
}
The NVIC functions are all part of the chip's standard peripheral driver library, which I've always found somewhat confusing to use, but you may be able to re-initialize the priority levels yourself by calling the NVIC functions again after Spark's attachInterrupt().
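Something along these lines might work. This is an untested sketch, not code from the Spark firmware: the function name is made up, the IRQ channel (EXTI3_IRQn) assumes your interrupt pin maps to EXTI line 3, and the priority value 5 is an arbitrary example that's just "more urgent" than Spark's defaults of 13/14.

```c
#include "stm32f10x.h" // STM32F1 standard peripheral library headers

// Hypothetical helper: call this AFTER attachInterrupt(), since
// attachInterrupt() re-initializes the channel with priority 13/14.
void raiseExtiPriority(void)
{
    NVIC_InitTypeDef NVIC_InitStructure;

    // Pick the IRQn matching your pin's EXTI line:
    // EXTI0_IRQn..EXTI4_IRQn for lines 0-4, EXTI9_5_IRQn for lines 5-9.
    NVIC_InitStructure.NVIC_IRQChannel = EXTI3_IRQn;

    // Lower number = higher urgency on Cortex-M; 5 is an example value,
    // chosen only to beat Spark's hardcoded 13/14.
    NVIC_InitStructure.NVIC_IRQChannelPreemptionPriority = 5;
    NVIC_InitStructure.NVIC_IRQChannelSubPriority = 0;

    NVIC_InitStructure.NVIC_IRQChannelCmd = ENABLE;
    NVIC_Init(&NVIC_InitStructure); // re-writes the NVIC priority registers
}
```

One caution: the firmware presumably picked 13/14 so that pin interrupts don't preempt its own system interrupts (timers, USB, the TCP stack, etc.), so raising the priority too aggressively could cause strange behavior elsewhere.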
This site has some auto-generated documentation based on the NVIC functions' source files; hopefully it will at least get you started.
Edit: the site linked above shows the documentation for the STM32 F4 series chips, whereas the SparkCore uses an F1-series chip, so you may want to poke around a bit more to see if you can find documentation for the F1. I doubt it will be significantly different, if at all, though.