#Include CapacitiveSensor library

Hi all,

I want to use the CapacitiveSensor.h library to make an interactive mural using bare conductive paint as a sensor. I tested the library on my Arduino and it worked fine but my programming skills are not good enough to figure out how I can get it working with my Spark Core.

Is there anybody that can tell me how I can get this working?

marcapon, based on the bill of materials, the Spark uses the STM32F103CBT6 processor. The “103” describes:

103 = Cortex-M3 mainstream, 72MHz CPU, up to 1MB flash with motor control, USB and CAN

This processor does not have capacitive touch input compatible pins. To do capacitive touch, you could add a touch sensor breakout board like this one from Adafruit which can work with I2C or SPI and has 8 inputs. The I2C mode uses only 2 wires and I could help you adapt the arduino library. :smile:

Actually, I’m pretty sure the Arduino doesn’t have true touch-sensitive input pins either… that’s the point of the CapacitiveSensor library. It uses two pins, a very high-value resistor, and the millis() function to time/measure the capacitance on a specified pin. I’m using it in a project right now, actually (with a 25M ohm resistor). So I feel like the library should work on the Spark Core, because it doesn’t depend on built-in capacitive sensing ability.

Porting this over doesn’t seem difficult, since there are no timers or interrupts in the way.

Would you like me to give it a try? :slight_smile:

EDIT: Some Arduino-specific code needs to be worked on and we are good to go…

@david_s5 I need some input from you or anyone else on how to read the system clock frequency.

Arduino uses ‘F_CPU’, and I found SYSCLK_FREQ_24MHz for the STM32.

Is there any chance I can call it directly, or is there another variable for the system clock?

kennethlimcp, the system clock define you are looking for is SYSCLK_FREQ_72MHz and it is defined as:

#define SYSCLK_FREQ_72MHz  72000000


Is there something I must #include to use it?

Using it directly in the .h file didn’t work. What if the user changes the frequency in the future? This value doesn’t change, right?

Wondering if there is an F_CPU replacement for the Spark Core :smile:

Thanks @peekay123!

kennethlimcp, I believe SystemCoreClock contains the clock frequency value. It is a uint32_t variable.

Thanks! Worked perfectly :smile:

Just the porting over of the specific functions left!

These 4 Arduino functions are hard to tackle!

They are macros in Arduino.h, and currently I cannot find equivalent functions in the core-firmware:

sBit =  digitalPinToBitMask(sendPin);			
sPort = digitalPinToPort(sendPin);
sReg = portModeRegister(sPort);
sOut = portOutputRegister(sPort);


I chopped these out of some library recently with success… let me see if I can find that again…

kennethlimcp, BDub, these are the replacement commands:

PIN_MAP[PIN].gpio_peripheral->BSRR = PIN_MAP[PIN].gpio_pin; // PIN High

PIN_MAP[PIN].gpio_peripheral->BRR = PIN_MAP[PIN].gpio_pin;  // PIN Low


Thanks! But that’s only like 1/4 solved hahahaha.

Been digging through the firmware files to find the equivalents for the rest…

@peekay123 Yes, you need those and more… to set the pin direction, pull-ups or not, etc. Reading inputs is done a little differently.

These are basically direct port manipulation. I have some examples of that here on the Spark Core: https://gist.github.com/technobly/8342185 and here: https://gist.github.com/technobly/8573877

Looking at the library, it appears they are using direct port manipulation on the Arduino to gain speed in setting the pins to various states, then waiting for the capacitance on the input to do its thing, charging/discharging. They use a tight loop with a counter for fast timing/calibration of these inputs.

Really to be able to convert this, you are going to need to understand how it works on the Arduino first. Then you’ll have to determine the best way to do it on the Core. You might not have to use direct port manipulation… and you might be able to get away with digitalWrite, pinMode, etc… it’s unclear until their timing is understood.

This is obviously going to need to be changed:
loopTimingFactor = 310; // determined empirically - a hack

But to what? You won’t know until you see what the timing is on an Arduino. Got a scope or a logic analyzer?

@kennethlimcp also take a look at the Spark Wiring for how stuff is implemented low level:

@BDub thanks for so much information!

I looked at the code that requires portModeRegister()/portOutputRegister() etc. and was thinking we can get away with using functions that directly set the pins high/low and set their modes (PULLUP/PULLDOWN etc.), like you mentioned.

But I can’t tell if that’s the best way to port the library over.

I don’t own a scope but it’s available in my university :slight_smile:

Just trying to help out with my limited knowledge and learn along the way :spark:

OK, so I didn’t have enough coffee this morning and I missed some stuff! Ok, a lot of stuff!! :open_mouth:

kennethlimcp, I looked at Paul Stoffregen’s code for the capacitance sensor and I believe he uses direct port manipulation to charge/discharge the intrinsic capacitance of the pins as quickly as possible. Is it necessary? The code was originally written for a 16MHz Arduino, so at 72MHz the standard Spark library commands may work just fine.

I believe the loopTimingFactor value is based entirely on the capacitance of the Spark pins while in input mode. It affects the timeout of the sensor sampling loops. So if a sensor INPUT is not changing from low to high (ie. charging due to touch) then the loop times out and returns an error. The timeout var is called CS_Timeout_Millis but its value for F_CPU of 16000000 is 620,000! Looking at the code, it is a tight loop so the timeout must reflect the while() processing time * number of loops to get a delay of N milliseconds, where N is not explained by Paul. So you may have to increase that delay given the Spark processor speed, thus the “hack”. :smile:

@peekay123 That’s a good point to note!

I guess that’s going to be a factor once I can get the code running the way it should (i.e. changing from high to low and pull-up/pull-down).

I’ll see how I can change those parts without touching the registers while behaving the same way.

Really thankful for all the inputs! :smiley:

To me it seems wrong that it would take many, many milliseconds to charge and discharge the parasitic capacitance on the inputs/flying wires… but millis() does work its way into the loop max timing, so it’s somewhere in the millisecond range for a timeout.

That said, you should play with capturing the timing of the functions on an Arduino to understand where everything stands.

BTW, digitalWrite() takes about 2us to execute on the Spark Core. I want to say the last time I checked on the Arduino it was in the 5-6us range?

Don’t forget we have the micros() timer. It might also be possible to do something with attachInterrupt() but it’s limited to only so many pins.

BDub, the timeout loop is very tight and I suspect each pass takes about 15us or so. But note that with the numbers as they are, 620,000 * 15us works out to about 9.3 seconds, not milliseconds, so either each pass is much faster than that or the timeout really is that generous. It does seem long but, like I pointed out, the capacitance of the input pin is the key here. The point is that WITH a touch, it should never get there. I would use micros() as you mentioned and start with a much shorter timeout, say 10ms.