Spark-bouncer - Cloud+RFID based door access control - configure rules via cloud calls, handle >3000 keys on flash storage, live logging via publish, RFID OTP read/write/compares, bling!

Dear spark community,

I would love to share my fresh open source Spark based project with you - to invite you to contribute, use, share and love the spark-bouncer!

With big thanks to @mdma for his flashee-eeprom library and to the folks in the "getting the RFID-RC522 to work" thread.

spark-bouncer logo


spark-bouncer is a security focused door access control system built on top of the Spark Core platform.

It utilizes the user flash memory to juggle up to 3000 RFID keys. Configuration happens via cloud based function calls.

Security is provided by one-time passwords for each key usage, making your door immune to serial number spoofing attacks.

Your team is allowed to get in early, the crowd a bit later? No worries, the spark-bouncer keeps an eye on precise timing!

You plan to embed a flexible door access control into your existing infrastructure? The spark-bouncer is API driven!

Hook yourself into the live log event stream or query its persistently stored circular buffer.

Connect a relay to your electric strike and place a button on the inside to manually open the door, gentleman style.

Buzzing yourself in is just an API call away.


Get started

Breadboard the parts together as described in the header and boot it up!

The code is currently optimized to compile locally, outside of the cloud. If you'd just like to test it without a local build environment, flash the included firmware.bin to skip that setup.

If this is the first time you are running the spark-bouncer, your flash memory needs to be initialized:

$ spark call [core-id] reset

Hold a compatible RFID key to the reader - nothing will happen, yet!
Query the log and store your key’s serial number:

$ spark get [core-id] log
$ spark call [core-id] update "aa:bb:cc:dd;*;active,otp"

Try your RFID key again - the relay should make a happy noise.

Let’s see what has happened:

$ spark get [core-id] log

After the key wasn’t found in the first place (NOT_FOUND), we updated it (UPDATED) - and granted access at the end (OPEN)!


Bouncer, let me in!

By calling the published open function, you’ll get an instant buzz.


$ spark call [core-id] open

Configure RFID access

The spark-bouncer stores up to 3000 users, each identified by a 4 to 10 byte RFID serial number.

Store RFID key

You have to define whom to let in and at which times. To do so, call the published update function with the following argument:

[key serial];[time restrictions];[flags]

Format used in the fields:

  • key serial - aa:bb:cc[:…] - up to 10 hex values separated by colons
  • time restrictions
    • * -> open at all times
    • - -> never open
    • up to seven 4 byte hex values to define the specific valid hours per weekday
  • flags - comma separated list; a flag is treated as false if not present
    • otp -> enable One Time Passwords for this key [recommended]
    • active -> mark key as active - mandatory for getting in
    • lost -> marks key as lost - won’t get you in anymore
    • reset -> resets the stored OTP in case something went wrong

The call returns

  • 1 if all went well
  • -1 if the key couldn’t be stored


$ spark call [core-id] update "aa:bb:cc:dd;*;active,otp"
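For illustration, the update argument from the previous paragraphs can be assembled programmatically. This is a hypothetical helper, not part of the spark-bouncer source - it just sketches the `[key serial];[time restrictions];[flags]` format:

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Hypothetical helper (not from the spark-bouncer firmware): builds the
// "[key serial];[time restrictions];[flags]" argument for the update call.
// The serial bytes are rendered as lowercase hex values joined by colons.
std::string buildUpdateArg(const uint8_t *serial, size_t len,
                           const std::string &times, const std::string &flags) {
    std::string arg;
    char buf[4];
    for (size_t i = 0; i < len; ++i) {
        std::snprintf(buf, sizeof(buf), "%02x", serial[i]);
        if (i) arg += ":";
        arg += buf;
    }
    arg += ";" + times + ";" + flags;
    return arg;
}
```

For a key `aa:bb:cc:dd` with unrestricted hours and the `active,otp` flags, this yields exactly the string passed in the spark call above.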

Time based access

Each hour of a weekday is mapped to a bit in a 4-byte value. Setting a bit to 1 grants access during the corresponding hour.


  • For the time between 16h and 17h, bit 16 must be set (0x10000).
  • For full day access, set all bits high (0xFFFFFFFF).
  • To grant access all of Monday and Sunday, but otherwise only buzz in between 16h-17h and 0h-4h on Tuesdays:
$ spark call [core-id] update "aa:bb:cc:dd;FFFFFFFF 1000F 0 0 0 0 FFFFFFFF;active,otp"
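The hour-to-bit mapping is easy to get wrong by hand, so here is a small sketch of it (these helpers are illustrative, not part of the firmware):

```cpp
#include <cstdint>

// Illustrative helpers (not from the spark-bouncer source): bit N of a
// weekday's 4-byte value grants access between N h and N+1 h.
uint32_t hourBit(int hour) { return 1u << hour; }

// Build a mask granting access for every hour in [from, to).
uint32_t hourRange(int from, int to) {
    uint32_t mask = 0;
    for (int h = from; h < to; ++h) mask |= hourBit(h);
    return mask;
}
```

Combining `hourBit(16)` with `hourRange(0, 4)` reproduces the Tuesday value `1000F` from the example call.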


Data format

All logging data is returned as a semicolon separated list. The included elements are:

[timestamp];[key serial];[event code]

Event codes

Code | Event | Triggered when?
0 | NOT_FOUND | scanned RFID key is not stored yet
1 | OPEN | door access granted
2 | OUT_OF_HOURS | valid key but not good for now
3 | DISABLED | usage of a key which is not flagged active
4 | LOST | key is flagged as lost
5 | OTP_MISSMATCH | possible hijack attempt; removes the key’s active flag
8 | STORAGE_FULL | very unlikely, but hey, here’s an error in case more than 3000 keys got stored
9 | UPDATED | key data got updated via update call
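To show how a consumer might handle the data format and event codes above, here is a hypothetical parser (not part of the firmware; the sample timestamp is made up):

```cpp
#include <sstream>
#include <string>

// Hypothetical log-entry parser for the "[timestamp];[key serial];[event code]"
// format described above. Not part of the spark-bouncer source.
struct LogEntry {
    std::string timestamp;
    std::string serial;
    int code;
};

LogEntry parseLogEntry(const std::string &line) {
    std::istringstream in(line);
    LogEntry e;
    std::string code;
    std::getline(in, e.timestamp, ';');
    std::getline(in, e.serial, ';');
    std::getline(in, code, ';');
    e.code = std::stoi(code);
    return e;
}

// Event names matching the code table above.
const char *eventName(int code) {
    switch (code) {
        case 0: return "NOT_FOUND";
        case 1: return "OPEN";
        case 2: return "OUT_OF_HOURS";
        case 3: return "DISABLED";
        case 4: return "LOST";
        case 5: return "OTP_MISSMATCH";
        case 8: return "STORAGE_FULL";
        case 9: return "UPDATED";
        default: return "UNKNOWN";
    }
}
```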

Subscribing to the live log

The spark-bouncer is publishing all key usages to the Spark Cloud event system as private events.

Example subscription:

$ spark subscribe "" [core-id]

Published events:

  • card - after key handling or updating, data based on data format
  • button - when manual buzzer button is pressed
  • call - when door is opened via the Spark Cloud

Query the most recent events via the cloud

The Spark Cloud allows querying runtime variables with a maximum length of 622 bytes.

The spark-bouncer always keeps an internal buffer up to date with the most recent log entries.

Published variables:

  • log - containing as many data format entries, most recent first, as it can hold.

Example query:

$ spark get [core-id] log


To control the Spark Core’s debug output, call the published debug function with either

  • 1 -> to enable serial debug output, or
  • 0 -> to disable serial debug output

The debug mode can be enabled by default in the top of the code.


$ spark call [core-id] debug 1
$ spark serial monitor
Opening serial monitor for com port: "/dev/cu.usbmodemfa131"
[rfid] identifying f3:65:1d:bc
[flash] Key found, index #0
-- Active? yes
-- Lost? no
-- Times:
          Monday   Tuesday  Wednesday  Thursday   Friday   Saturday   Sunday
 0 h                                      *                               
 1 h                                      *                               
 2 h                                      *                               
 3 h                                      *                               
 4 h                                      *                               
 5 h                                                                      
 6 h                                                                      
 7 h                                                                      
 8 h                                                                      
 9 h        *         *                                                   
10 h        *         *                                                   
11 h        *         *                                       *         * 
12 h        *         *                                       *         * 
13 h        *         *                                       *         * 
14 h        *         *                                       *         * 
15 h        *         *                                       *         * 
16 h        *         *                                       *         * 
17 h        *         *                                      (*)        * 
18 h        *         *                                       *         * 
19 h        *         *                                       *         * 
20 h        *         *         *                                         
21 h                            *                                         
22 h                            *                                         
23 h                            *                                         

-- last update of user configuration: Sat Sep 13 21:32:47 2014
-- last seen: Sat Sep 13 21:44:06 2014

-- OTP:      64 39 2C BC 4A F6 62 04 B1 FF 49 D0 58 2B F4 E3
OTP on Chip: 64 39 2C BC 4A F6 62 04 B1 FF 49 D0 58 2B F4 E3
New OTP:     DA 29 14 1D 37 12 7D 56 04 84 24 A6 49 E0 CA 67
[card] hours match, opening!
[door] opening
[door] closing

Reset storage

Be careful - but if you need to reset your storage during development, call the published reset function. Your spark-bouncer will forget everything it knew.

It’s recommended to disable this function in production environments.

That’s it :slight_smile: Thanks for your patience and interest.


this is AWESOME. thanks for sharing! want to make a hackster page??


Dude, that’s so RAD! :smile: Congrats on completing a very cool project and thanks for sharing!

The days of one file applications are back as well, haha… pretty sweet man. Even though there’s not much to see, I’d like to see a video of it in action.

Glad you guys like it :wink:

The system is already deployed in a local citizen movement’s HQ with 80 keys circulating, running pretty stable and smooth. We got an Android app to scan the keys (NFC for the win), assign a name + hours to them, store it all in a MongoDB with the server pushing the keys to the spark-bouncer as soon as a user’s data got updated.

Will create a small video of the installation in a few days - look forward to the musical harmony of a buzzing AC electric strike! :dancers: :dancer: :smiley:


Do you have some photos? I’m intending to share about your project so pictures will be awesome! :smiley:


Awww, proud blushing happening here :blush:
Will upload some to this thread in the next 24h, crazily distracted with a beautiful project right now :sweat_smile:

Actually… :slight_smile: We got an introduction for volunteers of our Transition Town Give-away shop featuring it:

Enjoy the German! :smiley: :sunflower:

Will upload some pictures of the tech soon…


Is this project still active?

Thinking of resurrecting my SparkCore for this project and remembering how to use it!

Does it play nicely with the cloud compiler “out of the box”?

Does it work if it loses Internet connectivity?

One more question… If I had two separate setups would the same rfid tag work in both or would the OTP then not be in sync?

@gorstj, as @rastapasta points out in the github repo:

The code is currently optimized to locally compile outside of the cloud.

Also, from the looks of it, it may not work without Cloud connectivity as it runs in SYSTEM_MODE(AUTOMATIC), which will “hang” the user app while the firmware attempts to reconnect to the Cloud. The code would have to be modified to take advantage of the SEMI_AUTOMATIC and MANUAL system modes to allow it to keep working without Cloud connectivity.

As for the OTP, looking at the code, once a card is scanned, a unique OTP is written to it. If you use the card on another setup, then the OTP will get overwritten. Ideally, you would duplicate the stored key data to the second setup which would then not see the card as new. :smile:

Thanks peekay.

Spark has changed a lot since I used it last… the libraries structure etc. wasn’t there last time I was around!

I think I am having problems with it compiling due to the libraries? When I compile in the WebIDE I get the following error. Any ideas? I have changed the include statements to the following when I added the libraries to the project in the WebIDE:

#include "flashee-eeprom/flashee-eeprom.h"
#include "MFRC522/MFRC522.h"
In file included from ../inc/spark_wiring.h:29:0,
from ../inc/application.h:29,
from MFRC522/MFRC522.h:77,
from MFRC522/MFRC522.cpp:8:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
MFRC522/MFRC522.cpp: In member function 'bool MFRC522::MIFARE_UnbrickUidSector(bool)':
MFRC522/MFRC522.cpp:1629:1: warning: control reaches end of non-void function [-Wreturn-type]
MFRC522/MFRC522.cpp: In member function 'byte MFRC522::PCD_CommunicateWithPICC(byte, byte, byte*, byte, byte*, byte*, byte*, byte, bool)':
MFRC522/MFRC522.cpp:379:20: warning: '_validBits' may be used uninitialized in this function [-Wmaybe-uninitialized]
if (*backLen < 2 || _validBits != 0) {
MFRC522/MFRC522.cpp: In member function 'void MFRC522::PICC_DumpMifareClassicSectorToSerial(MFRC522::Uid*, MFRC522::MIFARE_Key*, byte)':
MFRC522/MFRC522.cpp:1362:4: warning: 'invertedError' may be used uninitialized in this function [-Wmaybe-uninitialized]
if (invertedError) {
In file included from ../inc/spark_wiring.h:29:0,
from ../inc/application.h:29,
from flashee-eeprom/flashee-eeprom.h:22,
from flashee-eeprom/flashee-eeprom.cpp:17:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
In file included from ../inc/spark_wiring.h:29:0,
from ../inc/application.h:29,
from spark-bouncer.cpp:33:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
spark-bouncer.cpp:41:17: error: 'user_t' was not declared in this scope
int checkAccess(user_t &user);
spark-bouncer.cpp:41:25: error: 'user' was not declared in this scope
int checkAccess(user_t &user);
spark-bouncer.cpp:42:15: error: variable or field 'saveUser' declared void
void saveUser(user_t &user, uint16_t keyId);
spark-bouncer.cpp:42:15: error: 'user_t' was not declared in this scope
spark-bouncer.cpp:42:23: error: 'user' was not declared in this scope
void saveUser(user_t &user, uint16_t keyId);
spark-bouncer.cpp:42:38: error: expected primary-expression before 'keyId'
void saveUser(user_t &user, uint16_t keyId);
spark-bouncer.cpp:43:1: error: 'user_t' does not name a type
user_t readUser(uint16_t keyId);
spark-bouncer.cpp:44:15: error: variable or field 'dumpUser' declared void
void dumpUser(user_t &user);
spark-bouncer.cpp:44:15: error: 'user_t' was not declared in this scope
spark-bouncer.cpp:44:23: error: 'user' was not declared in this scope
void dumpUser(user_t &user);
spark-bouncer.cpp:112:25: error: 'int checkAccess(user_t&)' redeclared as different kind of symbol
} user_t;
spark-bouncer.cpp:41:5: error: previous declaration of 'int checkAccess'
int checkAccess(user_t &user);
spark-bouncer.cpp: In function 'void rfidIdentify()':
spark-bouncer.cpp:466:31: error: 'checkAccess' cannot be used as a function
if (debugMode) {
spark-bouncer.cpp: In function 'int checkAccess(user_t&)':
spark-bouncer.cpp:524:29: error: 'int checkAccess(user_t&)' redeclared as different kind of symbol
memcpy(target, buffer, 16);
spark-bouncer.cpp:41:5: error: previous declaration of 'int checkAccess'
int checkAccess(user_t &user);
make: *** [spark-bouncer.o] Error 1

Error: Could not compile. Please review your code.

p.s. does anyone have the name of the software this forum runs on so I can find some instructions… I can’t seem to be able to quote code without some of the text being huge, as above.

Look for the Discourse forum software :wink:

Have a look in this thread

The “huge” font comes from the use of # outside of a dedicated code block.

Very odd… I have got it to compile without modification using the Spark Dev app on a Mac by just adding the libraries into the same folder.

Using the Web IDE creates the errors above… very odd… don’t they both use the same Cloud compiler backend?

Just waiting for my RFID reader to turn up from ebay!

@gorstj, the web IDE requires a different path statement for libraries. Using Spark CLI or DEV leverages the cloud compiler but without the path issues. I pretty well only use CLI/DEV or a local toolchain for compiling now. :smile:

BTW: I totally forgot I have a bunch of these readers and I will be trying out this project as well :stuck_out_tongue:

Just for future information what should the path statement be for the library?

I tried this (as inserted by the web IDE when I added the library):

#include "flashee-eeprom/flashee-eeprom.h"
#include "MFRC522/MFRC522.h"

@gorstj, both these libraries are available on the web IDE, so including them will automatically generate the correct #include statement, which you can then replace the original (non-working) one with. Nonetheless, your syntax looks correct. :smile:

When you give it a whirl, would you try the Web IDE to see if you can get it working?
I’m hoping my RFID reader arrives today.

I am not overly convinced that running my own local cloud will be any more reliable than Spark’s cloud even when accounting for internet outages.

If I use SYSTEM_MODE(SEMI_AUTOMATIC) my understanding is that as soon as I try and connect to the cloud it will lock the user program until it connects? Even SYSTEM_MODE(MANUAL) will lockup e.g. if there is no internet connection?

Received my RC522 and all works good straight away!!

What really surprised me is that if I disconnected my internet the program still ran! I only tested for a couple of minutes so far.

The only problem I have now is that when the Spark Core powers up, the I/O ports transiently go high… this would cause my garage door to open! (it only needs a transient completion of the circuit to trigger an open/close cycle) Is there any way to stop the I/O transiently going high on power-up?

The only way I could think of was have two relays… one relay would be ‘normally open’ the other ‘normally closed’ and have the garage door wire running through BOTH relays. This way one of the relays has to be powered on and the other powered off for the circuit to complete which shouldn’t happen on powering up the Core. I would need to modify the code to control two output pins accordingly when I did want the door to activate.
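The two-relay idea can be sanity-checked as a pure truth table, no hardware involved (this is just a sketch of the logic, using assumed coil/contact conventions): the door wire runs in series through a normally-open (NO) relay and a normally-closed (NC) relay, so the NO contact conducts only when its coil is powered and the NC contact conducts only when its coil is unpowered.

```cpp
// Truth-table sketch of the two-relay interlock (illustrative only).
// The door circuit is complete only when the NO relay's coil is powered
// (contact closed) AND the NC relay's coil is unpowered (contact closed).
bool circuitComplete(bool noCoilPowered, bool ncCoilPowered) {
    bool noContactClosed = noCoilPowered;   // normally open: closes when powered
    bool ncContactClosed = !ncCoilPowered;  // normally closed: opens when powered
    return noContactClosed && ncContactClosed;
}
```

During the power-up glitch both pins go high together, which opens the NC contact and keeps the circuit broken; a deliberate activation drives the NO pin high and the NC pin low.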

Does anyone have any better suggestions?