Cat Live Position Tracker (iBeacon, Data Logger, Battery Powered)

Hello All,

I built a live cat tracker using the Spark Core, Estimote BLE Beacons, and a BLE Mini module from RedBear. This was my first significant Spark project, so I thought I’d share it with you.

GitHub:

Code also embedded at end of post.

I wanted to highlight the community’s efforts that really helped me put this project together quickly. Specifically @krvarma for sharing his Detect iBeacons project and @peekay123's great hardware timer library.

Unfortunately my friend’s cat didn’t take too kindly to the 50g of electronics and batteries attached to a 20g harness. I thought 70g would be no big deal, but the cat didn’t like it, so we didn’t proceed. I should have verified that before building the tech. Lol.

While the project’s vision was not fulfilled, I hope that others can pick something up from the project.

I had wanted a live (well, 10-minute-delayed) updating map of the cat’s position, as well as a heat map of where she had been in the last 8 hours. My friend’s apartment has been “mapped” by recording the beacon tx_power and rssi at many locations. These two parameters are used to calculate the distance from the beacon. Three Estimote beacons were used, spaced strategically around the apartment. Mapping data is included in the GitHub repo along with a floor plan sketch showing measurement locations.
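To make the position step concrete, here is a rough standalone sketch of how three beacon distances could be turned into an (x, y) estimate. This is my own illustration of the idea, not the code from the repo; the beacon coordinates and distances are placeholders, and the real mapping data would come from the apartment survey mentioned above:

// Given the distance to each of the three beacons (derived from tx_power and
// RSSI), estimate a 2D position by trilateration: linearize the three circle
// equations and solve the resulting 2x2 system with Cramer's rule.
#include <cstdio>

struct Point { double x, y; };

Point trilaterate(Point b1, double d1, Point b2, double d2, Point b3, double d3) {
    // Subtract circle 1 from circle 2, and circle 2 from circle 3
    double A = 2 * (b2.x - b1.x), B = 2 * (b2.y - b1.y);
    double C = d1*d1 - d2*d2 - b1.x*b1.x + b2.x*b2.x - b1.y*b1.y + b2.y*b2.y;
    double D = 2 * (b3.x - b2.x), E = 2 * (b3.y - b2.y);
    double F = d2*d2 - d3*d3 - b2.x*b2.x + b3.x*b3.x - b2.y*b2.y + b3.y*b3.y;

    double det = A * E - B * D;
    Point p = { (C * E - B * F) / det, (A * F - C * D) / det };
    return p;
}

int main() {
    // Placeholder beacon positions (meters) and measured distances
    Point p = trilaterate({0, 0}, 3.0, {5, 0}, 4.0, {0, 4}, 4.5);
    printf("estimated position: (%.2f, %.2f)\n", p.x, p.y);
    return 0;
}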

The setup should run for ~51 hours on 4 AAA batteries (calculation below). It would deep sleep for 50 seconds (~7mA), wake + scan (with Spark.sleep()) for 10 seconds (~70mA), and every 10 scans it would connect and upload using Spark.publish() (~160mA). All of these parameters are configurable as #defines at the top of the code.

Battery Life Calculation (using lithium AAA batteries (1200 mAh)):

// 7 mA, 50 seconds per run, 10 runs + 70 mA, 10 seconds per run, 9 runs + 160 mA, 30 seconds
// for 1 run to upload data
avg mA draw = (7*50*10 + 70*10*9 + 160*1*30) / (50*10 + 10*9 + 1*30) = 23.5 mA

1200 mAh / 23.5 mA = 51.1 h

So the lithium AAA batteries I was using should last for a little over 2 days.
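Since all of these timings are configurable, here is a small standalone helper for re-running the estimate with different values. It just mirrors the arithmetic above; the numbers are the ones from this post, not constants pulled from the firmware:

// Back-of-the-envelope battery estimate for a duty-cycled logger
#include <cstdio>

int main() {
    const double sleep_mA = 7,   sleep_s = 50, sleep_runs = 10; // deep sleep portion
    const double scan_mA  = 70,  scan_s  = 10, scan_runs  = 9;  // wake + scan portion
    const double pub_mA   = 160, pub_s   = 30, pub_runs   = 1;  // connect + publish portion
    const double capacity_mAh = 1200;                           // lithium AAA cells

    double charge = sleep_mA * sleep_s * sleep_runs
                  + scan_mA  * scan_s  * scan_runs
                  + pub_mA   * pub_s   * pub_runs;              // mA * seconds
    double time_s = sleep_s * sleep_runs + scan_s * scan_runs + pub_s * pub_runs;

    double avg_mA = charge / time_s;
    printf("avg draw: %.1f mA, runtime: %.1f h\n", avg_mA, capacity_mAh / avg_mA);
    return 0;
}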

Circuit:

  • 4.7k pullup between VIN and RST on BLE module
  • Spark D0 connected to RST on BLE module (so Spark Core can bring device out of reset selectively)
  • RX BLE to TX Spark and vice-versa
  • that’s it! Simple circuit.

Project Features:

  • data logging: deep_sleep, wake, collect_position_data, deep_sleep, upload periodically
  • storing data in flash, using flash as a large circular buffer
  • uploading cached data periodically
  • using hardware timer to trigger sleep
  • uses 3 different power modes (connected, sleep, deep sleep)
  • packing multiple measurements into a single publish update + rate limiting the publishes
  • Serial1 read with timeout
  • EEPROM to store persistent state information

A few issues came up during the project:

  • would run for an hour or so, but would eventually get stuck drawing 160 mA and wouldn’t go to sleep (didn’t debug, implemented workaround for now)
    (fixed by using hardware timer to trigger sleep)
    (is reproducible, so I may debug if I have time in the next few days)
  • some publish events just never arrived (~15%)
    (I’m using the SSEClient python lib, any known issues there?)

Features I wish I had:

  • tell the Core not to try to use WiFi when it wakes from deep sleep
    (I’d call sleep as soon as I could, but the initial connection sometimes took a while, so that was just wasted power)
  • SoftwareSerial for debugging
    (Serial1 used by BLE device, couldn’t stay connected to Serial due to frequent deep sleep)
  • Watchdog timer to prevent it getting stuck (workaround implemented, see above)

Code below. Not posted here is the Python data message parser (mostly implemented), which is in the GitHub repo.

Let me know if there are any questions, comments, or feedback!

Firmware:

#include "SparkIntervalTimer/SparkIntervalTimer.h"

// Timing Parameters - affects power consumption
#define DEEP_SLEEP_TIME_SECONDS (50)
#define PUBLISH_EVERY_X_RUNS (10)

#define SCAN_TIME_ALLOWED_SECONDS (10)

// these are safeguards using hardware timer interrupts in case the
// regular sleep mechanism fails for some reason
#define AWAKE_TIME_ALLOWED_SECONDS (15)
#define PUBLISH_TIME_ALLOWED_SECONDS (20)

// Eeprom Slots
#define SLOT_RUN_NUM (0)
#define SLOT_NUM_READINGS (1)
#define SLOT_CURRENT_SECTOR (2)
#define SLOT_SECTOR_POS (3)


#define FLASH_ADDR (0x80000)
#define SECTOR_SIZE (0x1000) // 4 kB
#define BLE_READING_BUFFER_SIZE (16)

#define JBDEBUG (0)

// BLE Defines
#define BUILD_UINT16(loByte, hiByte) ((uint16_t)(((loByte) & 0x00FF) + (((hiByte) & 0x00FF) << 8)))
#define DEVICE_INITIALIZED 0x600
#define DEVICE_DISCOVERY_DONE 0x601
#define DEVICE_INFORMATION 0x60D

// Globals
static uint8_t buf[64];
static char szInfo[63];
static IntervalTimer sleepTimer;
static unsigned long start;
static uint8_t runNum;

void setup() {
    start = millis();
    
    runNum = EEPROM.read(SLOT_RUN_NUM);
    EEPROM.write(SLOT_RUN_NUM, runNum + 1);
    
    // Every X runs, keep WiFi on so we can publish the results; on all other
    // runs, Spark.sleep(seconds) just puts the WiFi module in standby while
    // user code keeps running
    if (runNum % PUBLISH_EVERY_X_RUNS != 0) {
        Spark.sleep(1000); // don't wake up WiFi!
    }
    
    // Backup sleep trigger; hmSec periods are half-millisecond units, so seconds * 1000 * 2
    sleepTimer.begin(sleepCallback, AWAKE_TIME_ALLOWED_SECONDS * 1000 * 2, hmSec);
    
    sFLASH_Init();
    
    Serial1.begin(57600);
    
    // Pull RST on BLE module low so module powers up
    pinMode(D0, OUTPUT);
    digitalWrite(D0, LOW);
    delay(100); //JBTODO: needed?
    
    #if JBDEBUG
    Spark.publish("Setup!", "Hello World");
    delay(1000);
    publishCachedData();
    #endif
    
    // Initialize ble mini
    hci_init();    
}

void loop(){ 
    if (Serial1.available()){
        ble_event_process();
    }
    if ((millis() - start) > (SCAN_TIME_ALLOWED_SECONDS * 1000L)) {
        publishAndSleep();
    }
}

static uint8_t timesCalled = 0;
// Callback for Timer 1
void sleepCallback(void) {
    ++timesCalled;
    // Timer is called immediately on setup, so need to wait for the second call
    if (timesCalled > 1) {
        Spark.sleep(SLEEP_MODE_DEEP, DEEP_SLEEP_TIME_SECONDS);
    }
}

void publishAndSleep()
{
    if (runNum % PUBLISH_EVERY_X_RUNS == 0 && Spark.connected()) {
        publishCachedData();
    }
    Spark.sleep(SLEEP_MODE_DEEP, DEEP_SLEEP_TIME_SECONDS);
}

void publishCachedData()
{
    uint8_t numReadings = EEPROM.read(SLOT_NUM_READINGS);
    //if (numReadings == 0) return; // Uncomment to skip publishing if no readings

    // Update sleep timer backup
    // We go to sleep when finished publishing ... but if we get stuck, this should kick in instead
    sleepTimer.resetPeriod_SIT(PUBLISH_TIME_ALLOWED_SECONDS * 1000 * 2, hmSec);

    // Clear readings
    EEPROM.write(SLOT_NUM_READINGS, 0);

    uint8_t sectorPos = EEPROM.read(SLOT_SECTOR_POS);
    uint8_t currentSector = EEPROM.read(SLOT_CURRENT_SECTOR);
    
    uint8_t buffer[BLE_READING_BUFFER_SIZE*4];
    
    // 256 slots (16 bytes each) per 4 kB sector, used as one large circular buffer
    uint32_t addr = (FLASH_ADDR + (currentSector * SECTOR_SIZE) + (sectorPos * BLE_READING_BUFFER_SIZE) - (numReadings * BLE_READING_BUFFER_SIZE)) % 0x200000; // mod end of external flash
    
    sprintf(szInfo, "Run: %d NumReadings: %d addr %x", runNum, numReadings, addr);
    Spark.publish("device_readings", szInfo);

    uint8_t eventsSent = 0;
    
    for (uint8_t i = 0; i < numReadings; i += 4) {
        if (eventsSent++ % 3 == 0) delay(1500); // delay every 3 msgs
        sFLASH_ReadBuffer(buffer, addr, sizeof(buffer));
        // String containing: beacon id (2 digit) - run number (1 digit) - tx power (3 digit) - rssi (3 digit) - seconds into run (1 digit)
        // Get 4 readings into 1 publish event
        // Burst rate is 4 per second with a max of 60 per minute
        sprintf(szInfo, "%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d-%d", 
            buffer[0]%100,buffer[1]%10,buffer[2],buffer[3],buffer[4]%10,buffer[16]%100,buffer[17]%10,buffer[18],buffer[19],buffer[20]%10,
            buffer[32]%100,buffer[33]%10,buffer[34],buffer[35],buffer[36]%10,buffer[48]%100,buffer[49]%10,buffer[50],buffer[51],buffer[52]%10); 
        addr += BLE_READING_BUFFER_SIZE * 4;
        addr = addr % 0x200000;
        Spark.publish("beacon", szInfo);
    }
    delay(1000);
}

// BLE Event Processing
byte ble_event_process(){
    uint8_t type, event_code, data_len, status1;
    uint16_t event;
    bool timedOut;
    
    type = serialReadWithTimeout(timedOut);
    if (timedOut) return 0x00;
    
    event_code = serialReadWithTimeout(timedOut);
    if (timedOut) return 0x00;
    
    data_len = serialReadWithTimeout(timedOut);
    if (timedOut) return 0x00;
  
    for (int i = 0; i < data_len; i++) {
        buf[i] = serialReadWithTimeout(timedOut);
        if (timedOut) return 0x00;
    }
    
    event = BUILD_UINT16(buf[0], buf[1]);
    status1 = buf[2];
    
    switch(event){
        case DEVICE_INITIALIZED:{
            //Serial.write("DEVICE_INITIALIZED\n");
            #if JBDEBUG
            delay(1000);
            sprintf(szInfo, "%d Discovery Start! millis %ld seconds %ld", runNum, millis(), millis() / 1000L);            
            Spark.publish("time", szInfo);            
            #endif
            hci_start_discovery();
            break;
        }
        case DEVICE_DISCOVERY_DONE:{
            //Serial.write("DEVICE_DISCOVERY_DONE\n");
            #if JBDEBUG
            delay(1000);
            sprintf(szInfo, "%d Discovery Done, sleeping - millis %ld seconds %ld", runNum, millis(), millis() / 1000L);            
            Spark.publish("time", szInfo);   
            #endif
            //publishAndSleep();
            break;
        }
        case DEVICE_INFORMATION:{
            // Get RSSI and Measured Power
            uint8_t rssi = buf[11];
            uint8_t txpower = buf[42];
            uint8_t beaconID = buf[38];
            uint8_t seconds = (millis() / 1000L);
            
            if (beaconID == 0xFF) break; // filter weird beacon readings
            
            logData(beaconID, runNum, txpower, rssi, seconds);
            break;
        }
        default:
        //Serial.write("unknown cmd\n");

        break;
    }

    return 0x01; // event processed (return value is not currently used by loop())
}

void freshStart()
{
    EEPROM.write(SLOT_RUN_NUM, 0);
    EEPROM.write(SLOT_NUM_READINGS, 0);
    EEPROM.write(SLOT_CURRENT_SECTOR, 0);
    EEPROM.write(SLOT_SECTOR_POS, 0);    
}

void logData(uint8_t id, uint8_t run, uint8_t tx, uint8_t rssi, uint8_t seconds)
{
    // Update number of readings
    uint8_t numReadings = EEPROM.read(SLOT_NUM_READINGS);
    EEPROM.write(SLOT_NUM_READINGS, numReadings + 1);
    
    uint8_t sectorPos = EEPROM.read(SLOT_SECTOR_POS);
    uint8_t currentSector = EEPROM.read(SLOT_CURRENT_SECTOR);
    
    if (sectorPos == 0) {
        sFLASH_EraseSector(FLASH_ADDR + currentSector * SECTOR_SIZE);
        delay(20);
    }
    
    uint8_t buffer[BLE_READING_BUFFER_SIZE];
    buffer[0] = id;
    buffer[1] = run;
    buffer[2] = tx;
    buffer[3] = rssi;
    buffer[4] = seconds;
    
    // 256 slots (16 bytes each) per 4 kB sector in this circular buffer
    sFLASH_WriteBuffer(buffer, FLASH_ADDR + (currentSector * SECTOR_SIZE) + (sectorPos * BLE_READING_BUFFER_SIZE), BLE_READING_BUFFER_SIZE);
    
    // Update sectorPos and currentSector if we wrap around
    ++sectorPos;
    EEPROM.write(SLOT_SECTOR_POS, sectorPos);
    if (sectorPos == 0) {
        EEPROM.write(SLOT_CURRENT_SECTOR, currentSector + 1);
    }
}

uint8_t serialReadWithTimeout(bool& timedOut) 
{
    timedOut = false;
    uint8_t timeoutMs = 100;
    unsigned long start = millis();
    while(!Serial1.available()) {
        if (millis() - start > timeoutMs) {
            timedOut = true;
            return 0x00;
        }
    }
    return Serial1.read();
}

// BLE Stuff ...
#define GAP_PROFILE_CENTRAL           0x08
#define KEYLEN                        16

static uint8_t gapCentralRoleTaskId = 0;
static uint8_t  gapCentralRoleIRK[KEYLEN] = {0};
static uint8_t  gapCentralRoleSRK[KEYLEN] = {0};
static uint32_t gapCentralRoleSignCounter = 1;
static uint8_t  gapCentralRoleMaxScanRes = 5;


int hci_init()
{
    return GAP_DeviceInit(gapCentralRoleTaskId, GAP_PROFILE_CENTRAL, gapCentralRoleMaxScanRes, gapCentralRoleIRK, gapCentralRoleSRK, &gapCentralRoleSignCounter);
}

int hci_start_discovery(){
    return GAP_DeviceDiscoveryRequest();
}

// Send initialize HCI command
int GAP_DeviceInit(uint8_t taskID, uint8_t profileRole, uint8_t maxScanResponses, uint8_t *pIRK, uint8_t *pSRK, uint32_t *pSignCounter){
    uint8_t len = 0;
    
    buf[len++] = 0x01;                  // -Type    : 0x01 (Command)
    buf[len++] = 0x00;                  // -Opcode  : 0xFE00 (GAP_DeviceInit)
    buf[len++] = 0xFE;
  
    buf[len++] = 0x26;                  // -Data Length
    buf[len++] = profileRole;           //  Profile Role
    buf[len++] = maxScanResponses;      //  MaxScanRsps
    memcpy(&buf[len], pIRK, 16);        //  IRK
    len += 16;
    memcpy(&buf[len], pSRK, 16);        //  SRK
    len += 16;
    memcpy(&buf[len], pSignCounter, 4); //  SignCounter
    len += 4;

    Serial1.write(buf, len);

    return 1;
}

// Send start discovery request
int GAP_DeviceDiscoveryRequest(){
    uint8_t len = 0;
    
    buf[len++] = 0x01;                 // -Type    : 0x01 (Command)
    buf[len++] = 0x04;                 // -Opcode  : 0xFE04 (GAP_DeviceDiscoveryRequest)
    buf[len++] = 0xFE;
        
    buf[len++] = 0x03;                 // -Data Length
    buf[len++] = 0x03;                 //  Mode
    buf[len++] = 0x01;                 //  ActiveScan
    buf[len++] = 0x00;                 //  WhiteList
  
    Serial1.write(buf, len);
  
    return 1;
}

Python to collect data:

from sseclient import SSEClient

deviceID = "XXX"
accessToken = "XXX"

messages = SSEClient('https://api.spark.io/v1/devices/' + deviceID + '/events/?access_token=' + accessToken)

for msg in messages:
    print(msg)

Great Project @jbennett, Thank you for sharing!

That’s sooooo cool. Too bad the cat doesn’t like to wear it, because a “version 2” could have used a GPS to track his outdoor expeditions as well :smiley:

awesome project, thanks for sharing, and thanks for the great feedback! :slight_smile:

I am trying to build a “local” tracking system, so I was wondering if I can use the current hardware from this project and use the RSSI to determine distance? Basically your phone would have an app or access a webpage that would show the status of the ‘dongles’ (which would be a Spark Core and RedBear BLE unit?), and maybe every so many seconds the Spark Core would read the RSSI and, if the distance is too far, trigger an alert to the phone? Or maybe the SC and RedBear would have to send the RSSI out every so often to the phone and let the phone (or webpage) decide if the distance is too far? Does this make sense to anyone?

As a total beginner I am really having trouble trying to figure out what component would go where. I also wonder if BLE would have enough range for this project and if it would be reliable enough to track a pet in an indoor area of 100 meters or so (like say in a shopping mall, etc).

I am also confused about class 1 and class 2 and can’t seem to figure out if BLE has these classes, or if it is just classic Bluetooth that has the two power classes. It seems like I should have a class 1 device because of the extra range.

Any ideas, suggestions or jokes welcome :wink:

@yugnats, you can calculate the distance from RSSI; I have done the same in my project using Spark Core and iBeacon (https://community.spark.io/t/detect-ibeacons-using-spark-core-and-ble-mini/5554). What you have to do is send the HCI Device Discovery command and wait for Device Information; once the Device Information is received, the 11th byte of the packet will be the RSSI, and we can calculate the distance using the formula described in this link (http://stackoverflow.com/questions/20416218/understanding-ibeacon-distancing). For a full explanation refer to my project link above.

Following code snippet from my project calculates the distance:

// Get RSSI and Measured Power
int rssi = buf[11];
int txpower = buf[42];

// Calculate Distance
// This is based on the algorithm from http://stackoverflow.com/questions/20416218/understanding-ibeacon-distancing
//
double distance = 0.0;
double ratio = (256 - rssi) * 1.0 / (256 - txpower);

if (ratio < 1.0)
    distance = pow(ratio, 10);
else
    distance = (0.89976) * pow(ratio, 7.7095) + 0.111;

Hope this will help you to get started.

Thank you very much, I really appreciate it. If I run into any troubles I suppose I should post on your SC / RedBear project thread.

The issue was that the flash read/write functions (i.e., sFLASH_ReadBuffer() / sFLASH_WriteBuffer()) were causing a crash.

@bko mentions in this thread that wrapping the function calls with delay(20)s improves stability. I haven't had any issues since doing so. I'll update the code in the repo shortly.
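For reference, the workaround looks roughly like this. The wrapper names are mine, not from the repo; the point is just to pad the shared-SPI flash calls with short delays:

// Wrap the external flash calls with short delays so the CC3000 and the
// flash don't fight over the shared SPI bus (workaround until the arbiter lands)
void safeFlashRead(uint8_t *buffer, uint32_t addr, uint16_t len) {
    delay(20);
    sFLASH_ReadBuffer(buffer, addr, len);
    delay(20);
}

void safeFlashWrite(uint8_t *buffer, uint32_t addr, uint16_t len) {
    delay(20);
    sFLASH_WriteBuffer(buffer, addr, len);
    delay(20);
}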

Hi @jbennett

A real fix for the external flash problems is coming soon, in the form of a software arbiter that lets the external flash and the TI CC3000 co-exist better on the shared SPI bus. I think this is currently waiting for some folks to come back from summer holiday, but it is coming.

I think it’s lined up for release next week. @zachary, is that correct?


That is correct. SPI bus arbitration is in core-firmware master now (it was required for deep update) and will be released to the web IDE this coming week, no later than Wednesday evening.

And @jbennett, regarding this:

Also in this week's release will be code for our “controlling the connection” feature, which will make it easy to immediately run setup() and loop() without connecting to the Spark Cloud. All you'll have to do is declare SYSTEM_MODE(SEMI_AUTOMATIC). Then, when you actually want to connect, you call Spark.connect(). Your code can branch around Spark.connected() like this:

if (Spark.connected()) {
  // publish, etc.
}
else {
  // offline work or just fall through waiting for connection
}
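For the data logger above, a minimal sketch of how that might fit together, based purely on the description in this thread (the publishRun flag is a stand-in for the runNum % PUBLISH_EVERY_X_RUNS check):

SYSTEM_MODE(SEMI_AUTOMATIC);  // run setup()/loop() right away, no automatic cloud connection

void setup() {
    // scan beacons and log to flash here with WiFi off

    bool publishRun = true;   // stand-in for (runNum % PUBLISH_EVERY_X_RUNS == 0)
    if (publishRun) {
        Spark.connect();      // only bring up WiFi and the cloud when there is data to upload
    }
}

void loop() {
    if (Spark.connected()) {
        // publish cached readings, then go back to deep sleep
    }
    else {
        // keep scanning or just wait for the connection
    }
}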