How do I make a Spark toy car “heel” like a dog?

I want to make my working internet-controlled rover follow “something”, like a cell phone, as if it were a dog, but I need a few more ideas on the best method.

I could log the cell phone’s GPS position over the cloud, compare it with my rover’s GPS location, and have the rover turn accordingly, but I want something more responsive. Perhaps communicate the GPS fix over IR, Bluetooth, or WiFi. What would be really cool is if it could follow in real time.

Another idea would be to follow a specific ultrasonic frequency (bad idea: follow an alpha-particle emitter), or the WiFi signal from the cell phone. The rover would only need to get distance and direction from the device, which may need two or three sensors. I want the rover to basically “heel” behind its owner. I haven’t got this far by using complex ideas; I need something simple!

You guys got any other ideas? Be as creative as you can.

Take a look at these things, they’re apparently made for things like this:

Also, doing a quick Google search never hurts :wink:


Wow. That was fast. Great idea. Will look into that.

PS. I Google search for hours every day :>

After a bit of research, this seems a bit finicky for the Spark Core.

You have to constantly sweep the sensor back and forth with a servo to get a digital reading that translates to a direction, without getting any distance-to-object information, so the robot would be CONSTANTLY changing direction. The following page seems much better:

But the guy made the sensors and circuit boards himself. Any suggestions where to get something like this? As soon as I Google “ultrasonic”, I just get distance devices, not generic receivers that output a voltage (or resistance) related to the distance. A device like that would work easily with the Spark Core. I want to follow a sound produced by a cell phone, so I need to stay below about 20 kHz. Any ideas?

I guess I will look into hacking an ultrasonic distance sensor.

I think I have found the answer: the Pixy.

It is a vision color-detection board from Carnegie Mellon. It can give a voltage output for the left-right location of an object, but better than that, over I2C it can give the width of the object, which means the rover can slow down as it gets closer (the width gets bigger). The only problem is that now I need to learn I2C with the Spark, and I wanted to stay away from the whole Grove Seeedstudio Spark shield thing.

Any suggestions for using I2C without the Grove shield?

How about going all out… like the video below :slight_smile: Well, maybe that one is a bit much… but you get the idea! Maybe an XV-11 hack is in order, or maybe a newer, cheaper pulsed-LED version here

@rocksetta, I’ve had a Pixy for a while waiting for my attention. I’ll port the Arduino library for it, which uses I2C, and post it for you :smile:

Any luck porting the library yet? I just ordered my Pixy:

Base (Pan and Tilt) at:

I actually want to put the thing on my rover without the pan-and-tilt base; I don’t think it is needed, since the robot can turn and the camera does not have to tilt to see the source object. But the base was so cheap at $32 USD ($39 CAD) that I thought I had better get it.

@rocksetta, I’ll be looking at it this weekend :stuck_out_tongue_winking_eye:

I think the library works out of the box on the Core… using SPI. I will be trying mine out on Tuesday night when I get home :slight_smile:


@Hootie81, very little needs to be done. I will be testing with I2C so let me know how it goes. :smiley:

@Hootie81. Really nice work. I was waiting for info about I2C, but I think you have covered most of what is needed for SPI. Thank you.

@Hootie81 Hi Chris: I am getting a:

fatal error: SPI.h: No such file or directory

I notice that your github site

also does not have the file. Any hints about what you did? On the Arduino it would be automatically included with the include statement; is that not the case on the Spark?

That’s not actually my GitHub site; it belongs to another member of the awesome Spark community, @harrisonhjones.

I’m pretty sure SPI.h is in the core firmware already and links in automagically, so you may be able to just comment out the line that gives the error. If that doesn’t work, you may need to look at the code and see if it’s in quotes instead of angle brackets, i.e. #include "SPI.h" and not the Arduino way, #include <SPI.h>.

Thanks @Hootie81, blanking out the SPI.h include line lets the code compile (I have to wait for my Pixy to see if it works). I will make a GitHub site if I have any success. So far I have code to turn my rover left if pixy.blocks[0].x < 70, etc., and to speed the rover up if pixy.blocks[0].width < 70. I will have to play around with the number 70. It stops the rover if the width gets too wide, meaning the object the rover follows is very close.

Sounds awesome! I was meant to be home yesterday, but that has been changed to Friday now… so I will have to wait till then to get the Pixy hooked up and tested.

Finally got my Pixy in the mail from Charmed Labs. It works great hooked up to the computer; brightly colored balls work much better than regular objects, and I was a bit surprised that shades of grey do not work (it detects hue, not brightness).

I was surprised how many parts come with the pan/tilt setup. Glad I hopefully will not need it. However, there is a great tutorial on putting it together, and I think it even works without a microcontroller; it can be controlled out of the box by the Pixy camera.

So I am going to start testing with the Spark Core. I will try to get it working, but I would really prefer to light the D7 LED when an object of a certain width is detected, instead of all the serial-print stuff. If anyone has working Pixy code, I would really like to look at it.

Once D7 works, it hopefully will be a small jump to start activating my rover’s turning motor.


I got my Pixy the other day… and then got so excited I stuffed it in a minion :slight_smile: Now I’m waiting for a new Core, some motors, and an accelerometer/gyro to arrive so I can make a Segway-style balancing thingy for the minion to ride :slight_smile:

I haven’t had a chance to look at the code as I’ve been super busy, and I’ve run out of Cores that aren’t tied up in other projects. I hope to test a few things over the next couple of weeks, get started porting balancing-bot code, and make it work with the SPI motor controller I made a while ago.

It’s a bit scary how easy it was to hide the Pixy in the minion; I think they were made for it.

Got my Pixy working on the first flash to the Core (seriously, I can’t even get a web page to work the first time). I will try to make a simple D7 LED lighting program, but this rover thing was kind of cool. Here is the code:

// By Jeremy Ellis.
// Rover should move slowly, unless the Pixy gives no response for several seconds, in which case it should stop.
// It should steer left or right based on where the object is,
// and stop if the object's width gets too big, i.e. it is close.

/* Globals -------------------------------------------------------------------*/
// Note: my Pololu big motor driver (VNH5019) needs two digital pins and one analog PWM pin for the drive motor.

int myForward          = D6;      // set or change these as needed
int myBackward         = D5;      // set or change these as needed
int myDriveMotor       = A0;

// I am using the cheap Pololu motor driver (DRV8835) that only needs one digital and one analog pin for the turning motor.
//    Note: it can drive two motors, but I needed the bigger driver for the 2.3 A drive motor.

int myLeftRight        = D0;      // one pin selects left or right
int myTurnMotor        = A1;      // was A4, but that is used by the Pixy along with A3 and A5

int myD7               = D7;      // to test if WiFi is working
int myLast             = 0;       // for emergency motor stop after 4 s of no activity from the Pixy

// Begin User Defined Defines
#define FRAME_SKIP 25             // How many frames to skip. Lower numbers give faster response (not used below yet)
// End User Defined Defines

//  #include "SPI.h"              // removed: SPI.h is already in the core firmware and this line would not compile
#include "Pixy.h"
#include "TPixy.h"

// Not sure how to flash new code in manual mode, so will not do this yet.
//SYSTEM_MODE(MANUAL);            // We don't need the cloud for this code

Pixy pixy;                        // Create our Pixy object

// Setup - Runs Once @ Startup
void setup() {
    pinMode(myDriveMotor, OUTPUT);
    pinMode(myForward, OUTPUT);
    pinMode(myBackward, OUTPUT);
    pinMode(myTurnMotor, OUTPUT);
    pinMode(myLeftRight, OUTPUT);
    pinMode(myD7, OUTPUT);

    RGB.brightness(1);            // 1 = very dim, 255 = max

    Serial.begin(9600);           // Initialize the USB serial port
    pixy.init();                  // Initialize the Pixy object
}

// Loop - Runs over and over
void loop() {
    if (Time.now() >= myLast + 4) {          // car may be out of control, or WiFi or the Pixy is down
        analogWrite(myDriveMotor, 0);        // shut both motors down
        analogWrite(myTurnMotor,  0);
        RGB.brightness(100);                 // show that nothing has been processed lately
    }

    uint16_t blocks = pixy.getBlocks();      // do the Pixy stuff: grab the number of blocks the Pixy finds

    if (blocks) {
        myLast = Time.now();                 // update the time to show the car is still under Pixy control

        // Deal with all the turning: x = 0-319
        if (pixy.blocks[0].x < 150) {        // object on the left, so the car should turn left
            digitalWrite(myLeftRight, 1);
            analogWrite(myTurnMotor, 250);
        } else if (pixy.blocks[0].x > 190) { // object on the right, so the car should turn right
            digitalWrite(myLeftRight, 0);
            analogWrite(myTurnMotor, 250);
        } else {                             // object in the middle, so just go straight
            analogWrite(myTurnMotor, 0);
        }

        // Deal with the speed of the rover: width = 0-320
        if (pixy.blocks[0].width < 30) {     // object is very far away, so go faster
            digitalWrite(myForward, 1);
            digitalWrite(myBackward, 0);
            analogWrite(myDriveMotor, 250);
        } else if (pixy.blocks[0].width > 70) {  // object is very wide, so it is too close: stop
            analogWrite(myDriveMotor, 0);
        } else {                             // object a good distance away, go a medium speed (0-250)
            digitalWrite(myForward, 1);
            digitalWrite(myBackward, 0);
            analogWrite(myDriveMotor, 100);
        }
    }   // end blocks
}   // end loop

Other than the weird motor-driver stuff, most of this has been copied from the ported Pixy library, except that I omitted the #include "SPI.h" line in the main .ino and in each of Pixy.h and TPixy.h. You can see the GitHub site at

I haven’t done a road test yet, since I have to connect the Pixy to the rover, but the main drive wheels go forward and stop when the object is close, and the turning wheels turn to follow the object. Bright, simply colored objects work best.

I made some good images to go with the GitHub site above:

Generic Spark Core PINS image

Pololu small motor driver

Pololu big motor driver

Pixy serial connection to the spark core

Hope this is helpful
