# Spark + Pixy
Hi everyone! I wanted to take a few minutes to make a post about a project I’ve been working on. I was able to get my hands on a Pixy, which is an awesome image processing sensor that serves as a co-processor for other microcontrollers. It allows microcontrollers like the Spark Core to effectively track colored objects in real time (along with some other cool features) with very little processor overhead. The Pixy does all the image “hard work,” which leaves the Spark Core free to do other things like turn a servo or announce the presence of a yellow car in the driveway!
The goal: port the Pixy Arduino library to the Spark Core, then write a few examples to showcase the library.
## Connecting the Spark Core to the Pixy
Connecting the Spark Core to the Pixy over SPI is rather simple. Using the cable provided with the Pixy, put a jumper between the connector (shown below) and the Spark Core (shown on the table).
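For reference, the SPI lines pair up as in the sketch below. These pin numbers come from the publicly documented Pixy I/O connector and Spark Core pinouts, not from this post's photos, so double-check them against your own hardware before powering up.

```cpp
// Assumed Pixy-to-Spark-Core SPI wiring (verify against the pinout docs):
//   Pixy pin 1 (MISO) -> Spark Core A4 (MISO)
//   Pixy pin 3 (SCK)  -> Spark Core A3 (SCK)
//   Pixy pin 4 (MOSI) -> Spark Core A5 (MOSI)
//   Pixy pin 6 (GND)  -> Spark Core GND
```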
## Porting the Library
This SPI version of the library was quite easy to port: nothing had to be changed, and it worked right out of the box. All it needs is access to the SPI communication channel on the Spark Core, and that’s obtained with `#include "SPI.h"`, which is the same as in the Arduino IDE, so it’s already there.
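For the curious, what the library actually pulls over SPI is a stream of "blocks": each detected object arrives as a sync word followed by five 16-bit fields and their checksum. The sketch below illustrates that framing and the checksum test the library applies before handing a block to your code; the struct and function names are mine, not the library's actual internals.

```cpp
#include <cstdint>

// One detected object ("block") as reported by the Pixy over SPI,
// per the published Pixy serial protocol. Each field is a 16-bit word.
struct PixyBlock {
    uint16_t checksum;   // sum of the five fields below, modulo 2^16
    uint16_t signature;  // color signature number the object matched
    uint16_t x, y;       // center of the object in image coordinates
    uint16_t width, height;
};

// Validate a received block against its checksum, as the library does
// before exposing the block to the application.
bool blockIsValid(const PixyBlock& b) {
    uint16_t sum = b.signature + b.x + b.y + b.width + b.height;
    return sum == b.checksum;
}
```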
I have not yet had time to look at the other communication protocols but from a quick glance they look pretty straightforward.
## Showcasing the Library
I wanted to write a few examples to show that the Spark Core could both communicate with the Pixy and do things with the data returned. I decided to write three examples. The first is a “hello world” example, ported almost directly from the “hello world” example in the Arduino library; it simply prints out all object/block information from the Pixy. The second and third examples both use a single servo to pan the Pixy toward a detected object. These were fun to write because I got to build the servo pan mechanism and watch it move and track the object. In the end they both worked fairly well. A picture of the Pixy + servo setup can be seen below.
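The core idea behind the pan examples is simple proportional control: each frame, nudge the servo by a fraction of how far the object's x coordinate is from the center of the image (Pixy frames are 320 pixels wide, so center is x = 160). This is a minimal sketch of that update step, not the examples' actual code; the function name, gain, and sign convention are illustrative and would need tuning for a real servo.

```cpp
#include <algorithm>

const int IMAGE_CENTER_X = 160;  // Pixy reports x in the range 0-319

// Proportional pan update: move the servo position by a fraction of the
// error between the object's x and the image center, clamped to the
// usual 0-180 degree servo range. The sign of the correction depends on
// which way the servo is mounted; the gain (1/4) is tuned by eye.
int panUpdate(int servoPos, int objectX) {
    int error = objectX - IMAGE_CENTER_X;
    int newPos = servoPos - error / 4;
    return std::min(180, std::max(0, newPos));
}
```

With the object centered the position is unchanged; as the object drifts off-center, the correction grows, which is what makes the tracking settle instead of oscillating wildly.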
## Source / GitHub
You can view the source on GitHub here.
A YouTube video of the first tracking example can be found here.