Great tutorial for an object detection application

After receiving the Tachyon, I had difficulties getting started. There was only the audio module, and it was not clear to me what to try or which sensors to use.

I saw a great tutorial for setting up an object detection solution that uses the Tachyon’s AI accelerator. It works with a plain USB webcam. The sample solution is trained to detect coloured blocks, but it can be trained on something else as well.

This looked feasible to me. I ordered a Logitech C270 webcam and managed to get everything working. No real coding is required for this solution, but you have to be familiar with some machine learning concepts (or just replicate what the author did). I built the solution on my Tachyon running Ubuntu 20.04.6 LTS, which required a bit of tweaking at the OS level.

It is amazing to see how Edge Impulse can be used to set up, train and build a model, with the Tachyon used both for capturing samples and for running the optimized model. I was very happy to have a sample application that works end to end, and to see how easy this is once the basic infrastructure talks to Edge Impulse in the cloud.

My model’s performance is crappy at the moment. I know why, and I am glad that I am already at the stage of improving it.


Thanks for posting this. One thing I want to do with mine is run Frigate on it and use the AI accelerator for object detection for the NVR. I'll have to see if I can get it to work with Frigate once I get Ubuntu 24 flashed to it.

Thanks, @ErwinE, for sharing my project! I recently published a robotics project using the Tachyon.


That’s a pretty cool project, Naveen!!


Thank you, Gustavo!

Cool new project. I liked the object detection post because it builds a working solution in a step-by-step tutorial.


Another great learning experience for me was getting near real-time object detection working.

It uses a YOLOv8 model compiled to TFLite for the Tachyon using qai-hub. For inference, a Python script running on the Tachyon calls the TFLite model directly and processes the output. Exactly what I wanted.

This post describes it as part of a test of the Tachyon: Particle Tachyon Review - A Qualcomm QCM6490 Edge AI and 5G cellular SBC tested with Ubuntu - CNX Software
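In case it helps anyone, here is a minimal sketch of what such an inference script can look like. It is not my exact code: it assumes tflite-runtime is installed, a YOLOv8 model exported with the usual (1, 84, 8400) output layout, and the model path and confidence threshold below are just placeholders.

```python
# Minimal sketch: run a TFLite YOLOv8 model and decode its detections.
# Assumes the standard Ultralytics export layout (1, 84, 8400):
# 4 box values (cx, cy, w, h) followed by 80 class scores per candidate.
import numpy as np
from tflite_runtime.interpreter import Interpreter  # pip3 install tflite-runtime

MODEL_PATH = "yolov8n.tflite"  # placeholder path
CONF_THRESHOLD = 0.5           # placeholder threshold

interpreter = Interpreter(model_path=MODEL_PATH)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def detect(image):
    """image: float32 array shaped like inp['shape'], e.g. (1, 640, 640, 3), scaled 0..1."""
    interpreter.set_tensor(inp["index"], image)
    interpreter.invoke()
    preds = interpreter.get_tensor(out["index"])[0]  # (84, 8400)
    boxes, scores = preds[:4].T, preds[4:].T         # (8400, 4), (8400, 80)
    class_ids = scores.argmax(axis=1)
    confidences = scores.max(axis=1)
    keep = confidences > CONF_THRESHOLD
    # Boxes are (cx, cy, w, h) in model-input scale; NMS is still needed afterwards.
    return boxes[keep], class_ids[keep], confidences[keep]
```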

Seems like the AI Engine Direct module isn't available on the website anymore. Do we install the entire SDK?

For the inference tutorial written by Naveen, I installed the Edge Impulse CLI on the Tachyon. The installer also takes care of the prerequisites. This is enough to start training the model. For this, start edge-impulse-linux on the Tachyon.

Once there is a model file (.eim), it must be downloaded or copied to the Tachyon; it can then be run with the command edge-impulse-linux-runner --model-file mymodel-linux-aarch64-qnn-v1.eim.
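If you would rather drive the .eim file from your own script instead of the CLI runner, the Edge Impulse Linux Python SDK can load it directly. A minimal sketch, assuming edge_impulse_linux and opencv-python are pip-installed; the model filename is the one from above, and the camera index is a placeholder:

```python
# Minimal sketch: run an Edge Impulse .eim object detection model from Python.
# Assumes: pip3 install edge_impulse_linux opencv-python
import cv2
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_FILE = "mymodel-linux-aarch64-qnn-v1.eim"  # model file from the step above
CAMERA_INDEX = 0                                  # placeholder: USB webcam

with ImageImpulseRunner(MODEL_FILE) as runner:
    model_info = runner.init()
    print("Loaded", model_info["project"]["name"])

    cap = cv2.VideoCapture(CAMERA_INDEX)
    ret, frame = cap.read()
    cap.release()
    if not ret:
        raise RuntimeError("Could not read a frame from the webcam")

    # The SDK expects an RGB image and handles resizing/feature extraction.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    features, cropped = runner.get_features_from_image(rgb)
    result = runner.classify(features)

    for bb in result["result"].get("bounding_boxes", []):
        print(f"{bb['label']} ({bb['value']:.2f}) at "
              f"x={bb['x']} y={bb['y']} w={bb['width']} h={bb['height']}")
```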


Ok, thanks, I'll give that a shot.

I'm actually trying to get the Tachyon to work with Frigate, but it seems they have no support for Qualcomm hardware acceleration or object detection. I'd like to try to get Frigate to use the AI accelerator in the Tachyon, but I'm not sure how that is going to work out. Any tips on setting up ffmpeg with the Tachyon?