Camera problems

@kmparticle
Hi Kent,

Thank you for the update.

Well, at least I do not feel quite so stupid anymore after reading this thread. I read the Tachyon promo and bought a RaspiCam with a giant lens and all, only to find out it does not work and is not on the list. So I grabbed what was marked on the list, the IMX519.

Thank you, everyone here in the thread, for providing enough data to at least get an out-of-focus picture.

Before diving in here I actually spent several hours (or more) trying to cross-compile a driver. Realizing that I would never match all the dependencies and the architecture Tachyon uses in its kernel, I switched to compiling directly on the Tachyon.

Can someone point me in the right direction for the sources? They are not part of the image and I can’t seem to locate them anywhere.

Using the test commands in this thread, the pipeline seems to run forever until interrupted manually; everything I have tried seems to fail (just mentioning).

As my camera is mounted upside down, I currently have this, producing a single out-of-focus picture:

gst-launch-1.0 -e qtiqmmfsrc camera=0 name=qmmf \
! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 \
! videoconvert ! videoflip method=rotate-180 \
! jpegenc \
! multifilesink location=/home/particle/snaprotate.jpg
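One way to avoid babysitting the pipeline (a sketch, untested on this device): wrap the same command in coreutils `timeout(1)` and have it deliver SIGINT, so `-e` still triggers the EOS-on-shutdown handling and the final JPEG gets flushed, just as the logs later in this thread show for a manual Ctrl-C.

```shell
# Sketch only: bound the run with timeout(1) instead of a manual Ctrl-C.
# -s INT makes timeout send SIGINT, so gst-launch's -e (EOS-on-shutdown)
# path still runs. The 3s duration is a guess; tune it for warm-up time.
timeout -s INT 3s gst-launch-1.0 -e qtiqmmfsrc camera=0 name=qmmf \
  ! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 \
  ! videoconvert ! videoflip method=rotate-180 \
  ! jpegenc \
  ! multifilesink location=/home/particle/snaprotate.jpg
```

Note that `timeout` exits with status 124 when the duration elapses, so a wrapper script can tell "stopped by timeout" apart from a pipeline error.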

A question on the side: is there any camera at all out there (other than USB devices) that will work with Tachyon?

For anyone who lands here: check the “Expected availability of Ubuntu 24 build” thread. Someone is working on integrating a working driver for the camera.

OK, so I got the new version of the Ubuntu 20 image (172), which to my understanding should have all the needed drivers in it. However, the cam is still delivering unusable output.

gst-launch-1.0 -e \
qtiqmmfsrc camera=0 num-buffers=1 \
control-mode=3 focus-mode=continuous \
! video/x-raw,format=NV12,width=1920,height=1080 \
! jpegenc \
! filesink location=/home/particle/daylight.jpg

I get no errors when requesting focus, but focusing is not happening either. With no response, the whole image just turns dark and is still out of focus.

So am I going about this the wrong way? Is anyone out there getting sharp, bright images?

Hi Steffen. Could you add --gst-debug=qtiqmmfsrc:LOG to your command and share the output?

Sorry, I am not able to attach a file here, I guess: root@tachyon-1a838cd9:~# gst-launch-1.0 -e --gst-debug=qtiqmmfsrc:LOG qtiqmmfsrc camera=0 num-buffers=1 control-mode=3 focus-mode=continuous ! video/x-raw,format=NV12,width=1920,height=1080 ! jpegenc ! filesink location=/home/particle/daylight.jpg
0:00:00.058196597 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:1664:qmmfsrc_init:GstQmmfSrc@0x5597309080 Initializing
gbm_create_device(192): Info: backend name is: msm_drm
0:00:00.189780020 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source_context.cc:1099:gst_qmmf_context_new: Created QMMF context: 0x559730a090
0:00:00.190015019 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190110852 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190205540 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190319185 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190411372 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190497466 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190579966 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190683507 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190768403 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190856007 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.190966111 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.191090590 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.191202360 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:332:get_vendor_tag_by_name: Failed to retrieve Global Vendor Tag Descriptor!
0:00:00.206447172 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:403:qmmfsrc_request_pad: Requesting video pad video_0 (0)
0:00:00.206728005 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source_video_pad.c:570:qmmfsrc_video_pad_flush_buffers_queue:<'':video_0> Flushing buffer queue: FALSE
0:00:00.207150608 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:324:video_pad_activate_mode:<'':video_0> Video Pad (0) mode: ACTIVE
0:00:00.207198889 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:430:qmmfsrc_request_pad: Created pad with index 0
0:00:00.207300712 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:191:video_pad_query:qmmfsrc0:video_0 Received QUERY caps
0:00:00.207402587 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:199:video_pad_query:qmmfsrc0:video_0 Template caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
0:00:00.207505191 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:202:video_pad_query:qmmfsrc0:video_0 Filter caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
0:00:00.207580034 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:247:video_pad_event:qmmfsrc0:video_0 Received EVENT reconfigure
0:00:00.207650138 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:191:video_pad_query:qmmfsrc0:video_0 Received QUERY caps
0:00:00.207745607 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:199:video_pad_query:qmmfsrc0:video_0 Template caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
0:00:00.207842325 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:202:video_pad_query:qmmfsrc0:video_0 Filter caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
0:00:00.207997221 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:191:video_pad_query:qmmfsrc0:video_0 Received QUERY caps
0:00:00.208076544 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:199:video_pad_query:qmmfsrc0:video_0 Template caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
0:00:00.208172221 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:202:video_pad_query:qmmfsrc0:video_0 Filter caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
Setting pipeline to PAUSED ...
0:00:01.238669768 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:191:video_pad_query:qmmfsrc0:video_0 Received QUERY caps
0:00:01.238773934 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:199:video_pad_query:qmmfsrc0:video_0 Template caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
0:00:01.238838517 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:202:video_pad_query:qmmfsrc0:video_0 Filter caps: image/jpeg, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw, format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-raw(memory:GBM), format=(string){ NV12, NV16, P010_10LE, NV12_10LE32 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]; video/x-bayer, format=(string){ bggr, rggb, gbrg, grbg, mono }, bpp=(string){ 8, 10, 12, 16 }, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 240/1 ]
0:00:01.238979767 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:487:qmmfsrc_video_pad_fixate_caps:qmmfsrc0:video_0 Trying to fixate caps: video/x-raw, format=(string)NV12, width=(int)1920, height=(int)1080, framerate=(fraction)[ 0/1, 240/1 ]
0:00:01.239018413 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:514:qmmfsrc_video_pad_fixate_caps:qmmfsrc0:video_0 Framerate not set, using default value: 30/1
0:00:01.239041381 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:156:video_pad_send_stream_start:qmmfsrc0:video_0 Pushing STREAM_START
0:00:01.239497839 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:561:qmmfsrc_video_pad_fixate_caps:qmmfsrc0:video_0 Caps fixated to: video/x-raw, format=(string)NV12, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
0:00:01.239563464 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source.c:294:qmmfsrc_pad_reconfigure: Reconfiguration for pad video_0 in READY state
0:00:01.239589141 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source.c:298:qmmfsrc_pad_reconfigure: Reconfigure video pad
0:00:01.243423456 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:339:get_vendor_tag_by_name: Unable to locate tag for 'c2dCropX', section 'org.codeaurora.qcamera3.c2dCropParam'!
0:00:01.243472466 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:1413:gst_qmmf_context_create_video_stream: Failed to update X axis crop value
0:00:01.243500643 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:339:get_vendor_tag_by_name: Unable to locate tag for 'c2dCropY', section 'org.codeaurora.qcamera3.c2dCropParam'!
0:00:01.243525435 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:1418:gst_qmmf_context_create_video_stream: Failed to update Y axis crop value
0:00:01.243549705 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:339:get_vendor_tag_by_name: Unable to locate tag for 'c2dCropWidth', section 'org.codeaurora.qcamera3.c2dCropParam'!
0:00:01.243574653 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:1423:gst_qmmf_context_create_video_stream: Failed to update crop width
0:00:01.243598924 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:339:get_vendor_tag_by_name: Unable to locate tag for 'c2dCropHeight', section 'org.codeaurora.qcamera3.c2dCropParam'!
0:00:01.243621684 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:1428:gst_qmmf_context_create_video_stream: Failed to update crop height
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
0:00:01.244660953 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:270:qmmfsrc_pad_flush_buffers: Flush pad: video_0
0:00:01.244676630 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source_video_pad.c:570:qmmfsrc_video_pad_flush_buffers_queue:qmmfsrc0:video_0 Flushing buffer queue: FALSE
0:00:01.245270952 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:339:get_vendor_tag_by_name: Unable to locate tag for 'mode', section 'org.codeaurora.qcamera3.ir_led'!
0:00:01.245316889 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:339:get_vendor_tag_by_name: Unable to locate tag for 'masterExpTime', section 'org.codeaurora.qcamera3.multicam_exptime'!
0:00:01.245340952 7086 0x55972e2730 WARN qtiqmmfsrc qmmf_source_context.cc:339:get_vendor_tag_by_name: Unable to locate tag for 'slaveExpTime', section 'org.codeaurora.qcamera3.multicam_exptime'!
New clock: GstSystemClock
0:00:01.607648485 7086 0x5597320d80 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:191:video_pad_query:qmmfsrc0:video_0 Received QUERY latency
0:00:01.607713745 7086 0x5597320d80 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:225:video_pad_query:qmmfsrc0:video_0 Latency 0:00:00.033333333/99:99:99.999999999
0:00:01.607762130 7086 0x5597320d80 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:247:video_pad_event:qmmfsrc0:video_0 Received EVENT latency
^Chandling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
0:00:18.325386350 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:934:qmmfsrc_send_event: Event: eos
0:00:18.325573277 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:953:qmmfsrc_send_event: Pushing EOS event downstream
0:00:18.325660568 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:250:qmmfsrc_pad_push_event: Event: eos
0:00:18.325985880 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:270:qmmfsrc_pad_flush_buffers: Flush pad: video_0
0:00:18.326066870 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source_video_pad.c:570:qmmfsrc_video_pad_flush_buffers_queue:qmmfsrc0:video_0 Flushing buffer queue: TRUE
Waiting for EOS...
0:00:18.326221922 7086 0x55970a0aa0 INFO qtiqmmfsrc qmmf_source_video_pad.c:181:video_pad_worker_task:qmmfsrc0:video_0 Pause video pad worker thread
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 0:00:16.946837341
Setting pipeline to NULL ...
0:00:18.330328477 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source_video_pad.c:570:qmmfsrc_video_pad_flush_buffers_queue:qmmfsrc0:video_0 Flushing buffer queue: TRUE
0:00:18.331683840 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source_video_pad.c:324:video_pad_activate_mode:qmmfsrc0:video_0 Video Pad (0) mode: STOPED
0:00:18.331913214 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:270:qmmfsrc_pad_flush_buffers: Flush pad: video_0
0:00:18.332021131 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source_video_pad.c:570:qmmfsrc_video_pad_flush_buffers_queue:qmmfsrc0:video_0 Flushing buffer queue: TRUE
Freeing pipeline ...
0:00:18.813841051 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:479:qmmfsrc_release_pad: Releasing video pad 0
0:00:18.813980738 7086 0x55972e2730 DEBUG qtiqmmfsrc qmmf_source.c:510:qmmfsrc_release_pad: Deleted pad 0
0:00:18.818583335 7086 0x55972e2730 INFO qtiqmmfsrc qmmf_source_context.cc:1115:gst_qmmf_context_free: Destroyed QMMF context: 0x559730a090
root@tachyon-1a838cd9:~#

Yeah - I can’t get the camera to work on the newer 20.04 build!
Focus still does not work and the image always seems dull, so I'm not sure what's going on there!
This is my script so far… I've tried many combinations of options.

#!/bin/bash
# The IMX519 sensor natively supports:

# 4656x3496 (16MP full resolution)
# 3840x2160 (4K UHD)
# 1920x1080 (Full HD)
# 1280x720 (HD)
# 640x480 (VGA)

export XDG_RUNTIME_DIR=/run/user/root
rm -f photo*
timeout 2s gst-launch-1.0 -e qtiqmmfsrc camera=0 num-buffers=1 control-mode=0 focus-mode=auto ! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! jpegenc ! filesink location=/home/particle/photo.jpg
echo "Photo captured successfully! ($?)"

Hi, it is quite frustrating to have a camera project but no working camera. Can we get an update? Ubuntu old or new, I do not care; I would just like to use my Tachyon with a camera.

@kmparticle - can we get an update on this topic?

Hi @Steffen. We do have a fix for the autofocus that we have been testing. It looks promising, and the patch should be included in the next software release. Our current estimate for the release is the end of this week or early next week as we are trying to include other updates and fixes as well. Thank you for your patience!

If you remember at the time, please mention it here when it is out. I will jump on it as fast as I can. Thank you.

Good news - can't wait. Is the fix in the 24.04 release also?

Is there any more news/dates on the next release (20.04/24.04), please?

Hey @Perky, I got Ubuntu 20 v176 from @kmparticle to test. Sorry that I only just got around to testing, but for what it is worth, I got an IN FOCUS pic of my ugly face. I asked Ken to post it here (not sure if I am allowed or supposed to). Will now get going and tune those settings - can’t wait for the weekend, when there is ample time for this stuff.

Doing some more testing with the Ubuntu 20 headless build from @kmparticle. Like I said, focus now works, but the current QMMF / GStreamer integration on Tachyon does not properly support
num-buffers=1, eos-after=1, or clean EOS signaling.

This results in pipelines that hang or generate 0-byte images.

Below is a fully working workaround for this version that reliably produces a single, focused image:

#!/bin/bash

TMPDIR=/home/particle
OUT=$TMPDIR/snapshot.jpg

# Run the pipeline in the background, writing every frame to a numbered file.
gst-launch-1.0 qtiqmmfsrc camera=0 control-mode=3 focus-mode=continuous \
! video/x-raw,format=NV12,width=1920,height=1080 \
! videoconvert ! jpegenc \
! multifilesink location="$TMPDIR/tmp-%03d.jpg" &
PID=$!

# Give the sensor/ISP ~500 ms to warm up and focus, then stop the pipeline
# and wait for it to actually exit before touching the files.
sleep 0.5
kill $PID
wait $PID 2>/dev/null

# Keep the newest (warmed-up, focused) frame and discard the rest.
LATEST=$(ls -1 "$TMPDIR"/tmp-*.jpg | tail -n 1)
cp "$LATEST" "$OUT"

rm -f "$TMPDIR"/tmp-*.jpg
echo "Saved: $OUT"

This lets the sensor/ISP warm up for ~500 ms, then extracts the newest frame, resulting in a sharp, correct, properly exposed JPEG.
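If the fixed 500 ms sleep turns out to be fragile (slow boots, cold sensor), one variant is to poll for the first non-empty frame file instead of sleeping blind. This is an untested sketch; `wait_for_frame` is a hypothetical helper name, not part of GStreamer or any tool on the device:

```shell
# Sketch: poll until a non-empty frame file matching $1 exists, for at
# most $2 seconds (default 5). Returns 0 on success, 1 on timeout.
# Empty (0-byte) JPEGs, which this thread has seen, are skipped.
wait_for_frame() {
  local pattern=$1 deadline=$((SECONDS + ${2:-5}))
  while [ "$SECONDS" -lt "$deadline" ]; do
    for f in $pattern; do
      # -s: file exists and has a size greater than zero
      [ -s "$f" ] && return 0
    done
    sleep 0.1
  done
  return 1
}
```

Called between starting the background pipeline and `kill $PID`, e.g. `wait_for_frame "$TMPDIR/tmp-*.jpg" 5`; a non-zero return means no usable frame appeared within the timeout, so the script can bail out instead of copying a 0-byte file.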

Hi @Steffen, appreciate you testing this stuff out!

Just to clarify, you are talking about the -e / --eos-on-shutdown option documented here? I have noticed that killing the pipeline with Ctrl-C can lead to incomplete / 0-byte files, as you pointed out. I have been using -e but haven't tested enough to confirm that it still has issues.

I thought I would also share a programmatic approach to using GStreamer with Python. There is a good example of how to run a pipeline in Python here; it looks like it handles the Ctrl-C / EOS behavior better.

It needs some tweaking to run on Tachyon, namely changing the encoder from v4l2h264enc to qtic2venc, but I had success running it.
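For quick experiments without Python, the element chain that script builds corresponds roughly to this gst-launch line. This is a sketch assembled from the element names in the script below (qtiqmmfsrc, qtic2venc, h264parse, mp4mux); it has not been verified on-device:

```shell
# Sketch: the Python script's recording pipeline as a gst-launch command.
# Element order mirrors the script: camsrc ! caps ! encoder ! parser !
# queue ! mux ! queue ! sink. Untested here; run on the Tachyon itself.
PIPELINE='qtiqmmfsrc camera=0 name=camsrc
  ! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1
  ! qtic2venc ! h264parse ! queue ! mp4mux ! queue
  ! filesink location=/home/particle/recording.mp4'
# On the device: gst-launch-1.0 -e $PIPELINE
echo "$PIPELINE"
```

Using -e matters here: mp4mux needs a clean EOS to write the file's moov atom, otherwise the MP4 comes out unplayable.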

particle@tachyon-c9200f54:~/camera$ ./gst-camera-encode.py --help
usage: gst-camera-encode.py
       [-h]
       [-c {0,1}]
       [-cw WIDTH]
       [-ch HEIGHT]
       [-cf FRAMERATE]
       [--output OUTPUT]

This app sets up GStreamer pipeline for video recording.
Initializes and links elements for capturing live stream from camera
and saving the encoded video as OUTPUT.

optional arguments:
  -h, --help
    show this help message and exit
  -c {0,1}, --camera {0,1}
    Select (0) for Primary Camera and (1) for Secondary Camera. (default: 0)
  -cw WIDTH, --width WIDTH
    Camera Output Width (default: 1920)
  -ch HEIGHT, --height HEIGHT
    Camera Output Height (default: 1080)
  -cf FRAMERATE, --framerate FRAMERATE
    Camera Output Framerate (fraction) (default: 30/1)
  --output OUTPUT
    Output File Path (default: /home/particle/recording.mp4)

I haven't tried out a lot of the parameters, but the script is worth tinkering with.

#!/usr/bin/env python3

################################################################################
# Copyright (c) 2024-2025 Qualcomm Innovation Center, Inc. All rights reserved.
# SPDX-License-Identifier: BSD-3-Clause-Clear
################################################################################

import os
import sys
import signal
import argparse


# sudo apt install python3-gi gir1.2-gstreamer-1.0
import gi
gi.require_version('Gst', '1.0')
gi.require_version("GLib", "2.0")
from gi.repository import Gst, GLib, GObject

# Constants
DESCRIPTION = """
This app sets up GStreamer pipeline for video recording.
Initializes and links elements for capturing live stream from camera
and saving the encoded video as OUTPUT.
"""
DEFAULT_OUTPUT_FILE = "/home/particle/recording.mp4"

waiting_for_eos = False
eos_received = False
def handle_interrupt_signal(pipeline, mloop):
    """Handle Ctrl+C."""
    global waiting_for_eos

    _, state, _ = pipeline.get_state(Gst.CLOCK_TIME_NONE)
    if state != Gst.State.PLAYING or waiting_for_eos:
        mloop.quit()
        return GLib.SOURCE_CONTINUE

    event = Gst.Event.new_eos()
    if pipeline.send_event(event):
        print("EoS sent to the pipeline")
        waiting_for_eos = True
    else:
        print("Failed to send EoS event to the pipeline!")
        mloop.quit()
    return GLib.SOURCE_CONTINUE

def handle_bus_message(bus, message, mloop):
    """Handle messages posted on pipeline bus."""
    global eos_received

    if message.type == Gst.MessageType.ERROR:
        error, debug_info = message.parse_error()
        print("ERROR:", message.src.get_name(), " ", error.message)
        if debug_info:
            print("debugging info:", debug_info)
        mloop.quit()
    elif message.type == Gst.MessageType.EOS:
        print("EoS received")
        eos_received = True
        mloop.quit()
    return True

def create_element(factory_name, name):
    """Create a GStreamer element."""
    element = Gst.ElementFactory.make(factory_name, name)
    if not element:
        raise Exception(f"Unable to create element {name}")
    return element

def link_elements(link_order, elements):
    """Link elements in the specified order."""
    src = None  # Initialize src to None at the start of each link_order
    for element in link_order:
        dest = elements[element]
        if src and not src.link(dest):
            raise Exception(
                f"Unable to link element {src.get_name()} to "
                f"{dest.get_name()}"
            )
        src = dest  # Update src to the current dest for the next iteration

def parse_arguments():
    """Parse command line arguments."""
    parser = argparse.ArgumentParser(
        description=DESCRIPTION,
        formatter_class=type(
            'CustomFormatter',
            (argparse.ArgumentDefaultsHelpFormatter, argparse.RawTextHelpFormatter),
            {}
        )
    )

    parser.add_argument(
        '-c', '--camera', type=int, choices=[0, 1], default=0,
        help='Select (0) for Primary Camera and (1) for Secondary Camera.'
    )
    parser.add_argument(
        '-cw', '--width', type=int, default=1920,
        help='Camera Output Width'
    )
    parser.add_argument(
        '-ch', '--height', type=int, default=1080,
        help='Camera Output Height'
    )
    parser.add_argument(
        '-cf', '--framerate', type=str, default='30/1',
        help='Camera Output Framerate (fraction)'
    )

    parser.add_argument(
        "--output", type=str, default=DEFAULT_OUTPUT_FILE,
        help="Output File Path"
    )

    return parser.parse_args()

def create_pipeline(pipeline, args):
    """Initialize and link elements for the GStreamer pipeline."""
    # Create elements
    elements = {
        "camsrc" : create_element("qtiqmmfsrc", "camsrc"),
        "camcaps": create_element("capsfilter", "camcaps"),
        "encoder": create_element("qtic2venc", "encoder"),
        "parser" : create_element("h264parse", "parser"),
        "mux"    : create_element("mp4mux", "mux"),
        "sink"   : create_element("filesink", "sink")
    }

    queue_count = 2
    for i in range(queue_count):
        queue_name = f"queue{i}"
        elements[queue_name] = create_element("queue", queue_name)

    # Set properties
    elements["camsrc"].set_property("camera", args.camera)
    elements["camcaps"].set_property(
        "caps", Gst.Caps.from_string(
            "video/x-raw,format=NV12,"
            f"width={args.width},height={args.height},"
            f"framerate={args.framerate}"
        )
    )

    # elements["encoder"].set_property("capture-io-mode", "dmabuf")
    # elements["encoder"].set_property("output-io-mode", "dmabuf-import")

    elements["sink"].set_property("location", args.output)

    # Add elements to the pipeline
    for element in elements.values():
        pipeline.add(element)

    # Link elements
    link_order = [
        "camsrc", "camcaps", "encoder", "parser", "queue0",
        "mux", "queue1", "sink"
    ]
    link_elements(link_order, elements)

def is_linux():
    try:
        with open("/etc/os-release") as f:
            for line in f:
                if "Linux" in line:
                    return True
    except FileNotFoundError:
        return False
    return False

def print_pipeline_structure(pipeline):
    print("Pipeline structure:")
    it = pipeline.iterate_elements()
    while True:
        result, element = it.next()
        if result == Gst.IteratorResult.OK:
            factory = element.get_factory()
            factory_name = factory.get_name() if factory else "no-factory"
            print(f"  Element: {element.get_name()} ({factory_name})")
        elif result == Gst.IteratorResult.DONE:
            break
        elif result == Gst.IteratorResult.RESYNC:
            it.resync()
        else:
            break

def main():
    """Main function to set up and run the GStreamer pipeline."""

    # Set the environment
    # if is_linux():
    #     os.environ["XDG_RUNTIME_DIR"] = "/dev/socket/weston"
    #     os.environ["WAYLAND_DISPLAY"] = "wayland-1"

    os.environ["XDG_RUNTIME_DIR"] = "/run/user/root"

    # Initialize GStreamer
    Gst.init(None)
    # Gst.debug_set_default_threshold(Gst.DebugLevel.INFO)

    mloop = GLib.MainLoop()

    # Parse arguments
    args = parse_arguments()

    # Create the pipeline
    try:
        pipeline = Gst.Pipeline.new("video-recording-pipeline")
        if not pipeline:
            raise Exception(f"Unable to create video recording pipeline")
        create_pipeline(pipeline, args)
        print_pipeline_structure(pipeline)
    except Exception as e:
        print(f"{e} Exiting...")
        return -1

    # Handle Ctrl+C
    interrupt_watch_id = GLib.unix_signal_add(
        GLib.PRIORITY_HIGH, signal.SIGINT, handle_interrupt_signal, pipeline, mloop
    )

    # Wait until error or EOS
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", handle_bus_message, mloop)

    # Start playing
    print("Setting to PLAYING...")
    pipeline.set_state(Gst.State.PLAYING)
    mloop.run()

    GLib.source_remove(interrupt_watch_id)
    bus.remove_signal_watch()
    bus = None

    print("Setting to NULL...")
    pipeline.set_state(Gst.State.NULL)

    mloop = None
    pipeline = None
    Gst.deinit()
    if eos_received:
        print("App execution successful")

if __name__ == "__main__":
    sys.exit(main())

Hi @sbrust

Thanks for following up.

Yes — I am indeed referring to the -e / --eos-on-shutdown option as documented.
And yes — it still has the same issues.

At the moment, qtiqmmfsrc does not emit a proper EOS event, even when:

  • -e is used
  • num-buffers=1 is set
  • a single-frame capture is explicitly requested

The expected behavior would be:
produce one valid frame and then cleanly terminate the pipeline with EOS.

Instead:

  • qtiqmmfsrc ignores num-buffers=1
  • no EOS is sent
  • the GStreamer pipeline hangs indefinitely until an external timeout or a Ctrl-C abort

So unless I'm misunderstanding something, this does not match the documented behavior.

Regarding the Python example you shared:
I appreciate it, but I don't need a video-pipeline framework or async event handling. The Python code does not solve the underlying issue; it only handles EOS if an EOS actually arrives. The root problem is inside qtiqmmfsrc, not in the pipeline wrapper.

Here’s the current situation for single-frame capture on Tachyon:

  • qtiqmmfsrc sends no EOS, regardless of -e
  • num-buffers=1 is ignored
  • eos-after=1 has no visible effect
  • the first several frames are black (warm-up)
  • autofocus also needs warm-up, but is not synchronized with usable frame output
  • therefore, reliable single-shot capture currently requires workaround “hammer” solutions (timeouts + frame discarding)

Once EOS signaling and warm-up synchronization are fixed, single-frame capture should work without hacks.

TL;DR for the dev team

  • qtiqmmfsrc ignores num-buffers=1
  • qtiqmmfsrc does not emit EOS, even with -e
  • eos-after=1 has no effect
  • first frames are always black (warm-up not handled)
  • autofocus warm-up is not synchronized with valid frame output
  • result: single-frame capture is impossible without hacks (timeout + discarding frames)

Fix EOS + warm-up handling → single-frame capture works normally.

Once you have something on this I will be more than glad to test it. Thank you.

Thanks Steffen, I appreciate you answering, but the developers are ignoring me, so I have totally lost interest and am concentrating my efforts elsewhere. Good luck.

Just installed the Ubuntu NA desktop 1.0.180 bundle.
Tachyon booted up fine with WiFi / cellular connectivity, and the Arducam IMX519 camera autofocus works!

What are you doing/using to get a single picture, out of curiosity?