Python Programmatic GStreamer Access for Hardware-Accelerated Encoding and Low Latency
-
Hello,
I am using the VOXL2 with the Hadron 640R. I was originally using a standard RB5 with the Hadron 640R development kit, flashed with the FLIR Ubuntu 18 image. To record in Python on my original RB5 implementation, I used a GStreamer record pipeline set up like this:
```python
record_pipeline = (
    f"qtiqmmfsrc name=qmmf ! "
    f"video/x-h264,profile=high,width={self.width},height={self.height} ! h264parse ! "
    f"splitmuxsink max-size-time={segment_ns} muxer=mp4mux location=\"{location_pattern}\""
)
```

It appears that this pipeline does not work on the VOXL2. What pipeline should I use to accomplish this? I would prefer not to use RTSP. Our overall goal is a low-latency stream in Python, which we were able to achieve on a standard RB5.
Thanks in advance!
-
Hi @Vinny, this is part of our Hadron interface project. We're using the commands straight from FLIR, but they appear not to be working on the VOXL2. We've got the camera servers up and running via VOXL2 commands. However, there seems to be a slight variation in the commands and some nuance that we're missing.
Thank you!
Joseph Porter -
Hi @joseph-vale
This is not my wheel house, but maybe @Alex-Kushleyev can give you some pointers.
Thanks!
Vinny -
You did not provide the actual error that you are seeing; however, I can try to guess what it is (even if I guess wrong, the details below should help you anyway). The default build of the voxl-opencv package does not have python3 support. So if you are using GStreamer with OpenCV in Python and that is the error, you should install the voxl-opencv package that I built with python3 support:
- https://storage.googleapis.com/modalai_public/temp/voxl-opencv_4.5.5-3_arm64.deb
- source : https://gitlab.com/voxl-public/voxl-sdk/third-party/voxl-opencv/-/tree/add-python3-bindings
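As a quick sanity check (this is not from the original package instructions, just a convenient verification), you can confirm that the installed cv2 bindings were actually built with GStreamer support before trying the pipelines below:

```python
# Verify that the installed OpenCV python3 bindings were built with GStreamer support.
import cv2

print(cv2.__version__)  # the package linked above should report 4.5.5

# Print the GStreamer-related lines from the build information;
# look for "GStreamer:" followed by "YES" in the Video I/O section.
build_info = cv2.getBuildInformation()
for line in build_info.splitlines():
    if "GStreamer" in line:
        print(line.strip())
```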
Below, I will assume that you want to grab images from the Boson part of the Hadron; however, a similar approach should apply to the Hadron's RGB camera.
First, you should test the ability to grab images without Python. You may need to change the camera number depending on which camera ID your Boson is assigned.
IMPORTANT: make sure that voxl-camera-server is not running while you are trying to use GStreamer:

```
systemctl stop voxl-camera-server
```

Tip: you can actually stream video using X forwarding with ssh. This should stream a live Boson feed from the VOXL2 to your Linux machine:
```
ssh -Y username@<voxl-ip> gst-launch-1.0 qtiqmmfsrc camera=0 ! "video/x-raw,width=640,height=512,framerate=30/1" ! videoconvert ! autovideosink
```

To display the image directly in the terminal as ASCII:
```
gst-launch-1.0 qtiqmmfsrc camera=0 ! "video/x-raw,width=640,height=512,framerate=30/1" ! autovideoconvert ! aasink
```

And finally, a Python script that grabs h264 video from qtiqmmfsrc, decodes it, and returns frames to Python:
```python
import time
import cv2

# get RGB (BGR?) directly
#stream = 'gst-launch-1.0 qtiqmmfsrc camera=0 ! video/x-raw, width=640,height=512,framerate=30/1 ! autovideoconvert ! appsink'

# get h264 -> decode -> BGR
stream = ('gst-launch-1.0 qtiqmmfsrc camera=0 ! '
          'video/x-h264,format=NV12,profile=high,width=640,height=512,framerate=30/1 ! '
          'h264parse ! qtivdec ! qtivtransform ! '
          'video/x-raw,format=BGR,width=640,height=512 ! autovideoconvert ! appsink')
print(stream)

vcap = cv2.VideoCapture(stream, cv2.CAP_GSTREAMER)

frame_cntr = 0
while True:
    ret, frame = vcap.read()
    if ret:
        frame_cntr += 1
        print('got frame %d with dims ' % frame_cntr, frame.shape)
```

Hopefully, that works for you.
Final recommendation: if you use qtiqmmfsrc this way, the Boson data is processed by the Qualcomm ISP, and unless you have a special tuning file for the Boson, the processed output will have degraded quality. The Boson, by default, outputs a post-AGC 8-bit image, which is already processed and does not need to be processed further by the ISP. I am not sure whether you can get RAW8 data (unmodified data from the Boson) out of qtiqmmfsrc.

We handle the above issue in voxl-camera-server by working with the RAW8 data directly. We also recently started experimenting with 14-bit pre-AGC data from the Boson, which would need some processing before it is usable (if you are interested in that, I can share some more information).

Finally, if you would like to use voxl-camera-server, which is what we recommend and support, there is also a way to get encoded h264/h265 data into Python (using our experimental pympa, the Python MPA bindings). That is a topic for discussion in another post, if you are interested.

Alex
-
@Alex-Kushleyev Hi Alex, thanks for the awesome thorough response!
Ok we've got recordings going again. We're doing it through the camera-server. A couple follow up questions:
- How do we calculate the thermal radiometric readings utilizing your method? For FLIR they had equations for high and low gain but this seems different.
- How do we start the two services, "voxl-streamer" and "voxl-camera-server", on startup?
Thanks!
Joseph -
@Vinny no worries and thanks!!
-
To enable voxl-streamer and voxl-camera-server on startup, just use the following commands:

```
systemctl enable voxl-camera-server
systemctl enable voxl-streamer
```
Regarding your question about thermal radiometric readings, I am not sure; can you please elaborate? The default post-AGC 8-bit mode sends a processed monochrome image. The pixel value is related to the temperature, but the image itself does not provide a mapping from pixel value to temperature. Also, not all Boson units support outputting radiometric data.

I don't have much experience with this aspect (and I don't think we have any Bosons with radiometric output capability). Looking at some FLIR help, it seems that you have to use the 16-bit output (well, it's actually 14-bit) and turn on linear T output; then the conversion from the RAW pixel value (16-bit) to degrees is simple: https://flir.custhelp.com/app/answers/detail/a_id/3387/~/flir-oem---boson-video-and-image-capture-using-opencv-16-bit-y16
If this is the case, then here is how this could be tested (high-level steps; don't worry if you don't know how to implement them at this point):
- set up the Boson with the correct configuration (output RAW14, linear T, etc.) using the FLIR SDK (over USB)
- configure the VOXL2 to use the Boson driver that accepts 14-bit data (not 8-bit, which is the default)
- voxl-camera-server will publish the unmodified RAW16 images to an MPA pipe
- a client application can receive the RAW16 frames, apply the temperature conversion, and publish an image that reflects a chosen temperature -> color mapping (a rough sketch of this conversion is shown after this list). This image can then be passed to voxl-streamer to be encoded as h264 / h265.
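For illustration only (I have not tested this), the conversion and color mapping in the last step could look roughly like the sketch below. It assumes the centikelvin TLinear convention from the FLIR article (each 16-bit count is 0.01 K); the function names and scale factor are assumptions that should be checked against Radiometry.py and your camera's gain / TLinear resolution settings, since other modes use a different scale.

```python
import numpy as np
import cv2

# Minimal, untested sketch: convert a RAW16 Boson frame (TLinear enabled) to
# degrees Celsius, assuming each count represents 0.01 K (centikelvin).
# The scale factor is an assumption -- verify it against the FLIR reference code.
def raw16_to_celsius(frame_raw16, counts_per_kelvin=100.0):
    kelvin = frame_raw16.astype(np.float32) / counts_per_kelvin
    return kelvin - 273.15

# Map a chosen temperature range to an 8-bit false-color image; this is the kind
# of image that could then be fed to voxl-streamer for h264/h265 encoding.
def celsius_to_colormap(frame_c, t_min=0.0, t_max=40.0):
    norm = np.clip((frame_c - t_min) / (t_max - t_min), 0.0, 1.0)
    return cv2.applyColorMap((norm * 255).astype(np.uint8), cv2.COLORMAP_INFERNO)

# Example with a synthetic frame (a real frame would come from the RAW16 MPA pipe):
fake_raw = np.full((512, 640), 29515, dtype=np.uint16)   # ~295.15 K, i.e. about 22 C
temps_c = raw16_to_celsius(fake_raw)
print('min %.2f C, max %.2f C' % (temps_c.min(), temps_c.max()))
colored = celsius_to_colormap(temps_c)
```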
I have not actually tried that script (at the bottom of that help article) -- I wonder what would happen if I used it with a Boson that does not support radiometric output. Do you know?
I can help set this up if I can test it using a non-radiometric Boson. The conversion seems straightforward; I could potentially add support for this directly into voxl-camera-server.

Alex
-
@Alex-Kushleyev this is awesome, thank you. We're using the FLIR 640R+ sensor that came out not too long ago, which has 20 mK sensitivity. I will try those methods and get back to you on what works! It would be super cool to get this directly into the camera-server as a stream. We are using it to help guide animal localization on our aerial survey system.
Thanks,
Joseph -
Hi @joseph-vale, I tested the Python script from the FLIR help site (Radiometry.py). I just had to modify it to use the correct USB and video devices. The script ran fine, but since my Boson does not support radiometric output, the reported temperature was about 70 degrees colder than it should be (it reported -50C at room temperature). Are you able to get correct temperatures with your device using this script?

As I mentioned before, there is a way of getting the image data from voxl-camera-server into Python. I think it would be interesting to try running the exact same conversion and annotation code from the FLIR example. This would let you first check the temperatures over a USB connection and then check them again through the VOXL2 pipeline.

I am going to set up an example that uses pympa (the Python wrapper for MPA) to get the 16-bit data from the Boson, plot it, and convert it to temperature using the reference code.

Alex