Recording camera frames to file using MPA
-
Hi, is there any existing or planned support for recording camera frames to a file using MPA, as there was with voxl-rtsp?
-
@czarsimon Are you interested in recording the raw frames coming from the sensor (voxl-camera-server) or the compressed h264 stream coming from voxl-streamer?
-
@Eric-Katzfey Ideally I would like to be able to record 4K video while streaming at a lower resolution to save on bandwidth. To answer your question I suppose it would be the h264 stream. Does either app have the ability to export video to a file? I wasn't able to find any references to it in the documentation or looking through the source. Thanks
-
@czarsimon Unfortunately, voxl-streamer does not currently support that, and we don't have a date when it will be added. You are more than welcome to add the feature to voxl-streamer. It shouldn't be too hard. You just "tee" the stream at some point and send to a "filesink" element.
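For reference, the tee/filesink idea can be sketched as a gst-launch-1.0 pipeline. This is only a sketch, not the voxl-streamer implementation: videotestsrc and x264enc stand in for the real camera source and the VOXL's hardware encoder, and the output path and RTP destination are examples.

```shell
# Sketch: split one encoded stream into a file branch and an RTP branch.
# videotestsrc / x264enc are stand-ins; swap in the real source and encoder.
# Each tee branch needs its own queue, and -e forwards EOS on Ctrl-C so
# mp4mux can finalize the file (otherwise the mp4 is left unplayable).
gst-launch-1.0 -e videotestsrc \
    ! video/x-raw,width=640,height=480,framerate=30/1 \
    ! x264enc ! tee name=t \
    t. ! queue ! h264parse ! mp4mux ! filesink location=/data/recording.mp4 \
    t. ! queue ! rtph264pay ! udpsink host=127.0.0.1 port=5600
```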
-
@Eric-Katzfey please tell me how to record the raw frames coming from the sensor (voxl-camera-server) to an SD card?
-
Raw frames can be captured onboard using voxl-logger. For example,
voxl-logger --cam stereo --samples 60
will record 60 frames in PNG format from the stereo camera. The frames are currently stored under /data/voxl-logger/log<XXXX>/run. You'll need to move them to the SD card.
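Moving a run onto the SD card is a plain copy. The log index (0000 below) and the SD card mount point are assumptions; check /data/voxl-logger/ and the output of mount for the real values on your board.

```shell
# Sketch: copy one voxl-logger run onto the SD card.
# Hypothetical log index and mount point -- adjust both for your system.
SRC=/data/voxl-logger/log0000/run
DEST=/mnt/sdcard/stereo-frames

if [ -d "$SRC" ]; then
    mkdir -p "$DEST"
    cp -r "$SRC"/. "$DEST"/
else
    echo "no such log directory: $SRC" >&2
fi
```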
-
@Cliff-Wong Thanks Cliff, this is great. Is the source for this app available? I can't find it on GitLab.
-
The source lives here in the mpa-tools repo
-
@Alex-Gardner Sweet, thanks!
-
@Cliff-Wong Thanks! It worked.
Please tell me, how do I record video to the SD card?
-
Hi @Eric-Katzfey, I've been revisiting this lately and modified the voxl-streamer app like you suggested with a tee to a filesink. I am however seeing a strange behavior where the video file is very short and missing most of the frames. Any idea why that might be? My modified pipeline follows the image overlay element and is this:
gst_element_link_many(last_element, context->encoder_queue, context->omx_encoder, context->filesink_tee, NULL);
gst_element_link_many(context->filesink_tee, context->rtp_filter, context->rtp_queue, context->rtp_payload, NULL);
gst_element_link_many(context->filesink_tee, context->h264_parser_queue, context->h264_parser, context->filesink_queue, context->filesink, NULL);
-
I've been trying to build a simple command line pipeline to test the filesink:
gst-launch-1.0 videotestsrc num-buffers=100 ! video/x-raw,width=640,height=480,framerate=30/1 ! omxh264enc ! mp4mux ! filesink location=test_file.mp4
But it doesn't seem to want to play nicely with omx:
ERROR:/opt/workspace/build/apq8096-le-1-0-1_ap_standard_oem.git/apps_proc/poky/build/tmp-glibc/work/armv7a-vfp-neon-oemllib32-linux-gnueabi/lib32-gstreamer1.0-omx/1.10.4-r0/gst-omx-1.10.4/omx/gstomxh264enc.c:532:gst_omx_h264_enc_get_caps: code should not be reached
It works just fine on my laptop if I replace omxh264enc with x264enc. Does this mean that the problem is not with gstreamer but with the omx plugin?
-
@czarsimon omx has a very limited set of input formats. I would add format NV12 (or NV21) to your caps filter between videotestsrc and omx.
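Applied to the test pipeline above, that suggestion would look like the sketch below. The explicit h264parse before mp4mux is an addition of mine that often helps muxing, not something confirmed necessary on this platform.

```shell
# Same test pipeline with an explicit NV12 format in the caps filter
# so omxh264enc is offered an input format it supports.
# num-buffers=100 sends EOS after 100 frames, letting mp4mux finalize.
gst-launch-1.0 videotestsrc num-buffers=100 \
    ! video/x-raw,format=NV12,width=640,height=480,framerate=30/1 \
    ! omxh264enc ! h264parse ! mp4mux ! filesink location=test_file.mp4
```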