    Sending Recorded Video Through Camera Server on VOXL2

    • reber34

      Hi all!

      I would like to send a prerecorded video through the camera server to better understand how camera server configurations affect video quality at different transmission distances. I would like to do this so that each test uses the same video, rather than whatever the camera happens to output during a given test. Does anyone know how to configure the VOXL 2 to do this?

      Thanks!

      • Alex Kushleyev (ModalAI Team) @reber34

        @reber34, perhaps this approach can work for you:

        • record a video encoded at a high bit rate (using voxl-camera-server and voxl-record-video). Please note that the output of voxl-record-video will not be in a standard container (such as mp4), but you can fix that with ffmpeg (see the sketch after this list): ffmpeg -r 30 -i voxl-record-video.h264 -codec copy videofile.mp4
        • re-encode the video offline with desired codecs / bit rates / resolutions
        • install gst-rtsp-launch which uses gstreamer to set up an RTSP stream https://github.com/sfalexrog/gst-rtsp-launch/
          • you will first need to figure out which gstreamer pipeline to use on VOXL2 to load your video and parse the h264/h265 frames (you can use a null sink for testing; see the sketch after this list), and then use that pipeline with gst-rtsp-launch, which will take the encoded frames and serve them over an RTSP stream.
        • gstreamer may be more flexible for tuning the encoding parameters of h264/h265 (compared to voxl-camera-server) and you can also use it in real time later (using voxl-streamer, which uses gstreamer under the hood)
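
        For reference, here is a minimal sketch of that workflow in shell commands. It assumes the recording was saved as voxl-record-video.h264 as in the command above, that gst-launch-1.0 is available on the VOXL2, and that libx264 is available wherever you re-encode offline; the bit rates, resolutions, and output file names are placeholders, and the test pipeline will still need to be adapted to the form gst-rtsp-launch expects.

          # 1) wrap the raw H.264 elementary stream in an mp4 container (command from above)
          ffmpeg -r 30 -i voxl-record-video.h264 -codec copy videofile.mp4

          # 2) re-encode offline at the bit rates / resolutions you want to test
          #    (placeholder settings; repeat per test case)
          ffmpeg -i videofile.mp4 -c:v libx264 -b:v 2M -s 1280x720 test_2mbps_720p.mp4
          ffmpeg -i videofile.mp4 -c:v libx264 -b:v 8M -s 1920x1080 test_8mbps_1080p.mp4

          # 3) sanity-check a load/parse pipeline on VOXL2 with a null sink
          #    before handing the equivalent pipeline to gst-rtsp-launch
          gst-launch-1.0 filesrc location=test_2mbps_720p.mp4 ! qtdemux ! h264parse ! fakesink
          # or, for the raw .h264 file directly:
          gst-launch-1.0 filesrc location=voxl-record-video.h264 ! h264parse ! fakesink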

        Another alternative is to use voxl-record-raw-image to save raw YUVs coming from voxl-camera-server and then use voxl-replay and voxl-streamer - the latter will accept YUVs from the MPA pipe and encode them using the bit rate that you want. Note that depending on the image resolution, YUV images will take a lot more space than encoded video, but maybe that is also OK since VOXL2 has lots of storage.
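
        To give a rough sense of the storage difference, here is a back-of-the-envelope estimate assuming 1080p NV12 YUV at 30 fps (adjust for your actual resolution and frame rate):

          # raw NV12 YUV uses 12 bits (1.5 bytes) per pixel
          echo "$((1920 * 1080 * 3 / 2)) bytes per frame"         # ~3.1 MB per frame
          echo "$((1920 * 1080 * 3 / 2 * 30)) bytes per second"   # ~93 MB/s, i.e. ~5.6 GB per minute
          # for comparison, a 10 Mbit/s encoded stream is roughly 75 MB per minute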

        Alex
