ModalAI Forum

    Posts made by Ethan Wu

    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev
      Hi, thank you for your assistance over the past few days. I am now able to successfully stream video from the Trip2 to VOXL and QGC using the code I provided earlier, combined with pympa. The decoding issue I mentioned occurred because the video streamed from the Trip2 was in YUV420 format, but I was reshaping the data as if it were RGB.

      As for the original method using OpenCV + GStreamer, I am still unsure where the problem lies. It could be a decoding issue or an installation problem with OpenCV and GStreamer. Additionally, there were many unexpected situations related to GStreamer during the research process that have not yet been resolved. If I manage to identify the source of the problem, I will provide further information.
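
      For reference, here is a minimal sketch of how that YUV420 frame could be converted before display, assuming an I420 planar buffer and a 1280x720 stream (the resolution and the raw_bytes variable are placeholders, not the exact values from my setup):

      import numpy as np
      import cv2

      width, height = 1280, 720  # placeholder resolution
      # An I420 frame is (height * 3/2) rows of width bytes:
      # the Y plane followed by the subsampled U and V planes.
      yuv = np.frombuffer(raw_bytes, dtype=np.uint8).reshape(height * 3 // 2, width)
      bgr = cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR_I420)
      cv2.imshow("frame", bgr)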

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev
      Hi,
      Do I just need to upload it to VOXL and run dpkg -i opencv.deb so that it overwrites my opencv-python? I did so, but it still failed to start the stream.

      Also, I searched for other ways to replace OpenCV and came up with the code below:

      import gi
      gi.require_version('Gst', '1.0')
      from gi.repository import Gst, GObject, GLib
      import numpy as np
      import time
      import cv2
      import sys
      
      # Initialize GStreamer
      Gst.init(None)
      
      # Define the RTSP stream URL
      stream_url = 'rtsp://192.168.0.201:554/live0'
      
      # Create a GStreamer pipeline
      pipeline = Gst.parse_launch(f"rtspsrc location={stream_url} latency=0 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! appsink")
      
      # Start the pipeline
      pipeline.set_state(Gst.State.PLAYING)
      
      # Main loop to read frames from pipeline and publish them to MPA
      frame_cntr = 0
      while True:
          # Retrieve a frame from the pipeline
          sample = pipeline.get_by_name('appsink0').emit('pull-sample')
          buffer = sample.get_buffer()
          result, info = buffer.map(Gst.MapFlags.READ)
          if result:
              # Convert the frame to numpy array
              data = np.ndarray((info.size,), dtype=np.uint8, buffer=info.data)
              frame = np.reshape(data, (640, 720, 3))
              # Increment frame counter
              frame_cntr += 1
              sys.stdout.write("\r")
              sys.stdout.write(f'got frame {frame_cntr} with dims {frame.shape}')
              sys.stdout.flush()
              # Display the frame (stand-in for publishing it to MPA)
              cv2.imshow("frame", frame)
              # Release the mapped buffer memory
              buffer.unmap(info)
          else:
              print("Error mapping buffer")
          # Delay for a short period to control frame rate
          if cv2.waitKey(30) == ord('q'):
              break
      
      
      cv2.destroyAllWindows()
      # Stop the pipeline
      pipeline.set_state(Gst.State.NULL)
      
      

      And I saw an image like this:

      Screenshot from 2024-03-22 17-55-03.png

      It looks like a decoding error; I will look into it next week.

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev You can set the environment variable GST_DEBUG to the number that matches the debug level you prefer. I set it to 2 so that I can see warning and error output from gst-launch-1.0. Here is the reference documentation: https://gstreamer.freedesktop.org/documentation/tutorials/basic/debugging-tools.html?gi-language=c
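
      As a minimal sketch, the same debug level can also be set from Python before GStreamer is initialized (equivalent to exporting GST_DEBUG=2 in the shell):

      import os
      # Must be set before Gst.init() for the threshold to take effect
      os.environ["GST_DEBUG"] = "2"  # level 2 shows errors and warnings

      import gi
      gi.require_version('Gst', '1.0')
      from gi.repository import Gst
      Gst.init(None)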

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev
      Sharing the OpenCV package would be great, and I also want to share how I built OpenCV with GStreamer support. I looked into this issue, followed the instructions here, and made sure I saw the following information from cv2.getBuildInformation() confirming that GStreamer was enabled:

        Video I/O:
          DC1394:                      NO
          FFMPEG:                      YES
            avcodec:                   YES (57.107.100)
            avformat:                  YES (57.83.100)
            avutil:                    YES (55.78.100)
            swscale:                   YES (4.8.100)
            avresample:                NO
          GStreamer:                   YES (1.14.5)
          v4l/v4l2:                    YES (linux/videodev2.h)
      
      

      I was using conda as a virtual environment and ran OpenCV with Python 3.10 (3.9 and 3.8 were also tested); my OpenCV version is 4.9.0.80.
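
      As a quick runtime check (a minimal sketch), the same information can be pulled out of the installed cv2 build directly:

      import cv2

      # Print the relevant Video I/O backend lines from the build information;
      # the GStreamer line should read "YES" followed by its version.
      for line in cv2.getBuildInformation().splitlines():
          if "GStreamer" in line or "FFMPEG" in line:
              print(line.strip())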

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Ethan-Wu

      About the error 'unexpected reference "gst-launch-1"': I get the same error when I replace the GStreamer command with gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink, so I think that is not the problem.

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev
      I think the client side is freezing: VideoCapture() takes some time to complete, but it ultimately hangs on cap.read(). Below is the relevant debug output:

      created output pipe rtsp-debug, channel 0, flags 0
      gst-launch-1.0 rtspsrc location=rtsp://169.254.4.201:554/live0 latency=0  ! queue !  rtph264depay ! h264parse config-interval=-1 ! qtivdec ! qtivtransform !  video/x-raw,format=BGR,width=1280,height=720 !  autovideoconvert ! appsink
      0:00:00.000749593 14685   0x55c069b780 WARN            GST_REGISTRY gstregistrybinary.c:489:gst_registry_binary_check_magic: Binary registry magic version is different : 1.3.0 != 1.12.0
      0:00:00.057125316 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:876:priv_gst_parse_yyparse: unexpected reference "gst-launch-1" - ignoring
      0:00:00.057160626 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:882:priv_gst_parse_yyparse: unexpected pad-reference "0" - ignoring
      0:00:00.057168959 14685   0x55c069b780 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "rtspsrc"!
      0:00:00.057173647 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "rtspsrc"
      0:00:00.057890325 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@0x55c070f730]
      0:00:00.057908970 14685   0x55c069b780 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "rtph264depay"!
      0:00:00.057914021 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "rtph264depay"
      0:00:00.057918969 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no sink [source=@0x55c070f730]
      0:00:00.057923292 14685   0x55c069b780 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "h264parse"!
      0:00:00.057927354 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "h264parse"
      0:00:00.057932250 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@(nil)]
      0:00:00.057936260 14685   0x55c069b780 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "qtivdec"!
      0:00:00.057939437 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "qtivdec"
      0:00:00.057944072 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@(nil)]
      0:00:00.057947874 14685   0x55c069b780 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "qtivtransform"!
      0:00:00.057951467 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "qtivtransform"
      0:00:00.057956884 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@(nil)]
      0:00:00.057971726 14685   0x55c069b780 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "autovideoconvert"!
      0:00:00.057979851 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "autovideoconvert"
      0:00:00.057985215 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@(nil)]
      0:00:00.058182496 14685   0x55c069b780 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@0x55c06f51b0]
      
      (python:14685): GStreamer-CRITICAL **: 08:49:15.037: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed
      
      (python:14685): GStreamer-CRITICAL **: 08:49:15.038: gst_structure_get_int: assertion 'structure != NULL' failed
      [ WARN:0@30.090] global cap_gstreamer.cpp:1714 open OpenCV | GStreamer warning: cannot query video width/height
      
      (python:14685): GStreamer-CRITICAL **: 08:49:15.038: gst_structure_get_fraction: assertion 'structure != NULL' failed
      [ WARN:0@30.090] global cap_gstreamer.cpp:1722 open OpenCV | GStreamer warning: cannot query video fps
      [ WARN:0@30.090] global cap_gstreamer.cpp:1777 open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
      
      (python:14685): GStreamer-CRITICAL **: 08:50:20.041: gst_sample_get_caps: assertion 'GST_IS_SAMPLE (sample)' failed
      [ERROR:0@95.092] global cap_gstreamer.cpp:934 retrieveVideoFrame GStreamer: gst_sample_get_caps() returns NULL
      
      (python:14685): GStreamer-CRITICAL **: 08:51:25.042: gst_sample_get_caps: assertion 'GST_IS_SAMPLE (sample)' failed
      [ERROR:0@160.094] global cap_gstreamer.cpp:934 retrieveVideoFrame GStreamer: gst_sample_get_caps() returns NULL
      ^C
      (python:14685): GStreamer-CRITICAL **: 08:52:30.044: gst_sample_get_caps: assertion 'GST_IS_SAMPLE (sample)' failed
      [ERROR:0@225.096] global cap_gstreamer.cpp:934 retrieveVideoFrame GStreamer: gst_sample_get_caps() returns NULL
      Traceback (most recent call last):
        File "/voxl-mpa-tools/tools/python/rtsp_rx_mpa_pub.py", line 62, in <module>
          ret, frame = vcap.read()
      KeyboardInterrupt
      
      

      Today, I followed the official website and tested OpenCV, GStreamer, and MPA with other example commands. The result is that the image produced by gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink can ultimately be seen on the portal, so I don't think it's an issue with the GStreamer installation.

      My RTSP source is an external camera (an edge computing device) acting as a server, connected to VOXL via Ethernet. Currently, I am not using VOXL's built-in camera or a UVC camera, and I am not generating the RTSP stream through VOXL's services.
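
      For context, this is a minimal sketch of the capture I am attempting; the URL and elements come from the log above, and the exact decoder elements depend on which GStreamer plugins are visible to OpenCV:

      import cv2

      # Pipeline description passed to OpenCV's GStreamer backend
      # (only the pipeline itself, without a leading "gst-launch-1.0")
      pipeline = ("rtspsrc location=rtsp://169.254.4.201:554/live0 latency=0 ! "
                  "queue ! rtph264depay ! h264parse ! qtivdec ! qtivtransform ! "
                  "video/x-raw,format=BGR,width=1280,height=720 ! appsink")

      vcap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
      print("opened:", vcap.isOpened())
      ret, frame = vcap.read()  # this is the call that currently hangs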

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev

      Thank you very much, that really works! And it's REALLY COOL!

      Now the remaining issue is with GStreamer support. When streaming RTSP, GStreamer often freezes without any indication or error message, making it difficult to identify the problem. Although this may not be the most suitable question to ask here, I still want to seek advice and see whether there are other approaches besides blindly testing different combinations.
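
      One thing I am considering (a minimal sketch, assuming a Gst pipeline object built with Gst.parse_launch as in my earlier post) is to poll the pipeline bus so that errors and end-of-stream events are printed instead of the pipeline silently stalling:

      import gi
      gi.require_version('Gst', '1.0')
      from gi.repository import Gst

      bus = pipeline.get_bus()
      # Wait up to 5 seconds for an ERROR or EOS message from any element
      msg = bus.timed_pop_filtered(5 * Gst.SECOND,
                                   Gst.MessageType.ERROR | Gst.MessageType.EOS)
      if msg is not None and msg.type == Gst.MessageType.ERROR:
          err, debug_info = msg.parse_error()
          print(f"pipeline error from {msg.src.get_name()}: {err.message}")
          print(f"debug details: {debug_info}")
      elif msg is not None:
          print("end of stream")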

      Thank you for your assistance.

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Ethan-Wu Or perhaps my GStreamer installation is not entirely correct? Here is the document I followed for installation: https://gstreamer.freedesktop.org/documentation/installing/on-linux.html?gi-language=c

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev
      Hi,
      I'm now able to stream RTSP with opencv-python on my PC, but it fails on VOXL, and I think that might be an environment issue. I'm using conda and followed this script to install an OpenCV build with GStreamer support, but the program just gets stuck at VideoCapture() whenever it starts running. Here is some output after I hit Ctrl+C to stop it:

      ^C
      (python:21276): GStreamer-CRITICAL **: 07:27:40.646: 
      Trying to dispose element queue0, but it is in PAUSED instead of the NULL state.
      You need to explicitly set elements to the NULL state before
      dropping the final reference, to allow them to clean up.
      This problem may also be caused by a refcounting bug in the
      application or some element.
      
      
      (python:21276): GStreamer-CRITICAL **: 07:27:40.646: 
      Trying to dispose element pipeline0, but it is in READY instead of the NULL state.
      You need to explicitly set elements to the NULL state before
      dropping the final reference, to allow them to clean up.
      This problem may also be caused by a refcounting bug in the
      application or some element.
      
      [ WARN:0@1.170] global /tmp/tmp.zudKvFNc4N/opencv-python-master/opencv/modules/videoio/src/cap_gstreamer.cpp (1356) open OpenCV | GStreamer warning: unable to start pipeline
      [ WARN:0@1.170] global /tmp/tmp.zudKvFNc4N/opencv-python-master/opencv/modules/videoio/src/cap_gstreamer.cpp (862) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created
      
      (python:21276): GStreamer-CRITICAL **: 07:27:40.646: 
      Trying to dispose element appsink0, but it is in READY instead of the NULL state.
      You need to explicitly set elements to the NULL state before
      dropping the final reference, to allow them to clean up.
      This problem may also be caused by a refcounting bug in the
      application or some element.
      
      Traceback (most recent call last):
        File "/playground/play.py", line 17, in <module>
          vcap = cv2.VideoCapture(stream,cv2.CAP_GSTREAMER)
      KeyboardInterrupt
      
      

      I'm still working on it and am not sure whether the problem comes from VOXL or not.

      Also, since the MPA API is written in C++, I'm curious: even if I succeed in streaming RTSP in Python, how should I connect the two?

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev
      As for converting the video to YUV, I haven't reached that part yet, but I think OpenCV could be used for that conversion.

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev
      Hi, thanks for replying, that really helps. I intended to use OpenCV in C++ to read the RTSP stream, or at least to read a local video file for testing. I looked into voxl-mpa-tools, which uses voxl-opencv as one of its dependencies, and wrote a simple program to read a video source, shown below:

      #include <opencv2/opencv.hpp>
      #include <iostream>
      
      using namespace cv;
      using namespace std;
      
      int main() {
          string filename;
          cout << "Enter the file name of the video: ";
          cin >> filename;
      
          VideoCapture cap(filename);
      
          if (!cap.isOpened()) {
              cerr << "Error: Unable to open video file." << endl;
              return -1;
          }
          // Play the video locally by reading and displaying frames
          Mat frame;
          while (cap.read(frame)) {
              imshow("video", frame);
              if (waitKey(30) == 'q') break;
          }
          return 0;
      }
      

      While it can play both MP4 files and the RTSP source on my PC, isOpened() always returns false on VOXL.

      I also installed python3 and opencv-python on VOXL and tested it; it can open MP4 files but fails to read the RTSP stream. The same happens on my PC, so I may be getting something wrong.

      I have used ffmpeg to confirm that the RTSP URL is correct, so I don't think the URL itself is the problem.

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev Originally, I was thinking of achieving simple image transmission by modifying the example code under libmodal-pipe and then connecting it to both MPA and the TFLite server. Would you advise against this approach?

      posted in Ask your questions right here!
    • RE: Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      @Alex-Kushleyev Sure, a few examples would be great. I think the format is YUV. Thank you very much.

      posted in Ask your questions right here!
    • Seeking Reference Code for MPA Integration with RTSP Video Streams for TFLite Server

      Hi,

      I'd like to read a video stream from an RTSP source and then pass it through MPA to connect to a tflite server. Where can I find good reference code for handling video streams with MPA?

      Thanks.

      posted in Ask your questions right here!
    • RE: Image transmission between Trip2 and VOXL2

      @Ethan-Wu I just did some research and found that it seems an Ethernet camera can't be treated like a UVC camera, right?

      posted in Ask your questions right here!
    • RE: Image transmission between Trip2 and VOXL2

      @Eric-Katzfey Additionally, I apologize if my initial question was unclear. I understand that VOXL supports UVC cameras. I'd like to know whether cameras connected via an Ethernet adapter can be used on VOXL, or whether I would need to make significant modifications to the code to achieve this.

      posted in Ask your questions right here!
    • RE: Image transmission between Trip2 and VOXL2

      @Eric-Katzfey Thank you for replying. The problem is now solved; I think it was a hardware issue that led to the segmentation fault.

      posted in Ask your questions right here!
    • RE: Image transmission between Trip2 and VOXL2

      @Eric-Katzfey

      Hello,
      I changed the connection from my Trip2 camera from Ethernet to HDMI converted to USB. After running voxl-uvc-server -l, the device information is detected, but when I run with debugging enabled, a segmentation fault is reported. Here is the error output:

      voxl2:/$ voxl-uvc-server -f 5 -d
      
      loading config file
      Enabling debug messages
      =================================================================
      width:                            640
      height:                           480
      fps:                              5
      pipe_name:                        uvc
      ==============================================================
      voxl-uvc-server starting
      Image resolution 640x480, 5 fps chosen
      UVC initialized
      Device found
      Device opened
      uvc_get_stream_ctrl_format_size succeeded for format YUYV
      Streaming starting
      
      Segmentation fault:
      Fault thread: voxl-uvc-server(tid: 2617)
      Fault address: 0x7fae693000
      Access to this address is not allowed.
      Segmentation fault
      

      Any insights on what might be causing the segmentation fault would be appreciated.

      posted in Ask your questions right here!
    • Image transmission between Trip2 and VOXL2

      Good afternoon,

      I am currently working on integrating a Trip2 camera with a VOXL2 device, aiming to establish image streaming via Ethernet. However, I am encountering challenges: the VOXL2 device does not recognize the Trip2 camera, despite a successful ping connection.

      In an effort to better understand how UVC camera image data is handled on the VOXL2, I would like to explore the relevant sections of the VOXL2 codebase. Could you point me to specific resources or code segments, such as those in voxl-streamer, that would help me understand how UVC camera data is processed on the VOXL2 server?

      Furthermore, I am curious whether these code sections could be modified to achieve image transfer between the Trip2 camera and the VOXL2 device. Your insights and guidance on this matter would be greatly appreciated.

      Thank you for your time and assistance.

      posted in Ask your questions right here!