ModalAI Forum

    svempati

    @svempati

    0 Reputation · 3 Profile views · 19 Posts · 0 Followers · 0 Following


    Latest posts made by svempati

    • RE: No detections when running custom YOLOv8 model on voxl-tflite-server

      @Zachary-Lowell-0 Thanks for sharing the video! I watched it, and I followed pretty much the same steps you did. It is worth mentioning that I had to modify the Dockerfile, because the one in the documentation was throwing a version mismatch error when installing the onnx package.

      This was the original Docker command:

      RUN pip3 install ultralytics tensorflow onnx "onnx2tf>1.17.5,<=1.22.3" tflite_support onnxruntime onnxslim "onnx_graphsurgeon>=0.3.26" "sng4onnx>=1.0.1" tf_keras
      

      I modified it to this:

      RUN pip3 install ultralytics tensorflow "onnx2tf>1.17.5,<=1.22.3" tflite_support onnxruntime onnxslim "onnx_graphsurgeon>=0.3.26" "sng4onnx>=1.0.1" tf_keras
      RUN pip3 install onnx==1.20.1
      

      I don't think this should cause any issues, but could you confirm?
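      For what it's worth, the mismatch makes sense given how the combined specifier resolves. Below is a toy check of the onnx2tf pin from the Dockerfile (a sketch only; pip's real resolver uses the packaging library and handles pre-releases, wildcards, and so on):

```python
def parse(v):
    # Split a dotted version string into an integer tuple for comparison.
    return tuple(int(x) for x in v.split("."))

def satisfies(candidate, lower_exclusive, upper_inclusive):
    # Models a pip specifier of the form ">lower,<=upper".
    c = parse(candidate)
    return parse(lower_exclusive) < c <= parse(upper_inclusive)

# The onnx2tf pin from the Dockerfile: >1.17.5,<=1.22.3
print(satisfies("1.22.3", "1.17.5", "1.22.3"))  # True  (upper bound inclusive)
print(satisfies("1.17.5", "1.17.5", "1.22.3"))  # False (lower bound exclusive)
```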

      posted in VOXL 2
      svempati
    • RE: No detections when running custom YOLOv8 model on voxl-tflite-server

      @Zachary-Lowell-0 I would first like to diagnose what is causing the YOLOv8 model to not work on the VOXL 2 for me. Does it only work when the model is trained and exported to tflite on a TPU? I get the issue even if I train the YOLOv8 model on an open-source dataset or use the pretrained yolov8n.pt model downloaded from Ultralytics. I want to confirm that a YOLOv8 model trained from scratch on an open-source dataset works on the VOXL before I move on to my custom dataset.

      If there is no other workaround, I can send you the dataset I am using.

      Thanks!

      posted in VOXL 2
      svempati
    • RE: No detections when running custom YOLOv8 model on voxl-tflite-server

      @Zachary-Lowell-0 Just wanted to follow up and see whether you were able to replicate this issue.

      posted in VOXL 2
      svempati
    • RE: No detections when running custom YOLOv8 model on voxl-tflite-server

      @Zachary-Lowell-0 I wanted to follow up with you again on this: the issue seems to be the model conversion process from PyTorch to tflite. To confirm this, I tried it with the default yolov8n.pt downloaded from Ultralytics by running this command from the GitLab repository:

      python export.py yolov8n.pt
      

      This creates a new yolov8n_float16.tflite file. However, running this file on voxl-tflite-server shows this output before displaying Error in TensorData<float>: should not reach here:

      WARNING: Unknown model type provided! Defaulting post-process to object detection.
      INFO: Created TensorFlow Lite delegate for GPU.
      INFO: Initialized OpenCL-based API.
      INFO: Created 1 GPU delegate kernels.
      Successfully built interpreter
      
      ------VOXL TFLite Server------
      
       4 5 6
       4 5 6
      Connected to camera server
      
      

      I even tried running export.py in the VOXL emulator to account for any differences in CPU architecture between my computer and the VOXL (x86 vs. ARM), but I still get the same error. Is there anything I might be missing? Thank you!
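      One thing I noticed while digging: Ultralytics YOLOv8 reportedly exports its output as (1, 4 + num_classes, num_anchors) with per-class scores and no objectness column, whereas YOLOv5 uses (1, num_anchors, 5 + num_classes). If the server's post-process assumes the YOLOv5 layout, a YOLOv8 tensor could decode to nothing. Here is a pure-Python sketch of the transpose a YOLOv8-style decoder would need (shapes and values are made up for illustration):

```python
# YOLOv8 tflite output is commonly (1, 4 + nc, N): rows are x, y, w, h, then
# one score per class. YOLOv5 is (1, N, 5 + nc) with an objectness column.
nc, N = 1, 4  # one "ship" class, four dummy anchors

# Fake YOLOv8-style output: (4 + nc) rows of N columns each.
yolov8_out = [
    [0.1, 0.2, 0.3, 0.4],   # x (normalized)
    [0.5, 0.5, 0.5, 0.5],   # y
    [0.2, 0.2, 0.2, 0.2],   # w
    [0.2, 0.2, 0.2, 0.2],   # h
    [0.9, 0.1, 0.8, 0.05],  # class 0 ("ship") score
]

# Transpose to per-anchor rows, then keep anchors above a score threshold.
per_anchor = list(zip(*yolov8_out))
detections = [a for a in per_anchor if max(a[4:]) > 0.5]
print(len(detections))  # anchors 0 and 2 pass -> 2
```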

      posted in VOXL 2
      svempati
    • RE: No detections when running custom YOLOv8 model on voxl-tflite-server

      @Zachary-Lowell-0 Got it, I will try that out and will let you know if I have any more questions. Thanks for your help!

      posted in VOXL 2
      svempati
    • RE: No detections when running custom YOLOv8 model on voxl-tflite-server

      I see, so my model is not supported by voxl-tflite-server because it is float16, and the tflite server only supports 32-bit precision for floating-point models. Am I understanding that correctly, or am I missing something? I ask because the default YOLOv5 model included on the VOXL 2 (yolov5_float16_quant.tflite) also has float16 precision, so I wonder how the functions in tensor_data.h handle that.

      One question: what command did you use to view these error logs from voxl-tflite-server?

      Error in TensorData<float>: should not reach here
      Error in TensorData<float>: should not reach here
      Error in TensorData<float>: should not reach here
      Error in TensorData<float>: should not reach here
      Error in TensorData<float>: should not reach here
      Error in TensorData<float>: should not reach here
      Error in TensorData<float>: should not reach here
      Error in TensorData<float>: should not reach here
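      If the accessor really is templated on 32-bit float, a float16 buffer would be the wrong width element-for-element. Below is a rough stdlib illustration of that mismatch (this is my guess at what tensor_data.h is guarding against, not its actual code):

```python
import struct

# Pack four float16 values (2 bytes each), as a float16 tensor buffer would be.
values = [1.0, 0.5, -2.0, 3.25]
buf = struct.pack("<4e", *values)  # "e" = IEEE 754 half precision
print(len(buf))                    # 8 bytes total

# Reading the same buffer as float32 (4 bytes each) only yields two elements,
# and their bit patterns are meaningless relative to the original values.
wrong = struct.unpack("<2f", buf)
print(wrong)

# Reading with the correct float16 format recovers the data exactly
# (these particular values are all representable in half precision).
right = struct.unpack("<4e", buf)
print(right)  # (1.0, 0.5, -2.0, 3.25)
```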
      
      posted in VOXL 2
      svempati
    • RE: No detections when running custom YOLOv8 model on voxl-tflite-server

      Hi @Zachary-Lowell-0, yes, I can confirm that I followed the instructions in that GitLab repository.
      Here are the tflite file and labels file: https://drive.google.com/drive/folders/1kyjanabVSP_pH_jsQyjQG9z6hFYZ_iij?usp=drive_link

      posted in VOXL 2
      svempati
    • No detections when running custom YOLOv8 model on voxl-tflite-server

      Hello,

      I am trying to run a custom YOLOv8 model on voxl-tflite-server. The model detects ships, and the yolov8_labels.txt file contains only one class, ship. However, when I run the tflite server and view the stream on voxl-portal, I can see the video feed but no bounding-box detections, even when the target is in the camera frame.
      I tried another variation of the labels file with the class index and label name, like this: 0 ship, but that does not work either.
      I also ran voxl-inspect-detections, but it shows no detections there.

      When I tested the default YOLOv5 and YOLOv8 models on voxl-tflite-server, the bounding boxes were displayed and the detections appeared in voxl-inspect-detections just fine.

      If it helps, I used this command to convert the YOLOv8 model to the tflite format:

      yolo export model=best.pt format=tflite
      

      I use the 16-bit float tflite model named yolov8_best_float16.tflite.

      This is how I set up the config file /etc/modalai/voxl-tflite-server.conf:

      {
      "skip_n_frames":	0,
      "model":	"/usr/bin/dnn/yolov8_best_float16.tflite",
      "input_pipe":	"/run/mpa/front_small_color/",
      "delegate":	"gpu",
      "requires_labels":	true,
      "labels":	"/usr/bin/dnn/yolov8_labels.txt",
      "allow_multiple":	false,
      "output_pipe_prefix":	"yolov8"
      }
      

      Is there anything I missed that is leading to no detections on the voxl-tflite-server?

      I would appreciate any help!
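      One low-cost sanity check is that the config file parses as strict JSON with the expected keys (I am assuming the .conf format is plain JSON, which the file above appears to be):

```python
import json

# The contents of /etc/modalai/voxl-tflite-server.conf from my setup.
conf_text = """
{
"skip_n_frames": 0,
"model": "/usr/bin/dnn/yolov8_best_float16.tflite",
"input_pipe": "/run/mpa/front_small_color/",
"delegate": "gpu",
"requires_labels": true,
"labels": "/usr/bin/dnn/yolov8_labels.txt",
"allow_multiple": false,
"output_pipe_prefix": "yolov8"
}
"""

# json.loads raises ValueError on trailing commas, bad quoting, etc.
conf = json.loads(conf_text)
print(conf["model"])
print(conf["requires_labels"])  # True
```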

      posted in VOXL 2
      svempati
    • RE: VOXL Connection to MAVLINK server failed using pymavlink

      @tom No, I don't have a WiFi dongle connected.

      posted in Ask your questions right here!
      svempati
    • RE: VOXL Connection to MAVLINK server failed using pymavlink

      @tom I ran ifconfig on the VOXL 2, but I don't see anything for wlan0. This is the output I get after running that command:

      bond0: flags=5123<UP,BROADCAST,MASTER,MULTICAST>  mtu 1500
              ether fe:ca:d2:74:91:69  txqueuelen 1000  (Ethernet)
              RX packets 0  bytes 0 (0.0 B)
              RX errors 0  dropped 0  overruns 0  frame 0
              TX packets 0  bytes 0 (0.0 B)
              TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0
      
      dummy0: flags=195<UP,BROADCAST,RUNNING,NOARP>  mtu 1500
              inet6 fe80::e2a:ce35:6c11:d803  prefixlen 64  scopeid 0x20<link>
              ether de:92:20:5e:07:c9  txqueuelen 1000  (Ethernet)
              RX packets 0  bytes 0 (0.0 B)
              RX errors 0  dropped 0  overruns 0  frame 0
              TX packets 25  bytes 7462 (7.4 KB)
              TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0
      
      lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
              inet 127.0.0.1  netmask 255.0.0.0
              inet6 ::1  prefixlen 128  scopeid 0x10<host>
              loop  txqueuelen 1000  (Local Loopback)
              RX packets 368130  bytes 35420354 (35.4 MB)
              RX errors 0  dropped 0  overruns 0  frame 0
              TX packets 368130  bytes 35420354 (35.4 MB)
              TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0
      

      Would any of these help with finding the VOXL's IP address?
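      Partly answering my own question: grepping the output above for IPv4 (inet) addresses shows that only the loopback interface has one, which suggests no network interface is configured yet (a stdlib sketch, not VOXL-specific):

```python
import re

# The ifconfig output above, abbreviated: only lo carries an IPv4 address.
ifconfig_out = """\
bond0: flags=5123<UP,BROADCAST,MASTER,MULTICAST>  mtu 1500
        ether fe:ca:d2:74:91:69  txqueuelen 1000  (Ethernet)
dummy0: flags=195<UP,BROADCAST,RUNNING,NOARP>  mtu 1500
        inet6 fe80::e2a:ce35:6c11:d803  prefixlen 64
lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
        inet 127.0.0.1  netmask 255.0.0.0
"""

# Collect IPv4 ("inet") addresses; "inet6" lines do not match because the
# pattern requires a space and a digit right after "inet".
ips = re.findall(r"inet (\d+\.\d+\.\d+\.\d+)", ifconfig_out)
print(ips)  # ['127.0.0.1'] -> only loopback, so no usable network address yet
```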

      posted in Ask your questions right here!
      svempati