ModalAI Forum

    No detections when running custom YOLOv8 model on voxl-tflite-server

    VOXL 2
    • svempati (last edited by svempati)

      Hello,

      I am trying to run a custom YOLOv8 model on voxl-tflite-server. The model detects ships, and the yolov8_labels.txt file contains only the single ship class. However, when I run the tflite server and view the stream on voxl-portal, I can see the video feed but no bounding-box detections, even when the target is in the camera frame.
      I also tried a variation of the labels file with the class index before the label name (0 ship), but that didn't work either.
      voxl-inspect-detections shows no detections either.

      When I tested the default YOLOv5 and YOLOv8 models on voxl-tflite-server, bounding boxes were displayed and voxl-inspect-detections listed the detections just fine.

      If it helps, I used this command to convert the YOLOv8 model to the tflite format:

      yolo export model=best.pt format=tflite
      

      I use the float16-quantized TFLite model, named yolov8_best_float16.tflite.

      This is how I set up the config file /etc/modalai/voxl-tflite-server.conf:

      {
      "skip_n_frames":	0,
      "model":	"/usr/bin/dnn/yolov8_best_float16.tflite",
      "input_pipe":	"/run/mpa/front_small_color/",
      "delegate":	"gpu",
      "requires_labels":	true,
      "labels":	"/usr/bin/dnn/yolov8_labels.txt",
      "allow_multiple":	false,
      "output_pipe_prefix":	"yolov8"
      }
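
      Not part of the original post, but a quick way to rule out config problems (a typo'd model or labels path, malformed JSON) is to sanity-check the file with a short script. This is a sketch; the required-key list is an assumption based only on the config shown above, not the full voxl-tflite-server schema:

      ```python
      import json
      import os

      CONF = "/etc/modalai/voxl-tflite-server.conf"
      # Keys taken from the config shown above; the full schema may differ.
      REQUIRED = ["model", "input_pipe", "delegate", "requires_labels", "labels"]

      def check_conf(path):
          with open(path) as f:
              conf = json.load(f)  # fails loudly on malformed JSON
          missing = [k for k in REQUIRED if k not in conf]
          if missing:
              raise KeyError(f"missing keys: {missing}")
          # Warn (don't fail) if the referenced files are absent on this machine.
          for key in ("model", "labels"):
              if not os.path.isfile(conf[key]):
                  print(f"warning: {key} path does not exist: {conf[key]}")
          return conf

      if __name__ == "__main__":
          if os.path.isfile(CONF):
              print(check_conf(CONF))
          else:
              print(f"{CONF} not found on this machine")
      ```

      Running it on the VOXL itself will flag a wrong path immediately, which is cheaper than restarting the server and watching for missing detections.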
      

      Is there anything I missed that is leading to no detections on the voxl-tflite-server?

      I would appreciate any help!

      • Zachary Lowell 0 (ModalAI Team)

        Hello @svempati, can you post your yolov8_labels.txt and .tflite files so I can test them on my end?

        To confirm: did you follow the instructions in this GitLab repository? https://gitlab.com/voxl-public/support/voxl-train-yolov8

        Zach

        • svempati @Zachary Lowell 0

          Hi @Zachary-Lowell-0, yes, I confirm that I followed the instructions in that GitLab repository.
          Here is the tflite file and labels file: https://drive.google.com/drive/folders/1kyjanabVSP_pH_jsQyjQG9z6hFYZ_iij?usp=drive_link

          • Zachary Lowell 0 (ModalAI Team, last edited by Zachary Lowell 0)

            @svempati said in No detections when running custom YOLOv8 model on voxl-tflite-server:

            "labels": "/usr/bin/dnn/yolov8_labels.txt",

            Running your model, we get the following errors from voxl-tflite-server:

            Error in TensorData<float>: should not reach here
            Error in TensorData<float>: should not reach here
            Error in TensorData<float>: should not reach here
            Error in TensorData<float>: should not reach here
            Error in TensorData<float>: should not reach here
            Error in TensorData<float>: should not reach here
            Error in TensorData<float>: should not reach here
            Error in TensorData<float>: should not reach here
            

            This points to an issue in the model itself, most likely introduced during the export/build process. In other words, it is a model issue, not a labels-file issue: your .tflite model has an output tensor with a different data type than what voxl-tflite-server expects.

            The server prints this error when the switch statement below falls through to its default case:

            // Gets the uint8_t tensor data pointer
            template <>
            inline uint8_t *TensorData(TfLiteTensor *tensor, int batch_index)
            {
                int nelems = 1;
                for (int i = 1; i < tensor->dims->size; i++)
                {
                    nelems *= tensor->dims->data[i];
                }

                switch (tensor->type)
                {
                case kTfLiteUInt8:
                    return tensor->data.uint8 + nelems * batch_index;
                default:
                    fprintf(stderr, "Error in %s: should not reach here\n",
                            __FUNCTION__);
                }

                return nullptr;
            }

            This means the output tensor type doesn't match any of the types handled in that header file. Please look into your model.

            Zach
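
            One way to see the mismatch for yourself (my suggestion, not from the thread) is to print the output tensor dtype with the TFLite interpreter in Python and compare it against the cases handled in tensor_data.h. The tiny stand-in network below only exists to make the snippet self-contained and runnable anywhere; for the real check, load your file with Interpreter(model_path="yolov8_best_float16.tflite") instead:

            ```python
            import numpy as np
            import tensorflow as tf

            # Tiny stand-in network; swap in model_path="yolov8_best_float16.tflite"
            # on the Interpreter below to inspect the actual exported model.
            model = tf.keras.Sequential([
                tf.keras.Input(shape=(8,)),
                tf.keras.layers.Dense(4),
            ])

            converter = tf.lite.TFLiteConverter.from_keras_model(model)
            converter.optimizations = [tf.lite.Optimize.DEFAULT]
            converter.target_spec.supported_types = [tf.float16]  # float16 quantization
            tflite_model = converter.convert()

            interp = tf.lite.Interpreter(model_content=tflite_model)
            interp.allocate_tensors()
            for d in interp.get_output_details():
                # Compare this dtype against the types handled in tensor_data.h
                print(d["name"], d["dtype"], d["shape"])
            ```

            If the printed dtype is not one of the types the server's TensorData specializations handle, you will hit exactly the "should not reach here" branch shown above.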

            • Zachary Lowell 0 (ModalAI Team)

              The data types voxl-tflite-server expects are all listed in this header:

              https://gitlab.com/voxl-public/voxl-sdk/services/voxl-tflite-server/-/blob/master/include/tensor_data.h

              • svempati @Zachary Lowell 0 (last edited by svempati)

                I see, so my model is not supported by voxl-tflite-server because it is float16, and the server only supports 32-bit precision for floating-point values. Am I understanding that correctly, or am I missing something? Because the default YOLOv5 model included on the VOXL 2 (yolov5_float16_quant.tflite) is also float16 precision, so I wonder how the functions in tensor_data.h handle that.

                One question: what command did you use to view these error logs from voxl-tflite-server?

                Error in TensorData<float>: should not reach here
                (repeated several times)
                
                • Zachary Lowell 0 (ModalAI Team)

                  @svempati said in No detections when running custom YOLOv8 model on voxl-tflite-server:

                  One question: what command did you use to view these error logs from voxl-tflite-server?

                  I just ran voxl-tflite-server directly on the command line instead of in the background via systemd, so the errors print straight to the terminal. Also, I would recommend NOT quantizing your model; the instructions in voxl-train-yolov8 advise against it.
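
                  For reference (my note, not Zach's), the export command from earlier in the thread would then look like the sketch below. The half and int8 flags are assumptions based on Ultralytics' documented export arguments, and in my experience the TFLite export writes both float32 and float16 variants into the output directory, so check for the float32 file:

                  ```
                  # Export without fp16/int8 quantization (half and int8 default to False,
                  # but are spelled out here for clarity). Check the output directory for
                  # the non-quantized best_float32.tflite variant.
                  yolo export model=best.pt format=tflite half=False int8=False
                  ls best_saved_model/
                  ```

                  If a best_float32.tflite appears there, that is the file to point the "model" entry of /etc/modalai/voxl-tflite-server.conf at.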

                  Zach

                  • svempati @Zachary Lowell 0

                    @Zachary-Lowell-0 Got it, I will try that out and let you know if I have any more questions. Thanks for your help!
