ModalAI Forum

    Voxl-tflite-server error while using DeepLabV3

    • Maxim Temp

      Hello there,
      I have been trying to run the voxl-tflite-server (apq8096) on the GPU with the DeepLabV3 model. It worked well on the CPU, without any errors. However, when I try to run it on the GPU, I don't get any frames and I get the following errors:
      [screenshot: initial error output]

      And then the same lines repeating over and over:
      [screenshot: repeating error lines]

      • A Former User

        Hey @Maxim-Temp,

        The DeepLabV3 model is not officially supported on the apq8096 platform because of issues like this, which come up when some of the newer, larger models run against an older version of TensorFlow Lite. The first set of errors is expected: those ops are not supported by the GPU delegate in TensorFlow 2.2.3, so they should simply fall back to the CPU.
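
        For reference, that fallback is standard TFLite behavior when a delegate is attached: the graph gets partitioned and any op the delegate rejects keeps running on the CPU. A minimal, generic sketch of attaching the GPU delegate in C++ (this is just the stock TensorFlow Lite API, not the voxl-tflite-server source) looks roughly like this:

            // Generic sketch, not voxl-tflite-server code: attach the TFLite GPU
            // delegate and let unsupported ops fall back to the CPU.
            #include "tensorflow/lite/interpreter.h"
            #include "tensorflow/lite/kernels/register.h"
            #include "tensorflow/lite/model.h"
            #include "tensorflow/lite/delegates/gpu/delegate.h"

            std::unique_ptr<tflite::Interpreter> BuildGpuInterpreter(const char* model_path) {
                auto model = tflite::FlatBufferModel::BuildFromFile(model_path);
                tflite::ops::builtin::BuiltinOpResolver resolver;
                std::unique_ptr<tflite::Interpreter> interpreter;
                tflite::InterpreterBuilder(*model, resolver)(&interpreter);

                // Default delegate options allow partitioning, so ops the GPU delegate
                // cannot handle (the "not supported" warnings) stay on the CPU.
                TfLiteGpuDelegateOptionsV2 options = TfLiteGpuDelegateOptionsV2Default();
                TfLiteDelegate* delegate = TfLiteGpuDelegateV2Create(&options);
                if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) {
                    // Delegation failed entirely; the interpreter will run fully on the CPU.
                }
                // (A full implementation would also call TfLiteGpuDelegateV2Delete once
                // the interpreter is destroyed.)
                interpreter->AllocateTensors();
                return interpreter;
            }

        So those first warnings on their own are harmless; it is the OpenCL errors afterwards that break things.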

        The next set of errors (per the docs here: https://registry.khronos.org/OpenCL/sdk/1.0/docs/man/xhtml/clEnqueueReadBuffer.html) likely means that TensorFlow is passing an invalid buffer region to OpenCL, which will be difficult to chase down. The error is raised at either line 124 or line 149 of apq8096-tflite/tensorflow/tensorflow/lite/delegates/gpu/cl/cl_command_queue.cc, if you would like to investigate further.
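
        If you do dig in, one thing worth checking is the (offset, size) pair that gets handed to clEnqueueReadBuffer. A hypothetical helper like the one below (illustrative only, not part of the TFLite sources) spells out the condition the Khronos docs describe: the requested region has to lie entirely inside the cl_mem allocation, otherwise the call returns CL_INVALID_VALUE.

            // Hypothetical helper, not TFLite code: validate the region before a
            // blocking read, mirroring the error conditions in the Khronos docs.
            #include <CL/cl.h>

            cl_int ReadRegionChecked(cl_command_queue queue, cl_mem buffer,
                                     size_t offset, size_t size, void* host_ptr) {
                size_t buffer_bytes = 0;
                cl_int err = clGetMemObjectInfo(buffer, CL_MEM_SIZE, sizeof(buffer_bytes),
                                                &buffer_bytes, nullptr);
                if (err != CL_SUCCESS) return err;

                // A null host pointer or a region that runs past the end of the buffer
                // is what produces CL_INVALID_VALUE.
                if (host_ptr == nullptr || offset + size > buffer_bytes) {
                    return CL_INVALID_VALUE;
                }
                // Blocking read of the region into host memory.
                return clEnqueueReadBuffer(queue, buffer, CL_TRUE, offset, size, host_ptr,
                                           0, nullptr, nullptr);
            }

        Logging the offset, size, and buffer size right before the failing call should show which side of that check TensorFlow is violating.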

        • Maxim Temp

          @Matt-Turi Thank you for your quick response. I will investigate further!

          • A Former User

            Hey @Maxim-Temp, quick update.

            As of a few minutes ago, the latest voxl-tflite-server (v0.3.0) is now running with tensorflow v2.8.0 for apq8096. This means that you can now run any of the models that were included with qrb5165 (including Deeplabv3) on VOXL without dealing with these headaches. You can pull this from the dev apq8096 packages repo, or directly from here: http://voxl-packages.modalai.com/dists/apq8096/dev/binary-arm64/voxl-tflite-server_0.3.0_202207212215.ipk

            • Maxim Temp

              @Matt-Turi Wow, thank you for updating me. I will try these models now 🙂
