    Running LightGlue Model on Docker for VOXL 2 Mini

    VOXL SDK
    • Manu Bhardwaj 0
      last edited by

      Hello Thomas,

      I am currently working on a project to convert the LightGlue model from PyTorch to ONNX/TensorFlow and optimize it for Qualcomm's DSP on the VOXL 2 Mini. The goal is efficient model execution with a target performance of 1 FPS, and I want to run it inside Docker on the chip.
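
      For reference, the export path I have in mind looks roughly like this (a sketch only; I am assuming the cvg/LightGlue PyTorch package, its dict-based interface and output keys, and a fixed keypoint count so the graph traces statically for the DSP toolchain):

      ```python
      import torch
      from lightglue import LightGlue  # assumption: the cvg/LightGlue PyTorch package

      class LightGlueWrapper(torch.nn.Module):
          """Flatten LightGlue's dict-based interface into plain tensors for torch.onnx.export."""
          def __init__(self):
              super().__init__()
              # Confidence values of -1 disable adaptive depth/width pruning so the traced graph stays static.
              self.matcher = LightGlue(features="superpoint",
                                       depth_confidence=-1, width_confidence=-1).eval()

          def forward(self, kpts0, desc0, kpts1, desc1):
              out = self.matcher({
                  "image0": {"keypoints": kpts0, "descriptors": desc0},
                  "image1": {"keypoints": kpts1, "descriptors": desc1},
              })
              # Output keys assume the cvg/LightGlue return format.
              return out["matches0"], out["matching_scores0"]

      # Fixed keypoint count and descriptor size (SuperPoint: 256-dim) keep the graph static,
      # which the Qualcomm tooling generally requires.
      kpts = torch.zeros(1, 1024, 2)
      desc = torch.randn(1, 1024, 256)

      torch.onnx.export(
          LightGlueWrapper(),
          (kpts, desc, kpts, desc),
          "lightglue.onnx",
          input_names=["kpts0", "desc0", "kpts1", "desc1"],
          output_names=["matches0", "scores0"],
          opset_version=16,
      )
      ```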

      Can you please guide me on this?

      Thank you!

      Best,
      Manu

      • Eric Katzfey (ModalAI Team) @Manu Bhardwaj 0
        last edited by

        @Manu-Bhardwaj-0 Do you have a specific question? Are you having trouble installing Docker?

        • Manu Bhardwaj 0 @Eric Katzfey
          last edited by Manu Bhardwaj 0

          @Eric-Katzfey I am not able to run a PyTorch/ONNX model on the GPU or NPU. Do we have any examples for that?
          I could only find the TFLite examples.

          I was able to run the LightGlue model in Docker on the CPU.
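
          For context, this is how I expected to select the accelerator from inside the container (a sketch; I am assuming an onnxruntime build that includes the Qualcomm QNN execution provider, whereas a stock pip install ships only the CPU provider, which matches what I see):

          ```python
          import onnxruntime as ort

          # Prefer the Qualcomm QNN (HTP/DSP) provider if the runtime was built with it,
          # otherwise fall back to CPU. "backend_path" points at the HTP backend library;
          # the library name/path is an assumption and may differ on the VOXL 2 Mini image.
          providers = []
          if "QNNExecutionProvider" in ort.get_available_providers():
              providers.append(("QNNExecutionProvider", {"backend_path": "libQnnHtp.so"}))
          providers.append("CPUExecutionProvider")

          session = ort.InferenceSession("lightglue.onnx", providers=providers)
          print("Active providers:", session.get_providers())  # confirms which provider actually loaded
          ```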
