ModalAI Forum

    Running LightGlue Model on Docker for VOXL 2 Mini

    VOXL SDK
    • Manu Bhardwaj 0 (24 May 2024, 17:13)

      Hello Thomas,

      I am currently working on a project to convert the LightGlue model from PyTorch to ONNX/TensorFlow and optimize it for Qualcomm's DSP on the VOXL 2 Mini. The goal is efficient model execution at a target of 1 FPS, and I want to run it inside Docker on the chip.

      Can you please guide me on this?

      Thank you!

      Best,
      Manu
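
      For reference, a minimal sketch of the PyTorch-to-ONNX export step described above, assuming the LightGlue package from https://github.com/cvg/LightGlue with SuperPoint features. The wrapper class, keypoint count, descriptor size, output key names, and the disabled-pruning settings are illustrative assumptions, not a tested recipe for the VOXL 2 Mini.

      ```python
      # Rough sketch of a PyTorch -> ONNX export for LightGlue.
      # Assumes the LightGlue package from https://github.com/cvg/LightGlue;
      # shapes, names, and settings below are placeholders to adjust.
      import torch
      from lightglue import LightGlue


      class LightGlueWrapper(torch.nn.Module):
          """Tensor-only wrapper so torch.onnx.export can trace the matcher."""

          def __init__(self):
              super().__init__()
              # Disabling adaptive depth/width (if your version supports these
              # arguments) avoids data-dependent control flow during tracing.
              self.matcher = LightGlue(
                  features="superpoint", depth_confidence=-1, width_confidence=-1
              ).eval()

          def forward(self, kpts0, desc0, kpts1, desc1):
              out = self.matcher({
                  "image0": {"keypoints": kpts0, "descriptors": desc0},
                  "image1": {"keypoints": kpts1, "descriptors": desc1},
              })
              # Output key names may differ between LightGlue versions.
              return out["matches0"], out["matching_scores0"]


      model = LightGlueWrapper()

      # Dummy inputs: 512 SuperPoint keypoints with 256-dim descriptors per image.
      kpts0 = torch.randn(1, 512, 2)
      desc0 = torch.randn(1, 512, 256)
      kpts1 = torch.randn(1, 512, 2)
      desc1 = torch.randn(1, 512, 256)

      torch.onnx.export(
          model,
          (kpts0, desc0, kpts1, desc1),
          "lightglue.onnx",
          input_names=["kpts0", "desc0", "kpts1", "desc1"],
          output_names=["matches0", "scores0"],
          dynamic_axes={n: {1: "num_keypoints"}
                        for n in ["kpts0", "desc0", "kpts1", "desc1"]},
          opset_version=16,
      )
      ```

      The resulting lightglue.onnx can then be fed to Qualcomm's offline conversion tooling for the DSP, or run as-is with ONNX Runtime inside the Docker container.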

      • Eric Katzfey (ModalAI Team) @Manu Bhardwaj 0 (24 May 2024, 17:22)

        @Manu-Bhardwaj-0 Do you have a specific question? Are you having trouble installing Docker?

        • Manu Bhardwaj 0 @Eric Katzfey (8 Jun 2024, 19:24; last edited 8 Jun 2024, 19:25)

          @Eric-Katzfey I am not able to run the PyTorch/ONNX model on the GPU or NPU. Do we have any examples for that? I could only find the TFLite examples.

          I was able to run the LightGlue model in Docker on the CPU.
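
          A minimal sketch of selecting a hardware execution provider with ONNX Runtime, as one possible alternative to the CPU-only run described above. Whether the QNN (Qualcomm NPU) execution provider is actually available depends on the onnxruntime build inside the Docker image and on the Qualcomm backend libraries present on the device; the model file and tensor shapes follow the export sketch earlier in the thread.

          ```python
          # Sketch: pick a hardware execution provider in ONNX Runtime, with a
          # CPU fallback. Assumes an onnxruntime build that includes the QNN
          # execution provider and the file exported earlier in this thread.
          import numpy as np
          import onnxruntime as ort

          print("Available providers:", ort.get_available_providers())

          session = ort.InferenceSession(
              "lightglue.onnx",
              providers=[
                  # Needs the Qualcomm HTP backend library on the device.
                  ("QNNExecutionProvider", {"backend_path": "libQnnHtp.so"}),
                  "CPUExecutionProvider",  # used if QNN is unavailable
              ],
          )

          feed = {
              "kpts0": np.random.randn(1, 512, 2).astype(np.float32),
              "desc0": np.random.randn(1, 512, 256).astype(np.float32),
              "kpts1": np.random.randn(1, 512, 2).astype(np.float32),
              "desc1": np.random.randn(1, 512, 256).astype(np.float32),
          }
          matches, scores = session.run(None, feed)
          print("matches:", matches.shape, "scores:", scores.shape)
          ```

          Checking ort.get_available_providers() first tells you whether the container's onnxruntime was built with NPU support at all; if it only lists CPUExecutionProvider, the image needs a different onnxruntime build before any offload can happen.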
