ModalAI Forum

    Custom model deployment on voxl-tflite-server

    Software Development
    • Pratyaksh Rao

      I have trained a custom model in TensorFlow and would like to deploy it on the VOXL 2. The VOXL 2 documentation mentions that custom models can be deployed using v2.8.0 opsets. However, my model depends on opsets from the latest version of TensorFlow (v2.11.0). Can voxl-tflite-server support opsets from the latest version of TensorFlow/TFLite? If not, are there any alternatives?

      • Chad Sweet (ModalAI Team)

        You can find the code for that TensorFlow Lite build here: https://gitlab.com/voxl-public/voxl-sdk/third-party/qrb5165-tflite

        There are just a few patches applied to get it to build correctly. You could try rebasing them onto v2.11.
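        A rough sketch of that rebase workflow, assuming the repo carries the ModalAI patches as commits on top of the upstream TensorFlow v2.8.0 tag (the remote name, branch name, and base tag below are assumptions; check the repo's actual layout first):

        ```shell
        # Clone the ModalAI build repo that carries the patched TensorFlow Lite tree
        git clone https://gitlab.com/voxl-public/voxl-sdk/third-party/qrb5165-tflite.git
        cd qrb5165-tflite

        # Add upstream TensorFlow so the v2.11.0 tag is available locally
        git remote add upstream https://github.com/tensorflow/tensorflow.git
        git fetch upstream tag v2.11.0 --no-tags

        # Replay the ModalAI patches on top of v2.11.0.
        # This assumes the patches were originally based on the v2.8.0 tag
        # and live on a branch named "master"; adjust both to match the repo.
        git rebase --onto v2.11.0 v2.8.0 master
        ```

        Expect conflicts in the build scripts and patched sources; after resolving them, rebuild the library and voxl-tflite-server against it, then re-test your model on target.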
