ModalAI Forum

    Custom model deployment on voxl-tflite-server

    Software Development
    • Pratyaksh Rao, 18 Jan 2023, 14:35

      I have trained a custom model in TensorFlow and would like to deploy it on VOXL 2. The VOXL 2 documentation mentions that custom models can be deployed using TensorFlow Lite v2.8.0 opsets. However, my model depends on opsets from the latest version of TensorFlow (v2.11.0). Can voxl-tflite-server support opsets from the latest version of TensorFlow/TFLite? If not, are there any other alternatives?
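      As a side note, a common first step is to check whether the model can be exported using only built-in TFLite ops, which avoids depending on ops added in newer releases. A minimal sketch, assuming the trained model is available as a SavedModel directory (the "my_saved_model" path and the output file name are placeholders):

```python
import tensorflow as tf

# Restrict the converter to built-in TFLite ops so the resulting .tflite file
# avoids select-TF or newer ops that an older runtime may not provide.
# "my_saved_model" is a placeholder path for the trained model.
converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]

tflite_model = converter.convert()  # raises if the graph needs unsupported ops

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```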

      • Chad Sweet (ModalAI Team), 18 Jan 2023, 14:51

        You can find the code for that TensorFlow Lite build here: https://gitlab.com/voxl-public/voxl-sdk/third-party/qrb5165-tflite

        There are just a few patches applied to get it to build correctly. You could try rebasing them onto 2.11.
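        Before attempting a rebase, it can be worth verifying whether the converted model even loads under a 2.8-series TFLite runtime, since tensor allocation fails if the model uses ops that runtime lacks. A minimal sketch, assuming a tflite-runtime package pinned to the 2.8.x series and the "model.tflite" file from the conversion sketch above (both names are placeholders):

```python
# Check that the converted model loads and allocates tensors under an older
# TFLite runtime. Assumes a tflite-runtime wheel from the 2.8.x series and a
# "model.tflite" file produced by the conversion sketch above.
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()  # raises here if the model needs ops this runtime lacks

print("Inputs:", interpreter.get_input_details())
print("Outputs:", interpreter.get_output_details())
```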
