ModalAI Forum

    Are SELECT_TF_OPS supported on VOXL2?

    Tags: tflite, tflite-server, voxl2
    eric

      I'm converting a custom Keras model to TFLite. Unfortunately, this model requires GatherND, which is not included in the built-in TFLite ops.

      If I add SELECT_TF_OPS to the supported ops list, then my custom TFLite model converts locally:

      converter.target_spec.supported_ops = [
          tf.lite.OpsSet.TFLITE_BUILTINS,
          tf.lite.OpsSet.SELECT_TF_OPS
      ]
      

      Will this model (converted with the SELECT_TF_OPS flag) work with voxl-tflite-server on VOXL2, or can I only use the built-in TFLite ops?
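
      For reference, the full conversion flow looks roughly like this (just a sketch; model stands in for the Keras model and the output filename is arbitrary):

      import tensorflow as tf

      # Convert the Keras model, letting ops the converter can't map to a
      # TFLite builtin (GatherND in my case) be exported as TensorFlow "Flex" ops.
      converter = tf.lite.TFLiteConverter.from_keras_model(model)
      converter.target_spec.supported_ops = [
          tf.lite.OpsSet.TFLITE_BUILTINS,  # prefer built-in TFLite kernels
          tf.lite.OpsSet.SELECT_TF_OPS     # fall back to TF ops via the Flex delegate
      ]
      tflite_model = converter.convert()

      with open("custom_model.tflite", "wb") as f:
          f.write(tflite_model)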

      A Former User @eric

        @eric

        Sorry for the late response on this!!

        Your best bet would be to just try it out; I can't say without trying it myself. It might work, or something might fail. I've never had to deal with this, so I can't say for certain.
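
        For example, a quick smoke test on the converted file might look like the sketch below. It assumes the model was written to custom_model.tflite and uses the plain tflite_runtime interpreter, which does not link the Flex delegate; I can't say for certain that this mirrors exactly what voxl-tflite-server links against.

        from tflite_runtime.interpreter import Interpreter

        # A runtime without the Flex delegate can't resolve SELECT_TF_OPS
        # ("Flex") ops, so loading/allocation fails with an error naming the
        # unresolved op.
        try:
            interpreter = Interpreter(model_path="custom_model.tflite")
            interpreter.allocate_tensors()
            print("Model resolved with built-in ops only")
        except (ValueError, RuntimeError) as err:
            print("Model needs the Flex delegate:", err)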

        It's also worth pointing out that voxl-tflite-server is meant more as an example of how TFLite models can be run on VOXL than as an in-depth ML framework for deploying custom models. If voxl-tflite-server doesn't support it, by all means fork the repository and build out the functionality to support it. Even better, make a pull request with that functionality and I'll test and approve it!

        Hope this helps,

        Thomas Patton
