ModalAI Forum

    Source of ssdlite_mobilenet_v2_coco.tflite

    • Steve Arias
      last edited by Steve Arias

      Where did you get the ssdlite_mobilenet_v2_coco.tflite file that is used in voxl-tflite-server by default? I looked at the model zoo that TensorFlow provides (https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf1_detection_zoo.md) and found the COCO-trained ssdlite_mobilenet_v2_coco model. However, the downloaded folder for this model does not contain a .tflite file. The model zoo does list mobile models that ship with .tflite files, but none of them has the exact file name ssdlite_mobilenet_v2_coco.tflite.

      It would be nice to know where the .tflite file for MobileNet comes from, as that would give us a good starting point for trying other models on the drone.

      • A Former User
        last edited by A Former User

        @Steve-Arias said in Source of ssdlite_mobilenet_v2_coco.tflite:

        There are mobile models in the model zoo mentioned that do have .tflite files but none of them have the exact .tflite file name of ssdlite_mobilenet_v2_coco.

        @Steve-Arias I will get better documentation up on this subject soon, but for now please see the official TensorFlow guide on model conversion for TensorFlow 1.x models. The ssdlite_mobilenet_v2_coco.tflite file was manually converted using that guide.

        The functions of interest from the Python API for this model specifically are:

        tf.compat.v1.lite.TFLiteConverter.from_saved_model()
        tf.compat.v1.lite.TFLiteConverter.from_frozen_graph()

        since the TF1 detection zoo includes both the frozen inference graph and a saved model.

        We also set a few options on the converter to enable inference on the GPU. Below is an example of a MobileNet V1 conversion from a frozen graph:

        import tensorflow as tf

        converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
            graph_def_file='/path/to/.pb/file/tflite_graph.pb',
            input_arrays=['normalized_input_image_tensor'],
            input_shapes={'normalized_input_image_tensor': [1, 300, 300, 3]},
            output_arrays=['TFLite_Detection_PostProcess',
                           'TFLite_Detection_PostProcess:1',
                           'TFLite_Detection_PostProcess:2',
                           'TFLite_Detection_PostProcess:3'],
        )

        # IMPORTANT: these flags must be set to enable GPU inference
        converter.use_experimental_new_converter = True
        converter.allow_custom_ops = True
        converter.target_spec.supported_types = [tf.float16]

        tflite_model = converter.convert()
        with tf.io.gfile.GFile('mobilenet_converted.tflite', 'wb') as f:
            f.write(tflite_model)
        
          • sarahl
            last edited by

            @Matt-Turi Do you know which of the models the original ssdlite_mobilenet_v2_coco.tflite came from? Could you link to it? We have tried the conversion instructions from the docs on a variety of models from the TF1 model zoo (link in the original post) using TF 2.2.3, but they still either fail at conversion, segfault in voxl-tflite-server, or run on the CPU instead of the GPU, which causes lag or crashes the VOXL outright.

            • A Former User
              last edited by

              The source of the original ssdlite_mobilenet_v2_coco.tflite model is here: http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz

              In order to help further, I would need to see the conversion errors, segfaults, or crash logs to diagnose the problem. The TensorFlow repo's issues section is a good resource if you hit conversion errors, and the converter docs are here: https://www.tensorflow.org/lite/convert. If the conversion instructions above are not working, you can try the command-line tflite_convert tool, as in the answer to this Stack Overflow post.
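              For reference, a command-line conversion would look roughly like the sketch below. This is an untested example, not the exact command used for the stock model; the file paths are placeholders, and the tensor names mirror the Python example earlier in the thread (the standard names for TF1 SSD detection models exported with export_tflite_ssd_graph.py):

              ```shell
              # Sketch: convert a TF1 SSD frozen graph to .tflite from the command line.
              # Paths are placeholders; adjust input/output array names to match your graph.
              tflite_convert \
                --graph_def_file=/path/to/tflite_graph.pb \
                --output_file=/path/to/mobilenet_converted.tflite \
                --input_arrays=normalized_input_image_tensor \
                --input_shapes=1,300,300,3 \
                --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
                --allow_custom_ops
              ```

              --allow_custom_ops is needed because TFLite_Detection_PostProcess is a custom op; if you are running the TF2 pip package, you may also need --enable_v1_converter for frozen-graph input.
              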

              • sarahl @Guest
                last edited by

                @Matt-Turi Thanks! I had tried this one previously, but I'll give it another go with your recommendations.
