Source of ssdlite_mobilenet_v2_coco.tflite



  • Where did you get the ssdlite_mobilenet_v2_coco.tflite file that is used in voxl-tflite-server by default? I looked at the model zoo that TensorFlow provides (https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf1_detection_zoo.md) and found the COCO-trained ssdlite_mobilenet_v2_coco model. However, the downloaded folder for this model does not contain a .tflite file. There are mobile models in the model zoo that do have .tflite files, but none of them have the exact file name ssdlite_mobilenet_v2_coco.tflite.

    It'd be nice to know where the .tflite file for MobileNet comes from to have a good starting point to look into using other models on the drone.


  • Dev Team

    @Steve-Arias said in Source of ssdlite_mobilenet_v2_coco.tflite:

    There are mobile models in the model zoo that do have .tflite files, but none of them have the exact file name ssdlite_mobilenet_v2_coco.tflite.

    @Steve-Arias I will get some better documentation up on this subject soon, but for now please see the official TensorFlow guide on model conversion for TensorFlow 1.x models. The ssdlite_mobilenet_v2_coco.tflite file was manually converted by following this guide.

    The functions of interest from the Python API for this model specifically are:

    tf.compat.v1.lite.TFLiteConverter.from_saved_model()
    tf.compat.v1.lite.TFLiteConverter.from_frozen_graph()

    Both apply here, as the TF1 detection zoo download includes the frozen inference graph as well as a saved model; a sketch of the saved-model route follows.
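
    Here is a minimal sketch of the saved-model route, assuming the saved_model/ directory from the extracted zoo download (the path below is a placeholder, not an actual location); the GPU-enabling flags match the frozen-graph example further down:

    import tensorflow as tf

    # Placeholder path: point this at the saved_model/ directory inside the
    # extracted model-zoo download.
    converter = tf.compat.v1.lite.TFLiteConverter.from_saved_model(
        '/path/to/extracted/model/saved_model'
    )

    # Same GPU-enabling flags as in the frozen-graph example below.
    converter.use_experimental_new_converter = True
    converter.allow_custom_ops = True
    converter.target_spec.supported_types = [tf.float16]

    tflite_model = converter.convert()
    with tf.io.gfile.GFile('mobilenet_converted.tflite', 'wb') as f:
        f.write(tflite_model)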

    We also set these options within the converter to enable inference on the GPU. Below is an example of a MobileNet v1 conversion from a frozen graph:

    import tensorflow as tf

    # Build a TF1 converter from the exported frozen graph. The input/output
    # tensor names below are the standard ones for the SSD TFLite export.
    converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file='/path/to/.pb/file/tflite_graph.pb',
        input_arrays=['normalized_input_image_tensor'],
        input_shapes={'normalized_input_image_tensor': [1, 300, 300, 3]},
        output_arrays=['TFLite_Detection_PostProcess', 'TFLite_Detection_PostProcess:1', 'TFLite_Detection_PostProcess:2', 'TFLite_Detection_PostProcess:3']
    )

    # IMPORTANT: these flags must be set to enable GPU inference.
    converter.use_experimental_new_converter = True
    converter.allow_custom_ops = True  # needed for TFLite_Detection_PostProcess
    converter.target_spec.supported_types = [tf.float16]

    tflite_model = converter.convert()
    with tf.io.gfile.GFile('mobilenet_converted.tflite', 'wb') as f:
        f.write(tflite_model)
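
    Once converted, you can sanity-check the file with the standard TFLite Interpreter before deploying it to the drone. This is an optional verification step, not part of the conversion itself; the file name matches the output of the example above:

    import tensorflow as tf

    # Load the converted model and inspect its tensors.
    interpreter = tf.lite.Interpreter(model_path='mobilenet_converted.tflite')
    interpreter.allocate_tensors()

    # For this model, expect one [1, 300, 300, 3] float input and the four
    # TFLite_Detection_PostProcess outputs (boxes, classes, scores, count).
    for detail in interpreter.get_input_details():
        print('input:', detail['name'], detail['shape'], detail['dtype'])
    for detail in interpreter.get_output_details():
        print('output:', detail['name'], detail['shape'], detail['dtype'])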
    
