Source of ssdlite_mobilenet_v2_coco.tflite



  • Where did you get the ssdlite_mobilenet_v2_coco.tflite file that voxl-tflite-server uses by default? I looked at the model zoo that TensorFlow provides (https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf1_detection_zoo.md) and found the COCO-trained ssdlite_mobilenet_v2_coco model. However, the downloaded folder for this model does not contain a .tflite file. The model zoo does list mobile models that ship with .tflite files, but none of them is named ssdlite_mobilenet_v2_coco.tflite.

    It'd be nice to know where the .tflite file for MobileNet comes from, as a starting point for looking into using other models on the drone.



  • @Steve-Arias said in Source of ssdlite_mobilenet_v2_coco.tflite:

    The model zoo does list mobile models that ship with .tflite files, but none of them is named ssdlite_mobilenet_v2_coco.tflite.

    @Steve-Arias I will get some better documentation up on this subject soon, but for now please see the official TensorFlow guide on model conversion for TensorFlow 1.x models. The ssdlite_mobilenet_v2_coco.tflite file was manually converted by following that guide.

    For this model specifically, the functions of interest from the Python API are:

    tf.compat.v1.lite.TFLiteConverter.from_saved_model()
    tf.compat.v1.lite.TFLiteConverter.from_frozen_graph()
    

    since the TF1 detection zoo download includes both a frozen inference graph and a saved model.
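
    For the saved-model route, a minimal sketch would look like the following (the directory path is a placeholder, and the GPU flags shown in the frozen-graph example below should still be set before calling convert()):

    import tensorflow as tf

    # Point the converter at the saved_model/ directory from the zoo download.
    converter = tf.compat.v1.lite.TFLiteConverter.from_saved_model(
        '/path/to/ssdlite_mobilenet_v2_coco_2018_05_09/saved_model'
    )
    tflite_model = converter.convert()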

    We also set these options within the converter to enable inference on the GPU. Below is an example of a MobileNet v1 conversion from a frozen graph:

    import tensorflow as tf

    # Build a TF1 converter from the exported frozen graph.
    converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file='/path/to/.pb/file/tflite_graph.pb',
        input_arrays=['normalized_input_image_tensor'],
        input_shapes={'normalized_input_image_tensor': [1, 300, 300, 3]},
        output_arrays=['TFLite_Detection_PostProcess', 'TFLite_Detection_PostProcess:1', 'TFLite_Detection_PostProcess:2', 'TFLite_Detection_PostProcess:3']
    )

    # IMPORTANT: THE FLAGS BELOW MUST BE SET
    converter.use_experimental_new_converter = True        # use the newer MLIR-based converter
    converter.allow_custom_ops = True                      # keep the custom TFLite_Detection_PostProcess op
    converter.target_spec.supported_types = [tf.float16]   # float16 weights for GPU inference

    tflite_model = converter.convert()
    with tf.io.gfile.GFile('mobilenet_converted.tflite', 'wb') as f:
        f.write(tflite_model)

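    As a quick sanity check before deploying, you can load the converted file with the TFLite interpreter and inspect its input/output tensors (the filename is taken from the example above):

    import numpy as np
    import tensorflow as tf

    # Load the converted model and allocate its tensors.
    interpreter = tf.lite.Interpreter(model_path='mobilenet_converted.tflite')
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    print(input_details)   # expect one float32 input of shape [1, 300, 300, 3]
    print(output_details)  # expect the four TFLite_Detection_PostProcess outputs

    # Run a single inference on random data to confirm the graph executes.
    dummy = np.random.random_sample(input_details[0]['shape']).astype(np.float32)
    interpreter.set_tensor(input_details[0]['index'], dummy)
    interpreter.invoke()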



  • @Matt-Turi Do you know which of the models the original ssdlite_mobilenet_v2_coco.tflite comes from? Could you link to it? We've tried the conversion instructions from the docs on a variety of models from the TF1 model zoo (link in the original post) using TF 2.2.3, and they still either fail at conversion, segfault in voxl-tflite-server, or run on the CPU instead of the GPU, which causes lag or crashes the VOXL outright.



  • The source of the original ssdlite_mobilenet_v2_coco.tflite model is here: http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz
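
    Note that the conversion example above starts from tflite_graph.pb rather than the tarball's frozen_inference_graph.pb. For SSD models, that TFLite-friendly graph is typically generated first with the object-detection API's export script; a sketch, with placeholder paths, might look like:

    # Run from the tensorflow/models research directory.
    python object_detection/export_tflite_ssd_graph.py \
      --pipeline_config_path=/path/to/ssdlite_mobilenet_v2_coco_2018_05_09/pipeline.config \
      --trained_checkpoint_prefix=/path/to/ssdlite_mobilenet_v2_coco_2018_05_09/model.ckpt \
      --output_directory=/path/to/output \
      --add_postprocessing_op=true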

    In order to help further, I would need to see the conversion errors/segfaults/crash logs to diagnose anything. The TensorFlow repo's issues section is a good resource if you hit conversion errors, and you can read the converter docs here: https://www.tensorflow.org/lite/convert. If the conversion instructions above are not working, you can try the command-line tflite_convert tool, as in the answer to this SO post.
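
    For reference, a command-line invocation roughly equivalent to the Python example above might look like this (flag spellings follow the TF 1.x tflite_convert documentation; on TF 2.x you may also need --enable_v1_converter for frozen-graph input, so check tflite_convert --help for your version):

    tflite_convert \
      --graph_def_file=/path/to/.pb/file/tflite_graph.pb \
      --output_file=mobilenet_converted.tflite \
      --input_arrays=normalized_input_image_tensor \
      --input_shapes=1,300,300,3 \
      --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
      --allow_custom_ops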



  • @Matt-Turi Thanks! I had tried this one previously, but I'll give it another go with your recommendations.

