tflite-server with custom model?
-
Hi Chad and team,
I am trying to load a custom object detection .tflite model on the VOXL 2 and get voxl-tflite-server to run it so I can read out the object detection data. Should I:
- modify the code in main.cpp and inference_helper.h in voxl-tflite-server, or
- simply go into /etc/modalai/voxl-tflite-server.conf and point it at my model?
I went to TensorFlow Hub, downloaded the 035-128 classification variant of mobilenet_v2, float16-quantized the model, and then converted it with the following code:
```python
import os
import tensorflow as tf

# Set the name for the output .tflite file
tf_lite_model_file_name = "custom_net_01.tflite"

# Provide the path to the directory where the saved model is located
saved_model_dir = "./"

# Convert the model
tflite_converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
print('Path successfully found!')
tflite_model = tflite_converter.convert()

# Write the TFLite model to a file
with open(tf_lite_model_file_name, "wb") as f:
    f.write(tflite_model)
```
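Since the quantization snippet on the docs page is outdated, here is roughly the float16 step I used; a minimal sketch against the TF 2.16/2.17 converter API (the output filename is illustrative):

```python
import tensorflow as tf

saved_model_dir = "./"
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)

# Post-training float16 quantization: weights are stored as float16
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]

tflite_model = converter.convert()
with open("custom_net_01_fp16.tflite", "wb") as f:
    f.write(tflite_model)
```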
"model":
was"/usr/bin/dnn/custom_net_01.tflite",
. I then ran the commandssystemctl stop voxl-tflite-server
,systemctl disable voxl-tflite-server
,systemctl enable voxl-tflite-server
, andsystemctl start voxl-tflite-server
, as a way to source voxl-configure-tflite. When I ran saw the camera feed on voxl-portal, I could seethe camera feed with no problem, but it said that tflite had an unknown source on the portal. When I then reconfigured voxl-tflite-server to us the mobilenet option, tflite was able to work perfectly fine and I was able to see the bounding boxes from the model.When I went to the Deep Learning with VOXL-TFLite-Server webpage I tried to run the tflite converstion and quantization code, but it did not work for me as that code is outdated (tensorflow v2.8 instead of 2.16 or 2.17). Then when I went down to the Implementing your Model in voxl-tflite-server section, clicked on the InferenceHelper GitLab page, read the file, but then wasn't sure what to do next. (To be fair, I write in Python, not C++).
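Since I'm more comfortable in Python, the way I've been sanity-checking the converted file on my desktop is with the plain TF Lite interpreter API (nothing VOXL-specific here):

```python
import tensorflow as tf

# Quick sanity check that the converted file parses as a valid TFLite model
interpreter = tf.lite.Interpreter(model_path="custom_net_01.tflite")
interpreter.allocate_tensors()

# Show what the model expects as input
inp = interpreter.get_input_details()[0]
print("input shape:", inp["shape"], "dtype:", inp["dtype"])
```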
On the voxl-tflite-server GitLab page, main.cpp looks like it assigns a specific post-processing function, and in some cases normalization and label handling, to each of the models already shipped on the VOXL 2 in /usr/bin/dnn. After looking at main.cpp, I felt like the correct thing to do was to edit main.cpp, inference_helper.h, and voxl-configure-tflite to include my new model and tell main.cpp which post-processing and normalization to use, but I was told that with C++ you need to recompile the code.
After reading the Implementing your Model in voxl-tflite-server section on that page, I still wasn't entirely sure what to do next. What should I do with inference_helper.h? What next step can I take to get voxl-tflite-server to use my custom model?
Thanks!
-
Good morning @Chad-Sweet, @modaltb, and team,
Following up on my post from yesterday, I made a back-up copy of the dnn folder (located in /usr/bin/dnn), renamed ssdlite_mobilenet_v2_coco.tflite to ssdlite_mobilenet_v2_coco.bak.tflite, renamed custom_net_01.tflite to ssdlite_mobilenet_v2_coco.tflite, and restarted voxl-tflite-server. When I opened voxl-portal, not only could I view the camera feed, but tflite was also showing a camera feed (something that hadn't happened before); however, it wasn't drawing any bounding boxes.
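To dig into why, I've been comparing my model's outputs against what the model it replaced produces. My understanding is that the standard TFLite SSD detection models return four output tensors (boxes, classes, scores, num_detections), so a dummy-inference sketch like this (file path illustrative) shows whether my model's signature matches:

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="custom_net_01.tflite")
interpreter.allocate_tensors()

# Feed a zero-filled frame matching the model's declared input
inp = interpreter.get_input_details()[0]
dummy = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()

# A detection model like ssdlite_mobilenet_v2_coco should list four
# output tensors; a classifier returns a single score vector instead.
for out in interpreter.get_output_details():
    print(out["name"], interpreter.get_tensor(out["index"]).shape)
```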
Thankfully, it feels like I'm getting really close to cracking the code on this, but I am still just missing the last step of getting voxl-tflite-server to actually draw bounding boxes using my custom model. I look forward to hearing from you all!
Best wishes,