Ultralytics version for voxl-tflite-server yolov8
- 
Hello! I am trying to export a custom yolov8n model from Ultralytics to use on the VOXL2. I initially developed it on a newer version of TensorFlow and have since downgraded to TensorFlow 2.8.0 to produce a compatible model. However, even when converting a default, pre-trained yolov8n model, I am hitting export errors (for example, dtensors not being available). My question: where did the model shipped in the voxl-tflite-server repo come from? Was Ultralytics used, and if so, which version? If not Ultralytics, can the source training code for the uploaded weights be provided?
- 
Hi @atomsmasher9 - yes, Ultralytics is used for training. Have you followed the instructions in this repository? https://gitlab.com/voxl-public/support/voxl-train-yolov8/-/tree/master?ref_type=heads

I have some updates written for it as well. For example, in export.py, this line: `model.export("tflite")` should become `model.export(format="tflite")`. I have yet to make the commit or PR for that change, but the README is relatively straightforward on how to build a custom model using Docker, an NVIDIA GPU, and Ultralytics.

Following those steps, I was able to create a model and upload it to the VOXL2 successfully. Make sure you place the model in the /usr/bin/dnn directory alongside the labels.txt file so the right labels are used. You can also manually update the model file path in the config file at /etc/modalai/voxl-tflite-server.conf.
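For reference, the export-and-deploy flow above can be sketched roughly as below. This is a hedged outline, not the repo's exact scripts: the output filename, the `adb` transfer, and the exact key inside voxl-tflite-server.conf may differ on your setup, so adjust against the repository README and your device.

```shell
# On the development machine (TensorFlow 2.8.0 environment):
pip install ultralytics

# Export a pre-trained yolov8n model to TFLite via the Ultralytics CLI.
# The output path shown is typical for recent Ultralytics releases but
# may vary by version - check the console output for the actual file.
yolo export model=yolov8n.pt format=tflite

# Push the model and a matching labels file to the VOXL2
# (assumes an adb connection; scp over the network also works):
adb push yolov8n_saved_model/yolov8n_float32.tflite /usr/bin/dnn/
adb push labels.txt /usr/bin/dnn/

# Then, on the VOXL2, edit /etc/modalai/voxl-tflite-server.conf and set
# the model path entry to point at the new .tflite file in /usr/bin/dnn,
# and restart the service so it picks up the change:
#   systemctl restart voxl-tflite-server
```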