Need help simulating .tflite YOLO models on my Linux machine.
I'm new to VOXL. I have trained a YOLO-like model and have my yolo_custom.tflite model file with me. How can I simulate it for a VOXL 2 Mini environment?
I tried following the steps at https://docs.modalai.com/voxl-tflite-server/, but I'm lost on how to set everything up and configure it, and on what the prerequisites are.
Hey thomas, thanks for the quick reply.
So is it possible to simulate the tests in an environment without having the actual device? I'm asking because you mentioned in another thread that flashing the SDK onto the device is necessary. I don't have the actual device with me right now and was hoping I could somehow simulate the .tflite models on my computer for testing purposes.
It would certainly be best to have a device with you, as we don't currently officially support running voxl-tflite-server locally or within a Docker container, for example. You could potentially find a way to make this work, but it isn't something that I've tested.
That said, you can still create .tflite models on your computer and use them for inference locally; there are guides in Google's TFLite documentation on how to do this. Keep in mind, though, that inside voxl-tflite-server we do some pre-/post-processing steps on the data, which you would ultimately need to emulate to get a good idea of prediction metrics. This local inference also won't be useful for determining things like operational framerates for the model. For that, you'll definitely need a device to see how much the GPU/NPU can hardware-accelerate the model you've created.
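If you just want to sanity-check your model's predictions in the meantime, here's a minimal sketch of local inference using the standard TFLite Python API. To be clear, this is not how voxl-tflite-server runs the model, and the model filename, test image, and single NHWC image input are placeholder assumptions you'd adjust for your own export:

```python
# Minimal local inference sketch for a .tflite model on a desktop Linux
# machine. "yolo_custom.tflite", "test.jpg", and the single image input
# are placeholders; check your model's actual input shape and dtype.
import numpy as np
import tensorflow as tf  # or: from tflite_runtime.interpreter import Interpreter
from PIL import Image

interpreter = tf.lite.Interpreter(model_path="yolo_custom.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Inspect the expected input shape, e.g. [1, 640, 640, 3].
_, height, width, _ = input_details[0]["shape"]

# Load a test image and resize it to the model's input resolution.
img = Image.open("test.jpg").convert("RGB").resize((width, height))
x = np.asarray(img)

# Match the model's expected dtype: float models usually want
# normalized [0, 1] inputs, quantized models take raw uint8 pixels.
if input_details[0]["dtype"] == np.float32:
    x = x.astype(np.float32) / 255.0
x = np.expand_dims(x, axis=0)

interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()

# Print the raw output tensor shapes; decoding boxes/classes/scores
# (and any NMS step) depends on how your YOLO variant was exported.
for out in output_details:
    print(out["name"], interpreter.get_tensor(out["index"]).shape)
```

You'd still need to replicate any resizing/normalization and output decoding the server does before the numbers are directly comparable, which is the emulation step I mentioned above.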
Hope this helps, happy to answer any more questions you may have!
Thank you for providing the necessary details. Once I acquire the device/chip for testing, I would appreciate a comprehensive guide detailing the steps required to deploy and test TensorFlow Lite models in the VOXL environment. I have the .tflite files ready.