@eric
Sorry for the late response on this!!
Your best bet would be to just try it out; I can't say without trying it myself. It might work, or something might fail. I've never had to deal with this, so I can't say for certain.
It's also worth pointing out that voxl-tflite-server is meant more as an example of how TFLite models can be run on VOXL than as an in-depth ML framework for deploying custom models. If voxl-tflite-server doesn't support it, by all means fork the repository and build out that functionality. Even better, open a pull request with it and I'll test and approve it!
Hope this helps,
Thomas Patton