ModalAI Forum

    Upgrade TensorFlow Version tflite-server

    Eric Katzfey (ModalAI Team) @Philemon Benner

      @Philemon-Benner It looks like voxl-uvc-server is working fine. The error messages seem to be related to voxl-tflite-server and not voxl-uvc-server. Can you use voxl-streamer to stream the output of voxl-uvc-server to VLC and see if there are still green lines? What are you currently using to stream the output of voxl-uvc-server?
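      For reference, a rough sketch of that check, assuming voxl-streamer reads its input pipe from /etc/modalai/voxl-streamer.conf and serves RTSP on port 8900; the pipe name, config path, and port are assumptions to verify against your installed version:

      ```
      # On the VOXL: find the uvc camera pipe, point voxl-streamer at it, start it
      ls /run/mpa/                          # list available MPA pipes
      vi /etc/modalai/voxl-streamer.conf    # set the input pipe to the uvc camera pipe
      voxl-streamer                         # start streaming

      # On a PC on the same network: open the stream in VLC
      # <voxl-ip>, the port, and the path are placeholders
      vlc rtsp://<voxl-ip>:8900/live
      ```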

    Eric Katzfey (ModalAI Team) @Philemon Benner

        @Philemon-Benner I'm guessing that there may be a version mismatch somewhere in there.

    Philemon Benner @Eric Katzfey

          @Eric-Katzfey Yeah, I will try that today. I'm currently viewing the stream with voxl-portal, but through the voxl-tflite-server pipe, so: voxl-uvc-server --> voxl-tflite-server --> voxl-portal (green lines in the output).

    Philemon Benner

            @Eric-Katzfey What exactly do you mean by a version mismatch, and where? The only thing I can think of is tflite-server 0.2.0: as seen above, on the first installation attempt I had some errors because of existing dependencies.

    Philemon Benner @Eric Katzfey

              @Eric-Katzfey OK, so I tried using it with voxl-streamer and it works completely fine. Any suggestions?

    Philemon Benner

                I also tried changing the model, but the result is still the same.

    Philemon Benner

                  Update:
                  So I stepped back to TF1 and trained ssdlite_mobilenet_v2. It's working great on the drone with tflite-server 0.1.8. Thanks for all the suggestions, @Matt-Turi. But I am looking forward to using tflite-server 0.2.0, so if you still have suggestions for fixing the green stripes in the new version, please let me know; for custom models the new tflite version is much easier to integrate and the code is more understandable to me.
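                  For anyone following the same route, a hedged sketch of the usual TF1 object-detection export/convert flow for an ssdlite_mobilenet_v2 checkpoint; the paths, checkpoint number, and 300x300 input size are placeholders, so adjust them to your own training setup:

                  ```
                  # Export a TFLite-compatible frozen graph from the TF1 object-detection API
                  python object_detection/export_tflite_ssd_graph.py \
                      --pipeline_config_path=ssdlite_mobilenet_v2.config \
                      --trained_checkpoint_prefix=training/model.ckpt-50000 \
                      --output_directory=export \
                      --add_postprocessing_op=true

                  # Convert the frozen graph to a .tflite file the server can load
                  tflite_convert \
                      --graph_def_file=export/tflite_graph.pb \
                      --output_file=ssdlite_mobilenet_v2.tflite \
                      --input_shapes=1,300,300,3 \
                      --input_arrays=normalized_input_image_tensor \
                      --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
                      --inference_type=FLOAT \
                      --allow_custom_ops
                  ```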

    Eric Katzfey (ModalAI Team) @Philemon Benner

                    @Philemon-Benner Thanks for the follow up on this! We'll take a look.

    Philemon Benner

                      @Matt-Turi Is there a way to change small things in the tflite-server code without building from source? Or is there a way to do it over ssh? I can't access the USB slot on the drone's VOXL. I just want to change the box color and the confidence threshold.

    A Former User

                        @Philemon-Benner If you make any changes to the code, you will need to rebuild it from source. Once built, you can push the package over ssh, either manually or using the deploy_to_voxl.sh script that is up on dev, which takes an argument for ssh and the target IP. Lines 106 and 107 of the script have the scp and opkg install commands that are used.
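                        A minimal sketch of the manual route, assuming the rebuilt package comes out of the build as an .ipk; the filename and IP address below are placeholders:

                        ```
                        # From the build machine: copy the rebuilt package onto the VOXL
                        scp voxl-tflite-server_0.2.0.ipk root@192.168.1.42:/data/

                        # Install (or reinstall) it on the VOXL over ssh
                        ssh root@192.168.1.42 "opkg install --force-reinstall /data/voxl-tflite-server_0.2.0.ipk"
                        ```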

    Philemon Benner @Guest

                          @Matt-Turi Thank you for the fast answer. Then I will build it from source. 🤔 But it could be a cool feature in the future if box color, box thickness, and confidence threshold were in the config file.

    A Former User

                            @Philemon-Benner Great suggestion, I'll add that in soon. As for the green stripes issue seen with the dev version using a FLIR Boson camera and voxl-uvc-server: I was able to successfully start and run tflite-server (MobileNetV2 with GPU) with the same setup and only saw a few green "flickers" every few seconds, due to the high input rate (60 fps) of the Boson camera. I will work on some handling for this case, but have you tried your latest ssdlite_mobilenet_v2 model with the dev version of voxl-tflite-server (0.2.0)?

    Philemon Benner @Guest

                              @Matt-Turi No, I don't think so, because if you just update with opkg, only the package is updated and not folders like /usr/bin/dnn where the models are stored. But yeah, I will have a look at it. Also, is the flickering happening on the input or the output? Because if it's happening on the input, the inference results will obviously be less accurate because of the green stripes.

    A Former User

                                @Philemon-Benner When you update with opkg, all included files will also be updated, including the /usr/bin/dnn/ directory. Regarding v0.2.0, I pushed up a patch yesterday that should fix the flickering (it was only on the output). As long as the skip_n_frames parameter is set to at least 1 with the Boson camera (since it comes in at a fixed 60 fps), you should be good to go!

                                A note on inference with the Boson 640 black-and-white thermal camera: I had some interesting results, since the included MobileNet and most other general models are not trained on thermal datasets, so keep that in mind when evaluating inference.
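                                To illustrate the skip_n_frames point above: the effective inference input rate is roughly input_fps / (skip_n_frames + 1), so skip_n_frames = 1 turns the Boson's fixed 60 fps into about 30 fps at the model. A quick check after updating; the config path and key layout here are assumptions, so verify against the file shipped with your version:

                                ```
                                # Confirm opkg refreshed the packaged files, including the bundled models
                                opkg list-installed | grep tflite
                                ls -l /usr/bin/dnn/

                                # Assumed config location; set skip_n_frames >= 1 for a 60 fps camera
                                # e.g. "skip_n_frames": 1   ->  60 / (1 + 1) = 30 fps fed to the model
                                vi /etc/modalai/voxl-tflite-server.conf
                                ```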

    Philemon Benner @Guest

                                  @Matt-Turi Yeah, thanks for updating that; I will definitely try it today. And yes, I know: I already trained on a complete dataset of thermal camera recordings and it's working like a charm. Really hot objects are a source of false detections, but I guess that just needs more training. Thank you for all the suggestions and for the fast response times. One last question: I'm really interested in how you made the boxes track so cleanly even with a shaky camera, as if inference were running on every frame?

    Philemon Benner

                                    @Matt-Turi Thanks for the update on the dev branch. It's now able to show the stream without any flickering. I also managed to add conf_tresh and box_thickness to the config file and it works really nicely.
                                    [Screenshot: Screenshot from 2022-03-04 14-44-05.png]

    Philemon Benner

                                      @Matt-Turi Is there a way I can message you privately? I have a question and some video material I want to ask about, but it shouldn't be shown publicly.

    A Former User

                                        Hey @Philemon-Benner, great to hear everything is running smoothly! For smooth detections / inference on every frame, try playing around with the skip_n_frames parameter in the config file; it is there so we can adjust the input data rate to match the maximum output rate of the model. Running with the -t flag gives you an overview of how much time is needed to process a single frame, and you can adjust from there.

                                        Feel free to send me an email at matt.turi@modalai.com for something non-public.
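                                        A small worked example of that tuning loop; the 45 ms figure is made up for illustration, and the exact output of the timing run should be checked on your own build:

                                        ```
                                        # Run once with timing output to see per-frame processing time
                                        voxl-tflite-server -t
                                        # Suppose it reports ~45 ms per frame -> the model tops out near 22 fps.
                                        # With a 60 fps camera, skip_n_frames = 2 feeds it 60 / (2 + 1) = 20 fps,
                                        # which keeps the input rate at or below what the model can sustain.
                                        ```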

    Philemon Benner @Guest

                                          @Matt-Turi OK, nice. I've sent you an email.
