As a demonstration of that, I made a small Python Flask web service that is suitable to run on the Raspberry Pi. Using a basic web UI you can POST images to it for object detection and classification.
While processing some validation images from the COCO dataset, the observed inference speed is about 400 ms; add another 150 ms to post-process the results. That makes about 550 ms for the full object detection, which sounds pretty acceptable to me, given that it runs on a Raspberry Pi 4 and I wrote the post-processing code to be readable rather than optimally fast.
The full source code is available on GitHub:
https://github.com/brunokeymolen/movidius-inference-server