hyperpose
This docker image contains the built HyperPose inference library along with the development environment.
The entry point is hyperpose-cli.
The current Docker images are based on CUDA 10.0 and CUDA 10.2; make sure your NVIDIA driver version is compatible.
Note that if your NVIDIA driver version is newer than 440.33, you can try the CUDA 10.2 images (e.g., tensorlayer/hyperpose:v2.2.0-cu10.2). Otherwise, please use the CUDA 10.0 images or the default/latest one.
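The driver check above can be scripted. The sketch below is not part of the official tooling: it is a minimal helper (the `pick_image` function name is our own) that applies the 440.33 cutoff to a driver version string and echoes a suggested image tag.

```shell
#!/bin/sh
# Sketch (not from the official docs): choose a HyperPose image tag from the
# NVIDIA driver version, following the 440.33 cutoff noted above.
pick_image() {
  ver=$1
  major=${ver%%.*}          # e.g. 440 from "440.33"
  rest=${ver#*.}
  minor=${rest%%.*}         # e.g. 33 from "440.33"
  # Driver 440.33 or newer supports the CUDA 10.2 image.
  if [ "$major" -gt 440 ] || { [ "$major" -eq 440 ] && [ "$minor" -ge 33 ]; }; then
    echo "tensorlayer/hyperpose:v2.2.0-cu10.2"
  else
    echo "tensorlayer/hyperpose:latest"
  fi
}

# Query the installed driver and pick an image (requires nvidia-smi):
# pick_image "$(nvidia-smi --query-gpu=driver_version --format=csv,noheader | head -n1)"
```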
# [Example 1]: Run inference on the bundled test video, then copy output.avi to the local path.
docker run --name quick-start --gpus all tensorlayer/hyperpose --runtime=stream
docker cp quick-start:/hyperpose/build/output.avi .
docker rm quick-start
# [Example 2](X11 server required to see the imshow window): Real-time inference.
# You may need to install X11 server locally:
# sudo apt install xorg openbox xauth
xhost +; docker run --rm --gpus all -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix tensorlayer/hyperpose --imshow
# [Example 3]: Camera + imshow window
xhost +; docker run --name pose-camera --rm --gpus all -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix --device=/dev/video0:/dev/video0 tensorlayer/hyperpose --source=camera --imshow
# To stop this container, please type `docker kill pose-camera` in another terminal.
# [Dive into the image]
xhost +; docker run --rm --gpus all -it -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix --device=/dev/video0:/dev/video0 --entrypoint /bin/bash tensorlayer/hyperpose
# For users who cannot access a camera or an X11 server, you may instead use:
# docker run --rm --gpus all -it --entrypoint /bin/bash tensorlayer/hyperpose
If you don't have an on-device camera, you can remove `--source=camera` to run inference on the test video. If you are using a server or don't want to see the "imshow" window, please set `--imshow=0`.
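Putting those two flags together, a headless run on the test video might look like the sketch below (the container name `headless-demo` is our own choice; the output path follows Example 1 above):

```shell
# Sketch: headless inference on the bundled test video -- no camera
# (--source=camera omitted) and no GUI window (--imshow=0).
docker run --name headless-demo --gpus all tensorlayer/hyperpose --imshow=0
docker cp headless-demo:/hyperpose/build/output.avi .
docker rm headless-demo
```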
For more details, please check here.