
Conversation

@nattse (Contributor) commented Jan 20, 2025

Fixes #2840, where get_detector_inference_runner was unable to get the device from model_cfg, leaving all detector inference to run on the CPU.
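For context, a minimal sketch of the intended behavior, not the actual DeepLabCut implementation: when no device is passed explicitly, fall back to the device recorded in the model config instead of silently defaulting to the CPU. The helper name `resolve_device` and the config key `"device"` are assumptions for illustration.

```python
# Sketch only: illustrates resolving the device from the model config when None.
from typing import Optional

import torch


def resolve_device(model_cfg: dict, device: Optional[str] = None) -> str:
    """Return an explicit device string, preferring the caller's choice."""
    if device is not None:
        return device
    # Fall back to the device stored in the model configuration, if any.
    cfg_device = model_cfg.get("device")
    if cfg_device is not None and cfg_device != "auto":
        return cfg_device
    # Last resort: use a GPU when one is available, otherwise the CPU.
    return "cuda" if torch.cuda.is_available() else "cpu"


# Example: resolve_device({"device": "cuda:0"}) -> "cuda:0"
#          resolve_device({}, device="cpu")     -> "cpu"
```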

@n-poulsen (Contributor) left a comment

Thanks @nattse!

@n-poulsen merged commit db23d67 into DeepLabCut:pytorch_dlc on Jan 21, 2025
@n-poulsen changed the title from "Update utils.py" to "get_detector_inference_runner: resolve the device from the model config when None" on Jan 21, 2025