While installing TAO, ERROR: nvidia-docker not found

I'm trying to install TAO following the instructions for beginners here: Beginners — Tao Toolkit

I went all the way through the “Installing TAO Launcher” step: bash setup/quickstart_launcher.sh --install

I was able to install, but got this error in the installation output: ERROR: nvidia-docker not found.

I’m running Ubuntu 22.04.5 LTS on Windows 11 (WSL2).

Here is the entire output after running “bash setup/quickstart_launcher.sh --install”:

INFO: Check requirements
INFO: Checking Python installation
INFO: python3 found.
INFO: Python version: Python 3.10.18
INFO: pip3 found.
INFO: Pip version: pip 25.1 from /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages/pip (python 3.10)
INFO: Docker found. Checking additional requirements for docker.
INFO: Checking nvidia-docker2 installation
ERROR: nvidia-docker not found.
INFO: NGC CLI found.
INFO: NGC CLI 3.160.1
INFO: Requirements check satisfied. Installing TAO Toolkit.
By installing the TAO Toolkit CLI, you accept the terms and conditions of this license: https://developer.nvidia.com/tao-toolkit-software-license-agreement
Would you like to continue? (y/n): y
INFO: EULA accepted.
INFO: Installing TAO Toolkit CLI
INFO: Jupyter wasn't found.
INFO:  Installing Jupyter
Collecting jupyter
  Using cached jupyter-1.1.1-py2.py3-none-any.whl.metadata (2.0 kB)
Collecting notebook (from jupyter)
  Using cached notebook-7.4.4-py3-none-any.whl.metadata (10 kB)
Collecting jupyter-console (from jupyter)
  Using cached jupyter_console-6.6.3-py3-none-any.whl.metadata (5.8 kB)
Collecting nbconvert (from jupyter)
  Using cached nbconvert-7.16.6-py3-none-any.whl.metadata (8.5 kB)
Collecting ipykernel (from jupyter)
  Using cached ipykernel-6.29.5-py3-none-any.whl.metadata (6.3 kB)
Collecting ipywidgets (from jupyter)
  Using cached ipywidgets-8.1.7-py3-none-any.whl.metadata (2.4 kB)
Collecting jupyterlab (from jupyter)
  Using cached jupyterlab-4.4.4-py3-none-any.whl.metadata (16 kB)
Collecting comm>=0.1.1 (from ipykernel->jupyter)
  Using cached comm-0.2.2-py3-none-any.whl.metadata (3.7 kB)
Collecting debugpy>=1.6.5 (from ipykernel->jupyter)
  Using cached debugpy-1.8.14-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.3 kB)
Collecting ipython>=7.23.1 (from ipykernel->jupyter)
  Using cached ipython-8.37.0-py3-none-any.whl.metadata (5.1 kB)
Collecting jupyter-client>=6.1.12 (from ipykernel->jupyter)
  Using cached jupyter_client-8.6.3-py3-none-any.whl.metadata (8.3 kB)
Collecting jupyter-core!=5.0.*,>=4.12 (from ipykernel->jupyter)
  Using cached jupyter_core-5.8.1-py3-none-any.whl.metadata (1.6 kB)
Collecting matplotlib-inline>=0.1 (from ipykernel->jupyter)
  Using cached matplotlib_inline-0.1.7-py3-none-any.whl.metadata (3.9 kB)
Collecting nest-asyncio (from ipykernel->jupyter)
  Using cached nest_asyncio-1.6.0-py3-none-any.whl.metadata (2.8 kB)
Collecting packaging (from ipykernel->jupyter)
  Using cached packaging-25.0-py3-none-any.whl.metadata (3.3 kB)
Collecting psutil (from ipykernel->jupyter)
  Using cached psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (22 kB)
Collecting pyzmq>=24 (from ipykernel->jupyter)
  Using cached pyzmq-27.0.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.metadata (6.0 kB)
Collecting tornado>=6.1 (from ipykernel->jupyter)
  Using cached tornado-6.5.1-cp39-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.8 kB)
Collecting traitlets>=5.4.0 (from ipykernel->jupyter)
  Using cached traitlets-5.14.3-py3-none-any.whl.metadata (10 kB)
Collecting decorator (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached decorator-5.2.1-py3-none-any.whl.metadata (3.9 kB)
Collecting exceptiongroup (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached exceptiongroup-1.3.0-py3-none-any.whl.metadata (6.7 kB)
Collecting jedi>=0.16 (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached jedi-0.19.2-py2.py3-none-any.whl.metadata (22 kB)
Collecting pexpect>4.3 (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached pexpect-4.9.0-py2.py3-none-any.whl.metadata (2.5 kB)
Collecting prompt_toolkit<3.1.0,>=3.0.41 (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached prompt_toolkit-3.0.51-py3-none-any.whl.metadata (6.4 kB)
Collecting pygments>=2.4.0 (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached pygments-2.19.2-py3-none-any.whl.metadata (2.5 kB)
Collecting stack_data (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached stack_data-0.6.3-py3-none-any.whl.metadata (18 kB)
Collecting typing_extensions>=4.6 (from ipython>=7.23.1->ipykernel->jupyter)
  Using cached typing_extensions-4.14.1-py3-none-any.whl.metadata (3.0 kB)
Collecting wcwidth (from prompt_toolkit<3.1.0,>=3.0.41->ipython>=7.23.1->ipykernel->jupyter)
  Using cached wcwidth-0.2.13-py2.py3-none-any.whl.metadata (14 kB)
Collecting parso<0.9.0,>=0.8.4 (from jedi>=0.16->ipython>=7.23.1->ipykernel->jupyter)
  Using cached parso-0.8.4-py2.py3-none-any.whl.metadata (7.7 kB)
Collecting python-dateutil>=2.8.2 (from jupyter-client>=6.1.12->ipykernel->jupyter)
  Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl.metadata (8.4 kB)
Collecting platformdirs>=2.5 (from jupyter-core!=5.0.*,>=4.12->ipykernel->jupyter)
  Using cached platformdirs-4.3.8-py3-none-any.whl.metadata (12 kB)
Collecting ptyprocess>=0.5 (from pexpect>4.3->ipython>=7.23.1->ipykernel->jupyter)
  Using cached ptyprocess-0.7.0-py2.py3-none-any.whl.metadata (1.3 kB)
Collecting six>=1.5 (from python-dateutil>=2.8.2->jupyter-client>=6.1.12->ipykernel->jupyter)
  Using cached six-1.17.0-py2.py3-none-any.whl.metadata (1.7 kB)
Collecting widgetsnbextension~=4.0.14 (from ipywidgets->jupyter)
  Using cached widgetsnbextension-4.0.14-py3-none-any.whl.metadata (1.6 kB)
Collecting jupyterlab_widgets~=3.0.15 (from ipywidgets->jupyter)
  Using cached jupyterlab_widgets-3.0.15-py3-none-any.whl.metadata (20 kB)
Collecting async-lru>=1.0.0 (from jupyterlab->jupyter)
  Using cached async_lru-2.0.5-py3-none-any.whl.metadata (4.5 kB)
Collecting httpx>=0.25.0 (from jupyterlab->jupyter)
  Using cached httpx-0.28.1-py3-none-any.whl.metadata (7.1 kB)
Collecting jinja2>=3.0.3 (from jupyterlab->jupyter)
  Using cached jinja2-3.1.6-py3-none-any.whl.metadata (2.9 kB)
Collecting jupyter-lsp>=2.0.0 (from jupyterlab->jupyter)
  Using cached jupyter_lsp-2.2.5-py3-none-any.whl.metadata (1.8 kB)
Collecting jupyter-server<3,>=2.4.0 (from jupyterlab->jupyter)
  Using cached jupyter_server-2.16.0-py3-none-any.whl.metadata (8.5 kB)
Collecting jupyterlab-server<3,>=2.27.1 (from jupyterlab->jupyter)
  Using cached jupyterlab_server-2.27.3-py3-none-any.whl.metadata (5.9 kB)
Collecting notebook-shim>=0.2 (from jupyterlab->jupyter)
  Using cached notebook_shim-0.2.4-py3-none-any.whl.metadata (4.0 kB)
Requirement already satisfied: setuptools>=41.1.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from jupyterlab->jupyter) (78.1.1)
Collecting tomli>=1.2.2 (from jupyterlab->jupyter)
  Using cached tomli-2.2.1-py3-none-any.whl.metadata (10 kB)
Collecting anyio>=3.1.0 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached anyio-4.9.0-py3-none-any.whl.metadata (4.7 kB)
Collecting argon2-cffi>=21.1 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached argon2_cffi-25.1.0-py3-none-any.whl.metadata (4.1 kB)
Collecting jupyter-events>=0.11.0 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached jupyter_events-0.12.0-py3-none-any.whl.metadata (5.8 kB)
Collecting jupyter-server-terminals>=0.4.4 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached jupyter_server_terminals-0.5.3-py3-none-any.whl.metadata (5.6 kB)
Collecting nbformat>=5.3.0 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached nbformat-5.10.4-py3-none-any.whl.metadata (3.6 kB)
Collecting overrides>=5.0 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached overrides-7.7.0-py3-none-any.whl.metadata (5.8 kB)
Collecting prometheus-client>=0.9 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached prometheus_client-0.22.1-py3-none-any.whl.metadata (1.9 kB)
Collecting send2trash>=1.8.2 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached Send2Trash-1.8.3-py3-none-any.whl.metadata (4.0 kB)
Collecting terminado>=0.8.3 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached terminado-0.18.1-py3-none-any.whl.metadata (5.8 kB)
Collecting websocket-client>=1.7 (from jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached websocket_client-1.8.0-py3-none-any.whl.metadata (8.0 kB)
Collecting babel>=2.10 (from jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached babel-2.17.0-py3-none-any.whl.metadata (2.0 kB)
Collecting json5>=0.9.0 (from jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached json5-0.12.0-py3-none-any.whl.metadata (36 kB)
Collecting jsonschema>=4.18.0 (from jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached jsonschema-4.24.0-py3-none-any.whl.metadata (7.8 kB)
Collecting requests>=2.31 (from jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached requests-2.32.4-py3-none-any.whl.metadata (4.9 kB)
Collecting idna>=2.8 (from anyio>=3.1.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached idna-3.10-py3-none-any.whl.metadata (10 kB)
Collecting sniffio>=1.1 (from anyio>=3.1.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached sniffio-1.3.1-py3-none-any.whl.metadata (3.9 kB)
Collecting argon2-cffi-bindings (from argon2-cffi>=21.1->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.7 kB)
Collecting certifi (from httpx>=0.25.0->jupyterlab->jupyter)
  Using cached certifi-2025.7.9-py3-none-any.whl.metadata (2.4 kB)
Collecting httpcore==1.* (from httpx>=0.25.0->jupyterlab->jupyter)
  Using cached httpcore-1.0.9-py3-none-any.whl.metadata (21 kB)
Collecting h11>=0.16 (from httpcore==1.*->httpx>=0.25.0->jupyterlab->jupyter)
  Using cached h11-0.16.0-py3-none-any.whl.metadata (8.3 kB)
Collecting MarkupSafe>=2.0 (from jinja2>=3.0.3->jupyterlab->jupyter)
  Using cached MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.0 kB)
Collecting attrs>=22.2.0 (from jsonschema>=4.18.0->jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached attrs-25.3.0-py3-none-any.whl.metadata (10 kB)
Collecting jsonschema-specifications>=2023.03.6 (from jsonschema>=4.18.0->jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached jsonschema_specifications-2025.4.1-py3-none-any.whl.metadata (2.9 kB)
Collecting referencing>=0.28.4 (from jsonschema>=4.18.0->jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached referencing-0.36.2-py3-none-any.whl.metadata (2.8 kB)
Collecting rpds-py>=0.7.1 (from jsonschema>=4.18.0->jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached rpds_py-0.26.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.2 kB)
Collecting python-json-logger>=2.0.4 (from jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached python_json_logger-3.3.0-py3-none-any.whl.metadata (4.0 kB)
Collecting pyyaml>=5.3 (from jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting rfc3339-validator (from jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached rfc3339_validator-0.1.4-py2.py3-none-any.whl.metadata (1.5 kB)
Collecting rfc3986-validator>=0.1.1 (from jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached rfc3986_validator-0.1.1-py2.py3-none-any.whl.metadata (1.7 kB)
Collecting fqdn (from jsonschema[format-nongpl]>=4.18.0->jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached fqdn-1.5.1-py3-none-any.whl.metadata (1.4 kB)
Collecting isoduration (from jsonschema[format-nongpl]>=4.18.0->jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached isoduration-20.11.0-py3-none-any.whl.metadata (5.7 kB)
Collecting jsonpointer>1.13 (from jsonschema[format-nongpl]>=4.18.0->jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached jsonpointer-3.0.0-py2.py3-none-any.whl.metadata (2.3 kB)
Collecting uri-template (from jsonschema[format-nongpl]>=4.18.0->jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached uri_template-1.3.0-py3-none-any.whl.metadata (8.8 kB)
Collecting webcolors>=24.6.0 (from jsonschema[format-nongpl]>=4.18.0->jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached webcolors-24.11.1-py3-none-any.whl.metadata (2.2 kB)
Collecting beautifulsoup4 (from nbconvert->jupyter)
  Using cached beautifulsoup4-4.13.4-py3-none-any.whl.metadata (3.8 kB)
Collecting bleach!=5.0.0 (from bleach[css]!=5.0.0->nbconvert->jupyter)
  Using cached bleach-6.2.0-py3-none-any.whl.metadata (30 kB)
Collecting defusedxml (from nbconvert->jupyter)
  Using cached defusedxml-0.7.1-py2.py3-none-any.whl.metadata (32 kB)
Collecting jupyterlab-pygments (from nbconvert->jupyter)
  Using cached jupyterlab_pygments-0.3.0-py3-none-any.whl.metadata (4.4 kB)
Collecting mistune<4,>=2.0.3 (from nbconvert->jupyter)
  Using cached mistune-3.1.3-py3-none-any.whl.metadata (1.8 kB)
Collecting nbclient>=0.5.0 (from nbconvert->jupyter)
  Using cached nbclient-0.10.2-py3-none-any.whl.metadata (8.3 kB)
Collecting pandocfilters>=1.4.1 (from nbconvert->jupyter)
  Using cached pandocfilters-1.5.1-py2.py3-none-any.whl.metadata (9.0 kB)
Collecting webencodings (from bleach!=5.0.0->bleach[css]!=5.0.0->nbconvert->jupyter)
  Using cached webencodings-0.5.1-py2.py3-none-any.whl.metadata (2.1 kB)
Collecting tinycss2<1.5,>=1.1.0 (from bleach[css]!=5.0.0->nbconvert->jupyter)
  Using cached tinycss2-1.4.0-py3-none-any.whl.metadata (3.0 kB)
Collecting fastjsonschema>=2.15 (from nbformat>=5.3.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached fastjsonschema-2.21.1-py3-none-any.whl.metadata (2.2 kB)
Collecting charset_normalizer<4,>=2 (from requests>=2.31->jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (35 kB)
Collecting urllib3<3,>=1.21.1 (from requests>=2.31->jupyterlab-server<3,>=2.27.1->jupyterlab->jupyter)
  Using cached urllib3-2.5.0-py3-none-any.whl.metadata (6.5 kB)
Collecting cffi>=1.0.1 (from argon2-cffi-bindings->argon2-cffi>=21.1->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting pycparser (from cffi>=1.0.1->argon2-cffi-bindings->argon2-cffi>=21.1->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached pycparser-2.22-py3-none-any.whl.metadata (943 bytes)
Collecting soupsieve>1.2 (from beautifulsoup4->nbconvert->jupyter)
  Using cached soupsieve-2.7-py3-none-any.whl.metadata (4.6 kB)
Collecting arrow>=0.15.0 (from isoduration->jsonschema[format-nongpl]>=4.18.0->jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached arrow-1.3.0-py3-none-any.whl.metadata (7.5 kB)
Collecting types-python-dateutil>=2.8.10 (from arrow>=0.15.0->isoduration->jsonschema[format-nongpl]>=4.18.0->jupyter-events>=0.11.0->jupyter-server<3,>=2.4.0->jupyterlab->jupyter)
  Using cached types_python_dateutil-2.9.0.20250708-py3-none-any.whl.metadata (1.9 kB)
Collecting executing>=1.2.0 (from stack_data->ipython>=7.23.1->ipykernel->jupyter)
  Using cached executing-2.2.0-py2.py3-none-any.whl.metadata (8.9 kB)
Collecting asttokens>=2.1.0 (from stack_data->ipython>=7.23.1->ipykernel->jupyter)
  Using cached asttokens-3.0.0-py3-none-any.whl.metadata (4.7 kB)
Collecting pure-eval (from stack_data->ipython>=7.23.1->ipykernel->jupyter)
  Using cached pure_eval-0.2.3-py3-none-any.whl.metadata (6.3 kB)
Using cached jupyter-1.1.1-py2.py3-none-any.whl (2.7 kB)
Using cached ipykernel-6.29.5-py3-none-any.whl (117 kB)
Using cached comm-0.2.2-py3-none-any.whl (7.2 kB)
Using cached debugpy-1.8.14-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.6 MB)
Using cached ipython-8.37.0-py3-none-any.whl (831 kB)
Using cached prompt_toolkit-3.0.51-py3-none-any.whl (387 kB)
Using cached jedi-0.19.2-py2.py3-none-any.whl (1.6 MB)
Using cached parso-0.8.4-py2.py3-none-any.whl (103 kB)
Using cached jupyter_client-8.6.3-py3-none-any.whl (106 kB)
Using cached jupyter_core-5.8.1-py3-none-any.whl (28 kB)
Using cached matplotlib_inline-0.1.7-py3-none-any.whl (9.9 kB)
Using cached pexpect-4.9.0-py2.py3-none-any.whl (63 kB)
Using cached platformdirs-4.3.8-py3-none-any.whl (18 kB)
Using cached ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB)
Using cached pygments-2.19.2-py3-none-any.whl (1.2 MB)
Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB)
Using cached pyzmq-27.0.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (853 kB)
Using cached six-1.17.0-py2.py3-none-any.whl (11 kB)
Using cached tornado-6.5.1-cp39-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (443 kB)
Using cached traitlets-5.14.3-py3-none-any.whl (85 kB)
Using cached typing_extensions-4.14.1-py3-none-any.whl (43 kB)
Using cached decorator-5.2.1-py3-none-any.whl (9.2 kB)
Using cached exceptiongroup-1.3.0-py3-none-any.whl (16 kB)
Using cached ipywidgets-8.1.7-py3-none-any.whl (139 kB)
Using cached jupyterlab_widgets-3.0.15-py3-none-any.whl (216 kB)
Using cached widgetsnbextension-4.0.14-py3-none-any.whl (2.2 MB)
Using cached jupyter_console-6.6.3-py3-none-any.whl (24 kB)
Using cached jupyterlab-4.4.4-py3-none-any.whl (12.3 MB)
Using cached jupyter_server-2.16.0-py3-none-any.whl (386 kB)
Using cached jupyterlab_server-2.27.3-py3-none-any.whl (59 kB)
Using cached anyio-4.9.0-py3-none-any.whl (100 kB)
Using cached argon2_cffi-25.1.0-py3-none-any.whl (14 kB)
Using cached async_lru-2.0.5-py3-none-any.whl (6.1 kB)
Using cached babel-2.17.0-py3-none-any.whl (10.2 MB)
Using cached httpx-0.28.1-py3-none-any.whl (73 kB)
Using cached httpcore-1.0.9-py3-none-any.whl (78 kB)
Using cached h11-0.16.0-py3-none-any.whl (37 kB)
Using cached idna-3.10-py3-none-any.whl (70 kB)
Using cached jinja2-3.1.6-py3-none-any.whl (134 kB)
Using cached json5-0.12.0-py3-none-any.whl (36 kB)
Using cached jsonschema-4.24.0-py3-none-any.whl (88 kB)
Using cached attrs-25.3.0-py3-none-any.whl (63 kB)
Using cached jsonschema_specifications-2025.4.1-py3-none-any.whl (18 kB)
Using cached jupyter_events-0.12.0-py3-none-any.whl (19 kB)
Using cached jsonpointer-3.0.0-py2.py3-none-any.whl (7.6 kB)
Using cached jupyter_lsp-2.2.5-py3-none-any.whl (69 kB)
Using cached jupyter_server_terminals-0.5.3-py3-none-any.whl (13 kB)
Using cached MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (20 kB)
Using cached nbconvert-7.16.6-py3-none-any.whl (258 kB)
Using cached mistune-3.1.3-py3-none-any.whl (53 kB)
Using cached bleach-6.2.0-py3-none-any.whl (163 kB)
Using cached tinycss2-1.4.0-py3-none-any.whl (26 kB)
Using cached nbclient-0.10.2-py3-none-any.whl (25 kB)
Using cached nbformat-5.10.4-py3-none-any.whl (78 kB)
Using cached fastjsonschema-2.21.1-py3-none-any.whl (23 kB)
Using cached notebook_shim-0.2.4-py3-none-any.whl (13 kB)
Using cached overrides-7.7.0-py3-none-any.whl (17 kB)
Using cached packaging-25.0-py3-none-any.whl (66 kB)
Using cached pandocfilters-1.5.1-py2.py3-none-any.whl (8.7 kB)
Using cached prometheus_client-0.22.1-py3-none-any.whl (58 kB)
Using cached python_json_logger-3.3.0-py3-none-any.whl (15 kB)
Using cached PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (751 kB)
Using cached referencing-0.36.2-py3-none-any.whl (26 kB)
Using cached requests-2.32.4-py3-none-any.whl (64 kB)
Using cached charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (149 kB)
Using cached urllib3-2.5.0-py3-none-any.whl (129 kB)
Using cached certifi-2025.7.9-py3-none-any.whl (159 kB)
Using cached rfc3986_validator-0.1.1-py2.py3-none-any.whl (4.2 kB)
Using cached rpds_py-0.26.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (383 kB)
Using cached Send2Trash-1.8.3-py3-none-any.whl (18 kB)
Using cached sniffio-1.3.1-py3-none-any.whl (10 kB)
Using cached terminado-0.18.1-py3-none-any.whl (14 kB)
Using cached tomli-2.2.1-py3-none-any.whl (14 kB)
Using cached webcolors-24.11.1-py3-none-any.whl (14 kB)
Using cached webencodings-0.5.1-py2.py3-none-any.whl (11 kB)
Using cached websocket_client-1.8.0-py3-none-any.whl (58 kB)
Using cached argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (86 kB)
Using cached cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (446 kB)
Using cached beautifulsoup4-4.13.4-py3-none-any.whl (187 kB)
Using cached soupsieve-2.7-py3-none-any.whl (36 kB)
Using cached defusedxml-0.7.1-py2.py3-none-any.whl (25 kB)
Using cached fqdn-1.5.1-py3-none-any.whl (9.1 kB)
Using cached isoduration-20.11.0-py3-none-any.whl (11 kB)
Using cached arrow-1.3.0-py3-none-any.whl (66 kB)
Using cached types_python_dateutil-2.9.0.20250708-py3-none-any.whl (17 kB)
Using cached jupyterlab_pygments-0.3.0-py3-none-any.whl (15 kB)
Using cached nest_asyncio-1.6.0-py3-none-any.whl (5.2 kB)
Using cached notebook-7.4.4-py3-none-any.whl (14.3 MB)
Using cached psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (277 kB)
Using cached pycparser-2.22-py3-none-any.whl (117 kB)
Using cached rfc3339_validator-0.1.4-py2.py3-none-any.whl (3.5 kB)
Using cached stack_data-0.6.3-py3-none-any.whl (24 kB)
Using cached asttokens-3.0.0-py3-none-any.whl (26 kB)
Using cached executing-2.2.0-py2.py3-none-any.whl (26 kB)
Using cached pure_eval-0.2.3-py3-none-any.whl (11 kB)
Using cached uri_template-1.3.0-py3-none-any.whl (11 kB)
Using cached wcwidth-0.2.13-py2.py3-none-any.whl (34 kB)
Installing collected packages: webencodings, wcwidth, pure-eval, ptyprocess, fastjsonschema, widgetsnbextension, websocket-client, webcolors, urllib3, uri-template, typing_extensions, types-python-dateutil, traitlets, tornado, tomli, tinycss2, soupsieve, sniffio, six, send2trash, rpds-py, rfc3986-validator, pyzmq, pyyaml, python-json-logger, pygments, pycparser, psutil, prompt_toolkit, prometheus-client, platformdirs, pexpect, parso, pandocfilters, packaging, overrides, nest-asyncio, MarkupSafe, jupyterlab_widgets, jupyterlab-pygments, jsonpointer, json5, idna, h11, fqdn, executing, defusedxml, decorator, debugpy, charset_normalizer, certifi, bleach, babel, attrs, asttokens, terminado, stack_data, rfc3339-validator, requests, referencing, python-dateutil, mistune, matplotlib-inline, jupyter-core, jinja2, jedi, httpcore, exceptiongroup, comm, cffi, beautifulsoup4, async-lru, jupyter-server-terminals, jupyter-client, jsonschema-specifications, ipython, arrow, argon2-cffi-bindings, anyio, jsonschema, isoduration, ipywidgets, ipykernel, httpx, argon2-cffi, nbformat, jupyter-console, nbclient, jupyter-events, nbconvert, jupyter-server, notebook-shim, jupyterlab-server, jupyter-lsp, jupyterlab, notebook, jupyter
Successfully installed MarkupSafe-3.0.2 anyio-4.9.0 argon2-cffi-25.1.0 argon2-cffi-bindings-21.2.0 arrow-1.3.0 asttokens-3.0.0 async-lru-2.0.5 attrs-25.3.0 babel-2.17.0 beautifulsoup4-4.13.4 bleach-6.2.0 certifi-2025.7.9 cffi-1.17.1 charset_normalizer-3.4.2 comm-0.2.2 debugpy-1.8.14 decorator-5.2.1 defusedxml-0.7.1 exceptiongroup-1.3.0 executing-2.2.0 fastjsonschema-2.21.1 fqdn-1.5.1 h11-0.16.0 httpcore-1.0.9 httpx-0.28.1 idna-3.10 ipykernel-6.29.5 ipython-8.37.0 ipywidgets-8.1.7 isoduration-20.11.0 jedi-0.19.2 jinja2-3.1.6 json5-0.12.0 jsonpointer-3.0.0 jsonschema-4.24.0 jsonschema-specifications-2025.4.1 jupyter-1.1.1 jupyter-client-8.6.3 jupyter-console-6.6.3 jupyter-core-5.8.1 jupyter-events-0.12.0 jupyter-lsp-2.2.5 jupyter-server-2.16.0 jupyter-server-terminals-0.5.3 jupyterlab-4.4.4 jupyterlab-pygments-0.3.0 jupyterlab-server-2.27.3 jupyterlab_widgets-3.0.15 matplotlib-inline-0.1.7 mistune-3.1.3 nbclient-0.10.2 nbconvert-7.16.6 nbformat-5.10.4 nest-asyncio-1.6.0 notebook-7.4.4 notebook-shim-0.2.4 overrides-7.7.0 packaging-25.0 pandocfilters-1.5.1 parso-0.8.4 pexpect-4.9.0 platformdirs-4.3.8 prometheus-client-0.22.1 prompt_toolkit-3.0.51 psutil-7.0.0 ptyprocess-0.7.0 pure-eval-0.2.3 pycparser-2.22 pygments-2.19.2 python-dateutil-2.9.0.post0 python-json-logger-3.3.0 pyyaml-6.0.2 pyzmq-27.0.0 referencing-0.36.2 requests-2.32.4 rfc3339-validator-0.1.4 rfc3986-validator-0.1.1 rpds-py-0.26.0 send2trash-1.8.3 six-1.17.0 sniffio-1.3.1 soupsieve-2.7 stack_data-0.6.3 terminado-0.18.1 tinycss2-1.4.0 tomli-2.2.1 tornado-6.5.1 traitlets-5.14.3 types-python-dateutil-2.9.0.20250708 typing_extensions-4.14.1 uri-template-1.3.0 urllib3-2.5.0 wcwidth-0.2.13 webcolors-24.11.1 webencodings-0.5.1 websocket-client-1.8.0 widgetsnbextension-4.0.14
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. Use the --root-user-action option if you know what you are doing and want to suppress this warning.
INFO: Installed jupyter
jupyter_core     : 5.8.1
INFO: TAO Toolkit was not installed.
INFO: Installing TAO Toolkit.
Collecting nvidia-tao
  Using cached nvidia_tao-6.0.0-py3-none-any.whl.metadata (8.1 kB)
Requirement already satisfied: certifi>=2022.12.07 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (2025.7.9)
Collecting chardet==3.0.4 (from nvidia-tao)
  Using cached chardet-3.0.4-py2.py3-none-any.whl.metadata (3.2 kB)
Collecting docker-pycreds==0.4.0 (from nvidia-tao)
  Using cached docker_pycreds-0.4.0-py2.py3-none-any.whl.metadata (1.8 kB)
Collecting docker==4.3.1 (from nvidia-tao)
  Using cached docker-4.3.1-py2.py3-none-any.whl.metadata (3.7 kB)
Collecting idna==2.10 (from nvidia-tao)
  Using cached idna-2.10-py2.py3-none-any.whl.metadata (9.1 kB)
Collecting requests==2.31.0 (from nvidia-tao)
  Using cached requests-2.31.0-py3-none-any.whl.metadata (4.6 kB)
Collecting rich<14.0,>=13.6.0 (from nvidia-tao)
  Using cached rich-13.9.4-py3-none-any.whl.metadata (18 kB)
Requirement already satisfied: six<2.0.0,>=1.15.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (1.17.0)
Collecting tabulate<1.0,>=0.9.0 (from nvidia-tao)
  Using cached tabulate-0.9.0-py3-none-any.whl.metadata (34 kB)
Collecting urllib3<2.0.0,>=1.26.15 (from nvidia-tao)
  Using cached urllib3-1.26.20-py2.py3-none-any.whl.metadata (50 kB)
Collecting websocket-client==0.57.0 (from nvidia-tao)
  Using cached websocket_client-0.57.0-py2.py3-none-any.whl.metadata (7.5 kB)
Requirement already satisfied: charset-normalizer<4,>=2 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from requests==2.31.0->nvidia-tao) (3.4.2)
Collecting markdown-it-py>=2.2.0 (from rich<14.0,>=13.6.0->nvidia-tao)
  Using cached markdown_it_py-3.0.0-py3-none-any.whl.metadata (6.9 kB)
Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from rich<14.0,>=13.6.0->nvidia-tao) (2.19.2)
Requirement already satisfied: typing-extensions<5.0,>=4.0.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from rich<14.0,>=13.6.0->nvidia-tao) (4.14.1)
Collecting mdurl~=0.1 (from markdown-it-py>=2.2.0->rich<14.0,>=13.6.0->nvidia-tao)
  Using cached mdurl-0.1.2-py3-none-any.whl.metadata (1.6 kB)
Using cached nvidia_tao-6.0.0-py3-none-any.whl (36 kB)
Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Using cached docker-4.3.1-py2.py3-none-any.whl (145 kB)
Using cached docker_pycreds-0.4.0-py2.py3-none-any.whl (9.0 kB)
Using cached idna-2.10-py2.py3-none-any.whl (58 kB)
Using cached requests-2.31.0-py3-none-any.whl (62 kB)
Using cached websocket_client-0.57.0-py2.py3-none-any.whl (200 kB)
Using cached rich-13.9.4-py3-none-any.whl (242 kB)
Using cached tabulate-0.9.0-py3-none-any.whl (35 kB)
Using cached urllib3-1.26.20-py2.py3-none-any.whl (144 kB)
Using cached markdown_it_py-3.0.0-py3-none-any.whl (87 kB)
Using cached mdurl-0.1.2-py3-none-any.whl (10.0 kB)
Installing collected packages: chardet, websocket-client, urllib3, tabulate, mdurl, idna, docker-pycreds, requests, markdown-it-py, rich, docker, nvidia-tao
  Attempting uninstall: websocket-client
    Found existing installation: websocket-client 1.8.0
    Uninstalling websocket-client-1.8.0:
      Successfully uninstalled websocket-client-1.8.0
  Attempting uninstall: urllib3
    Found existing installation: urllib3 2.5.0
    Uninstalling urllib3-2.5.0:
      Successfully uninstalled urllib3-2.5.0
  Attempting uninstall: idna
    Found existing installation: idna 3.10
    Uninstalling idna-3.10:
      Successfully uninstalled idna-3.10
  Attempting uninstall: requests
    Found existing installation: requests 2.32.4
    Uninstalling requests-2.32.4:
      Successfully uninstalled requests-2.32.4
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
jupyter-server 2.16.0 requires websocket-client>=1.7, but you have websocket-client 0.57.0 which is incompatible.
Successfully installed chardet-3.0.4 docker-4.3.1 docker-pycreds-0.4.0 idna-2.10 markdown-it-py-3.0.0 mdurl-0.1.2 nvidia-tao-6.0.0 requests-2.31.0 rich-13.9.4 tabulate-0.9.0 urllib3-1.26.20 websocket-client-0.57.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. Use the --root-user-action option if you know what you are doing and want to suppress this warning.
INFO: Installed TAO wheel version: python
INFO: ~/.tao_mounts.json wasn't found. Falling back to obtain mount points and docker configs from ~/.tao_mounts.json.
Please note that this will be deprecated going forward.
Configuration of the TAO Toolkit Instance

task_group:
    model:
        dockers:
            nvidia/tao/tao-toolkit:
                6.0.0-tf2:
                    docker_registry: nvcr.io
                    tasks:
                        1. classification_tf2
                        2. efficientdet_tf2
                6.0.0-pyt:
                    docker_registry: nvcr.io
                    tasks:
                        1. action_recognition
                        2. centerpose
                        3. classification_pyt
                        4. deformable_detr
                        5. dino
                        6. grounding_dino
                        7. mask_grounding_dino
                        8. mask2former
                        9. mal
                        10. mae
                        11. ml_recog
                        12. nvdinov2
                        13. ocdnet
                        14. ocrnet
                        15. optical_inspection
                        16. pointpillars
                        17. pose_classification
                        18. re_identification
                        19. rtdetr
                        20. segformer
                        21. stylegan_xl
                        22. visual_changenet
                5.5.0-pyt:
                    docker_registry: nvcr.io
                    tasks:
                        1. bevfusion
                5.0.0-tf1.15.5:
                    docker_registry: nvcr.io
                    tasks:
                        1. bpnet
                        2. classification_tf1
                        3. converter
                        4. detectnet_v2
                        5. dssd
                        6. efficientdet_tf1
                        7. faster_rcnn
                        8. fpenet
                        9. lprnet
                        10. mask_rcnn
                        11. multitask_classification
                        12. retinanet
                        13. ssd
                        14. unet
                        15. yolo_v3
                        16. yolo_v4
                        17. yolo_v4_tiny
    dataset:
        dockers:
            nvidia/tao/tao-toolkit:
                6.0.0-data-services:
                    docker_registry: nvcr.io
                    tasks:
                        1. augmentation
                        2. auto_label
                        3. annotations
                        4. analytics
    deploy:
        dockers:
            nvidia/tao/tao-toolkit:
                6.0.0-deploy:
                    docker_registry: nvcr.io
                    tasks:
                        1. centerpose
                        2. classification_pyt
                        3. classification_tf1
                        4. classification_tf2
                        5. deformable_detr
                        6. detectnet_v2
                        7. dino
                        8. dssd
                        9. efficientdet_tf1
                        10. efficientdet_tf2
                        11. faster_rcnn
                        12. grounding_dino
                        13. lprnet
                        14. mask2former
                        15. mask_grounding_dino
                        16. mask_rcnn
                        17. mae
                        18. model_agnostic
                        19. ml_recog
                        20. multitask_classification
                        21. ocdnet
                        22. ocrnet
                        23. optical_inspection
                        24. retinanet
                        25. rtdetr
                        26. segformer
                        27. ssd
                        28. trtexec
                        29. unet
                        30. visual_changenet
                        31. yolo_v3
                        32. yolo_v4
                        33. yolo_v4_tiny
format_version: 3.0
toolkit_version: 6.0.0
published_date: 07/11/2025

Then I ran “docker run --runtime=nvidia -it --rm nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt /bin/bash” and got this output:

Unable to find image 'nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt' locally
5.5.0-pyt: Pulling from nvidia/tao/tao-toolkit

Status: Downloaded newer image for nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt

===========================
=== TAO Toolkit PyTorch ===
===========================

NVIDIA Release 5.5.0-PyT (build 88113656)
TAO Toolkit Version 5.5.0

Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES.  All rights reserved.

This container image and its contents are governed by the TAO Toolkit End User License Agreement.
By pulling and using the container, you accept the terms and conditions of this license:
https://developer.nvidia.com/tao-toolkit-software-license-agreement

ERROR: The NVIDIA Driver is present, but CUDA failed to initialize.  GPU functionality will not be available.
   [[ (null) (error 304) ]]

Failed to detect NVIDIA driver version.

NOTE: The SHMEM allocation limit is set to the default of 64MB.  This may be
   insufficient for TAO Toolkit.  NVIDIA recommends the use of the following flags:
   docker run --gpus all --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 ...

Hi @contactmarcelpatrick, I moved your topic to the TAO forum for better support.

This error is typically associated with OS-level issues preventing CUDA from initializing, such as missing drivers, incompatible versions, or improper Docker/NVIDIA integration.

Please install nvidia-docker2. Refer to Error setting up TAO Toolkit - 'nvidia-docker not found' - #8 by Morganh.
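
After installing, a quick way to confirm that Docker can actually reach the GPU (a minimal check, assuming the default runtime setup) is:

# confirm the nvidia runtime is registered with the Docker daemon
docker info --format '{{json .Runtimes}}'

# run a small CUDA container and query the GPU from inside it
docker run --rm --gpus all nvcr.io/nvidia/cuda:12.1.1-base-ubuntu22.04 nvidia-smi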

I followed this post, as suggested in the previous message: Error setting up TAO Toolkit - 'nvidia-docker not found' - #7 by abhilashcashok

I ran:

curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
distribution=$(. /etc/os-release;echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list
sudo apt-get update
sudo apt-get install -y nvidia-docker2
sudo pkill -SIGHUP dockerd

Got this output:

Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
OK
deb https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/$(ARCH) /
#deb https://nvidia.github.io/libnvidia-container/experimental/ubuntu18.04/$(ARCH) /
deb https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/$(ARCH) /
#deb https://nvidia.github.io/nvidia-container-runtime/experimental/ubuntu18.04/$(ARCH) /
deb https://nvidia.github.io/nvidia-docker/ubuntu18.04/$(ARCH) /
Get:1 http://packages.ros.org/ros2/ubuntu jammy InRelease [4,682 B]
Hit:2 https://nvidia.github.io/libnvidia-container/stable/deb/amd64  InRelease
Get:3 https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/amd64  InRelease [1,484 B]
Get:4 https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/amd64  InRelease [1,481 B]
Hit:5 https://download.docker.com/linux/ubuntu jammy InRelease
Get:6 https://nvidia.github.io/nvidia-docker/ubuntu18.04/amd64  InRelease [1,474 B]
Get:7 https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/amd64  Packages [29.2 kB]
Get:8 https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/amd64  Packages [7,416 B]
Get:9 https://nvidia.github.io/nvidia-docker/ubuntu18.04/amd64  Packages [4,488 B]
Ign:10 https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable/22.04  InRelease
Ign:11 https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable:/cri-o:/1.27/22.04  InRelease
Get:12 http://security.ubuntu.com/ubuntu jammy-security InRelease [129 kB]
Err:13 https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable/22.04  Release
  404  Not Found [IP: 195.135.223.226 443]
Err:14 https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable:/cri-o:/1.27/22.04  Release
  404  Not Found [IP: 195.135.223.226 443]
Get:15 http://security.ubuntu.com/ubuntu jammy-security/main amd64 Packages [2,474 kB]
Get:16 http://security.ubuntu.com/ubuntu jammy-security/main Translation-en [370 kB]
Get:17 http://security.ubuntu.com/ubuntu jammy-security/universe amd64 Packages [990 kB]
Hit:18 http://archive.ubuntu.com/ubuntu jammy InRelease
Get:19 http://archive.ubuntu.com/ubuntu jammy-updates InRelease [128 kB]
Get:20 http://archive.ubuntu.com/ubuntu jammy-backports InRelease [127 kB]
Get:21 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 Packages [2,751 kB]
Get:22 http://archive.ubuntu.com/ubuntu jammy-updates/main Translation-en [437 kB]
Get:23 http://archive.ubuntu.com/ubuntu jammy-updates/restricted amd64 Packages [4,018 kB]
Get:24 http://archive.ubuntu.com/ubuntu jammy-updates/restricted Translation-en [725 kB]
Get:25 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 Packages [1,223 kB]
Get:26 http://archive.ubuntu.com/ubuntu jammy-updates/universe Translation-en [302 kB]
Get:27 http://archive.ubuntu.com/ubuntu jammy-updates/multiverse amd64 Packages [59.5 kB]
Get:28 http://archive.ubuntu.com/ubuntu jammy-updates/multiverse Translation-en [14.2 kB]
Reading package lists... Done
W: https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/amd64/InRelease: Key is stored in legacy trusted.gpg keyring (/etc/apt/trusted.gpg), see the DEPRECATION section in apt-key(8) for details.
W: https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/amd64/InRelease: Key is stored in legacy trusted.gpg keyring (/etc/apt/trusted.gpg), see the DEPRECATION section in apt-key(8) for details.
W: https://nvidia.github.io/nvidia-docker/ubuntu18.04/amd64/InRelease: Key is stored in legacy trusted.gpg keyring (/etc/apt/trusted.gpg), see the DEPRECATION section in apt-key(8) for details.
E: The repository 'https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable/22.04  Release' does not have a Release file.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.
E: The repository 'https://download.opensuse.org/repositories/devel:/kubic:/libcontainers:/stable:/cri-o:/1.27/22.04  Release' does not have a Release file.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following NEW packages will be installed:
  nvidia-docker2
0 upgraded, 1 newly installed, 0 to remove and 10 not upgraded.
Need to get 5,128 B of archives.
After this operation, 21.5 kB of additional disk space will be used.
Get:1 https://nvidia.github.io/libnvidia-container/stable/deb/amd64  nvidia-docker2 2.14.0-1 [5,128 B]
Fetched 5,128 B in 0s (36.9 kB/s)
Selecting previously unselected package nvidia-docker2.
(Reading database ... 132806 files and directories currently installed.)
Preparing to unpack .../nvidia-docker2_2.14.0-1_all.deb ...
Unpacking nvidia-docker2 (2.14.0-1) ...
Setting up nvidia-docker2 (2.14.0-1) ...

Then I logged in to the NGC Docker registry following these steps:
" 1. Get an NGC account and API key.

  1. Go to NGC and click the TAO container in the Catalog tab. NGC displays the message “Sign in to access the PULL feature of this repository.”
  2. Enter your email address and click Next, or click Create an Account.
  3. Choose your organization when prompted for Organization/Team.
  4. Click Sign In.
  5. Log in to the NGC Docker registry (nvcr.io) using the command:

docker login nvcr.io

Then enter these credentials:

a. Username: “$oauthtoken”
b. Password: “YOUR_NGC_API_KEY”
Where YOUR_NGC_API_KEY represents the key you generated in step 3."

Got the output: “Login Succeeded”

Then I ran: “bash setup/quickstart_launcher.sh --install”

And still got the error: “ERROR: nvidia-docker not found.”

Here is the full output:

INFO: Check requirements
INFO: Checking Python installation
INFO: python3 found.
INFO: Python version: Python 3.13.5
INFO: pip3 found.
INFO: Pip version: pip 25.1 from /root/miniconda3/lib/python3.13/site-packages/pip (python 3.13)
INFO: Docker found. Checking additional requirements for docker.
INFO: Checking nvidia-docker2 installation
ERROR: nvidia-docker not found.
INFO: NGC CLI found.
INFO: NGC CLI 3.160.1
INFO: Requirements check satisfied. Installing TAO Toolkit.
By installing the TAO Toolkit CLI, you accept the terms and conditions of this license: https://developer.nvidia.com/tao-toolkit-software-license-agreement
Would you like to continue? (y/n): y
INFO: EULA accepted.
INFO: Installing TAO Toolkit CLI
INFO: jupyter installation was found.
Selected Jupyter core packages...
IPython          : 9.4.0
ipykernel        : not installed
ipywidgets       : 8.1.7
jupyter_client   : not installed
jupyter_core     : 5.8.1
jupyter_server   : 2.16.0
jupyterlab       : not installed
nbclient         : not installed
nbconvert        : 7.16.6
nbformat         : 5.10.4
notebook         : 7.4.4
qtconsole        : not installed
traitlets        : 5.14.3
INFO: TAO Toolkit was found
Traceback (most recent call last):
  File "/root/miniconda3/bin/tao", line 5, in <module>
    from nvidia_tao_cli.entrypoint.tao_launcher import main
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/entrypoint/tao_launcher.py", line 23, in <module>
    from nvidia_tao_cli.components.instance_handler.builder import get_launcher
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/components/instance_handler/builder.py", line 24, in <module>
    from nvidia_tao_cli.components.instance_handler.local_instance import LocalInstance
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/components/instance_handler/local_instance.py", line 29, in <module>
    from nvidia_tao_cli.components.docker_handler.docker_handler import (
    ...<2 lines>...
    )
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/components/docker_handler/docker_handler.py", line 29, in <module>
    import docker
  File "/root/miniconda3/lib/python3.13/site-packages/docker/__init__.py", line 2, in <module>
    from .api import APIClient
  File "/root/miniconda3/lib/python3.13/site-packages/docker/api/__init__.py", line 2, in <module>
    from .client import APIClient
  File "/root/miniconda3/lib/python3.13/site-packages/docker/api/client.py", line 8, in <module>
    import websocket
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/__init__.py", line 23, in <module>
    from ._app import WebSocketApp
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_app.py", line 36, in <module>
    from ._core import WebSocket, getdefaulttimeout
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_core.py", line 34, in <module>
    from ._handshake import *
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_handshake.py", line 30, in <module>
    from ._http import *
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_http.py", line 33, in <module>
    from ._url import *
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_url.py", line 27, in <module>
    from six.moves.urllib.parse import urlparse
ModuleNotFoundError: No module named 'six.moves'
INFO:

Again, I’m running inside an Ubuntu 22.04.5 LTS (GNU/Linux 6.6.87.2-microsoft-standard-WSL2 x86_64) environment from my Windows 11 machine.

And by the way, I followed the thread with the exact same issue (Error setting up TAO Toolkit - 'nvidia-docker not found'), and that thread also seems to be still open (not solved).

As mentioned in Error setting up TAO Toolkit - 'nvidia-docker not found' - #22 by Morganh, I can install the TAO launcher successfully in WSL Ubuntu 22.04.
Steps:

  1. I installed a fresh Ubuntu 22.04 in WSL under Windows 11.
    Refer to Install Ubuntu on WSL: Step-by-Step Guide.
  2. I also installed pip3 and unzip. You can refer to the log.
  3. I also added the two lines below (between the info "Installed jupyter" line and the jupyter version check) in setup/quickstart_launcher.sh to fix “jupyter not found”, which is a corner case in WSL.
       info "Installed jupyter"
       echo "export PATH=\"$(PATH):\$HOME/.local/bin\"" >> ~/.bashrc
       source ~/.bashrc
       jupyter --version | grep jupyter_core

Attached the full log:
20250506_install_tao_launcher_in_wsl_ubuntu22_full.txt (374.8 KB)

Can you check my log to compare?

Here is a summary of where I stand (I used Gemini to help me with this).

1. Initial Problem: Nvidia Docker not found

You were installing the NVIDIA TAO Toolkit and received an ERROR: nvidia-docker not found during the quickstart_launcher.sh installation script.


2. Troubleshooting Steps and Solutions

  • Issue 1: Docker GPU Access

    • Problem: The initial diagnosis was that Docker within your Ubuntu (WSL) environment couldn’t access the NVIDIA GPU.
    • Solution Attempt 1: We configured Docker’s daemon.json file (roughly the change sketched after this list). This was unsuccessful and led to a new error: Failed to initialize NVML: GPU access blocked by the operating system.
    • Solution Attempt 2: We updated the WSL kernel using wsl --update in PowerShell and rebooted. This was partially successful, as the nvidia-smi command now worked in Ubuntu, but the Docker test still failed.
  • Issue 2: Incorrect Docker Installation (The Root Cause)

    • Problem: While troubleshooting Docker settings, we discovered you did not have the Docker Desktop for Windows application installed. You had installed Docker directly inside Ubuntu, which cannot work with the Windows GPU driver in a WSL setup.
    • Solution: This was solved with the following steps:
      1. We completely uninstalled the Linux version of Docker from your Ubuntu terminal.
      2. You successfully installed the required Docker Desktop for Windows application.
      3. We configured Docker Desktop by enabling WSL Integration for your Ubuntu environment in Settings > Resources.
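
For context, the daemon.json change from Solution Attempt 1 was roughly the standard NVIDIA runtime registration sketched below (with Docker Desktop, the daemon settings are managed from the Windows side instead, so this file was not the final fix):

# register the nvidia runtime that nvidia-docker2 normally ships in /etc/docker/daemon.json
sudo tee /etc/docker/daemon.json > /dev/null <<'EOF'
{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    }
}
EOF
sudo systemctl restart docker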

3. Final State

  • System Fixed: Your environment is now correctly configured. Your Windows NVIDIA driver, WSL2, and Docker Desktop are all working together properly.
  • Proof of Success: We confirmed the fix by running the test command docker run --rm --gpus all nvcr.io/nvidia/cuda:12.1.1-base-ubuntu22.04 nvidia-smi, which successfully displayed your GPU information from inside a Docker container.
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 560.31.01              Driver Version: 560.81         CUDA Version: 12.6     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 4090 ...    On  |   00000000:01:00.0  On |                  N/A |
| N/A   61C    P8             14W /  150W |    1004MiB /  16376MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+

+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI        PID   Type   Process name                              GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|  No running processes found                                                             |
+-----------------------------------------------------------------------------------------+
  • Current Status: You are now able to run the TAO Toolkit installation script (bash setup/quickstart_launcher.sh --install). The misleading ERROR: nvidia-docker not found still appears but can be safely ignored, as the underlying system is fully functional.

Trying the exercise to generate synthetic data:

  • With that done, I tried to run the example provided in this NVIDIA tutorial: Course | NVIDIA
  • I was able to run the script “./generate_data.sh”
  • It generated the synthetic data image files (“palletjack_data”) saved on my local machine.
  • HOWEVER, when I run “bash quickstart_launcher.sh --install” or “bash quickstart_launcher.sh --upgrade” I still get: “ERROR: nvidia-docker not found.”:
INFO: Checking Python installation
INFO: python3 found.
INFO: Python version: Python 3.13.5
INFO: pip3 found.
INFO: Pip version: pip 25.1 from /root/miniconda3/lib/python3.13/site-packages/pip (python 3.13)
INFO: Docker found. Checking additional requirements for docker.
INFO: Checking nvidia-docker2 installation
ERROR: nvidia-docker not found.
INFO: NGC CLI found.
INFO: NGC CLI 3.160.1
INFO: Requirements check satisfied. Installing TAO Toolkit.
By installing the TAO Toolkit CLI, you accept the terms and conditions of this license: https://developer.nvidia.com/tao-toolkit-software-license-agreement
Would you like to continue? (y/n): y
INFO: EULA accepted.
INFO: Installing TAO Toolkit CLI
INFO: jupyter installation was found.
Selected Jupyter core packages...
IPython          : 9.4.0
ipykernel        : not installed
ipywidgets       : 8.1.7
jupyter_client   : not installed
jupyter_core     : 5.8.1
jupyter_server   : 2.16.0
jupyterlab       : not installed
nbclient         : not installed
nbconvert        : 7.16.6
nbformat         : 5.10.4
notebook         : 7.4.4
qtconsole        : not installed
traitlets        : 5.14.3
INFO: TAO Toolkit was found
Traceback (most recent call last):
  File "/root/miniconda3/bin/tao", line 5, in <module>
    from nvidia_tao_cli.entrypoint.tao_launcher import main
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/entrypoint/tao_launcher.py", line 23, in <module>
    from nvidia_tao_cli.components.instance_handler.builder import get_launcher
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/components/instance_handler/builder.py", line 24, in <module>
    from nvidia_tao_cli.components.instance_handler.local_instance import LocalInstance
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/components/instance_handler/local_instance.py", line 29, in <module>
    from nvidia_tao_cli.components.docker_handler.docker_handler import (
    ...<2 lines>...
    )
  File "/root/miniconda3/lib/python3.13/site-packages/nvidia_tao_cli/components/docker_handler/docker_handler.py", line 29, in <module>
    import docker
  File "/root/miniconda3/lib/python3.13/site-packages/docker/__init__.py", line 2, in <module>
    from .api import APIClient
  File "/root/miniconda3/lib/python3.13/site-packages/docker/api/__init__.py", line 2, in <module>
    from .client import APIClient
  File "/root/miniconda3/lib/python3.13/site-packages/docker/api/client.py", line 8, in <module>
    import websocket
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/__init__.py", line 23, in <module>
    from ._app import WebSocketApp
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_app.py", line 36, in <module>
    from ._core import WebSocket, getdefaulttimeout
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_core.py", line 34, in <module>
    from ._handshake import *
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_handshake.py", line 30, in <module>
    from ._http import *
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_http.py", line 33, in <module>
    from ._url import *
  File "/root/miniconda3/lib/python3.13/site-packages/websocket/_url.py", line 27, in <module>
    from six.moves.urllib.parse import urlparse
ModuleNotFoundError: No module named 'six.moves'
INFO:
  • I also added the two suggested lines to the quickstart_launcher.sh file:
       echo "export PATH=\"\$PATH:\$HOME/.local/bin\"" >> ~/.bashrc
       source ~/.bashrc

Question:

  • Does that mean that the TAO installation and integration were completed successfully?

  • Gemini tells me that “ERROR: nvidia-docker not found.” can be ignored because:
    " You can safely ignore this error. The installation script is outdated and is looking for a deprecated tool (nvidia-docker2). Your system is correctly configured with the modern nvidia-container-toolkit, which we proved when you successfully ran the docker run ... nvidia-smi test. The script is simply not smart enough to recognize the modern setup, but it doesn’t stop the installation."

  • Is this really the case? Can I train machine-learning models in a Jupyter notebook using the generated synthetic data even with this error still present? (I’m referring to the tutorials at: Course | NVIDIA)

Please make sure six is installed for your active Python environment:

pip install six
# or if using conda:
conda install six

On Ubuntu/Debian, you can also install it system-wide with: sudo apt-get install python3-six
You can verify the installation with:

python -m pip show six
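
If helpful, you can also re-run the exact import chain that failed in the traceback above (six.moves, then websocket, then docker) to confirm the fix; this is a minimal check and assumes you run it in the same Python environment as the launcher:

# Should print "imports OK" once six is installed in this environment
python -c "from six.moves.urllib.parse import urlparse; import websocket, docker; print('imports OK')"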

Also, you can try switching to a Python 3.10 environment. I was using Python 3.10, as mentioned in https://forums.developer.nvidia.com/uploads/short-url/sU6rrw71U4qr9Phosk0DnwD2D8S.txt.
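
For reference, a minimal sketch of creating such an environment with conda (the environment name tao-py310 matches the one visible in the later log; adjust it as you like):

# Create and activate a dedicated Python 3.10 environment, then re-run the quick start script
conda create -n tao-py310 python=3.10 -y
conda activate tao-py310
bash setup/quickstart_launcher.sh --install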

Currently, you are installing the tao-launcher.
Alternatively, you can also docker pull the TAO docker image (for example, nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt) and run it directly with docker run.
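
For example, a minimal sketch (the image tag is the 5.5.0-pyt one mentioned above; the -v mount path is only a placeholder for wherever your data lives):

# Pull the TAO container and open a shell inside it with GPU access
docker pull nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt
docker run --runtime=nvidia -it --rm \
    -v /path/to/your/data:/workspace/data \
    nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt /bin/bash

For reference, here is the configuration of the 5.5.0 release: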

    model:
        dockers:
            nvidia/tao/tao-toolkit:
                5.5.0-pyt:
                    docker_registry: nvcr.io
                    tasks:
                        1. action_recognition
                        2. centerpose
                        3. visual_changenet
                        4. deformable_detr
                        5. dino
                        6. grounding_dino
                        7. mask_grounding_dino
                        8. mask2former
                        9. mal
                        10. ml_recog
                        11. ocdnet
                        12. ocrnet
                        13. optical_inspection
                        14. pointpillars
                        15. pose_classification
                        16. re_identification
                        17. classification_pyt
                        18. segformer
                        19. bevfusion
                5.0.0-tf1.15.5:
                    docker_registry: nvcr.io
                    tasks:
                        1. bpnet
                        2. classification_tf1
                        3. converter
                        4. detectnet_v2
                        5. dssd
                        6. efficientdet_tf1
                        7. faster_rcnn
                        8. fpenet
                        9. lprnet
                        10. mask_rcnn
                        11. multitask_classification
                        12. retinanet
                        13. ssd
                        14. unet
                        15. yolo_v3
                        16. yolo_v4
                        17. yolo_v4_tiny
                5.5.0-tf2:
                    docker_registry: nvcr.io
                    tasks:
                        1. classification_tf2
                        2. efficientdet_tf2
    dataset:
        dockers:
            nvidia/tao/tao-toolkit:
                5.5.0-data-services:
                    docker_registry: nvcr.io
                    tasks:
                        1. augmentation
                        2. auto_label
                        3. annotations
                        4. analytics
    deploy:
        dockers:
            nvidia/tao/tao-toolkit:
                5.5.0-deploy:
                    docker_registry: nvcr.io
                    tasks:
                        1. visual_changenet
                        2. centerpose
                        3. classification_pyt
                        4. classification_tf1
                        5. classification_tf2
                        6. deformable_detr
                        7. detectnet_v2
                        8. dino
                        9. dssd
                        10. efficientdet_tf1
                        11. efficientdet_tf2
                        12. faster_rcnn
                        13. grounding_dino
                        14. mask_grounding_dino
                        15. mask2former
                        16. lprnet
                        17. mask_rcnn
                        18. ml_recog
                        19. multitask_classification
                        20. ocdnet
                        21. ocrnet
                        22. optical_inspection
                        23. retinanet
                        24. segformer
                        25. ssd
                        26. trtexec
                        27. unet
                        28. yolo_v3
                        29. yolo_v4
                        30. yolo_v4_tiny
format_version: 3.0
toolkit_version: 5.5.0
published_date: 08/26/2024

I installed six. Now after I run bash quickstart_launcher.sh --install or --upgrade I get:

INFO: Check requirements
INFO: Checking Python installation
INFO: python3 found.
INFO: Python version: Python 3.10.18
INFO: pip3 found.
INFO: Pip version: pip 25.1 from /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages/pip (python 3.10)
INFO: Docker found. Checking additional requirements for docker.
INFO: Checking nvidia-docker2 installation
ERROR: nvidia-docker not found.
INFO: NGC CLI found.
INFO: NGC CLI 3.160.1
INFO: Requirements check satisfied. Installing TAO Toolkit.
By installing the TAO Toolkit CLI, you accept the terms and conditions of this license: https://developer.nvidia.com/tao-toolkit-software-license-agreement
Would you like to continue? (y/n): y
INFO: EULA accepted.
INFO: Installing TAO Toolkit CLI
INFO: jupyter installation was found.
Selected Jupyter core packages...
IPython          : 8.37.0
ipykernel        : 6.29.5
ipywidgets       : 8.1.7
jupyter_client   : 8.6.3
jupyter_core     : 5.8.1
jupyter_server   : 2.16.0
jupyterlab       : 4.4.4
nbclient         : 0.10.2
nbconvert        : 7.16.6
nbformat         : 5.10.4
notebook         : 7.4.4
qtconsole        : not installed
traitlets        : 5.14.3
INFO: TAO Toolkit was found
INFO: Upgrading installed nvidia-tao to the latest version.
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: nvidia-tao in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (6.0.0)
Requirement already satisfied: certifi>=2022.12.07 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (2025.7.9)
Requirement already satisfied: chardet==3.0.4 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (3.0.4)
Requirement already satisfied: docker-pycreds==0.4.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (0.4.0)
Requirement already satisfied: docker==4.3.1 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (4.3.1)
Requirement already satisfied: idna==2.10 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (2.10)
Requirement already satisfied: requests==2.31.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (2.31.0)
Requirement already satisfied: rich<14.0,>=13.6.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (13.9.4)
Requirement already satisfied: six<2.0.0,>=1.15.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (1.17.0)
Requirement already satisfied: tabulate<1.0,>=0.9.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (0.9.0)
Requirement already satisfied: urllib3<2.0.0,>=1.26.15 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (1.26.20)
Requirement already satisfied: websocket-client==0.57.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from nvidia-tao) (0.57.0)
Requirement already satisfied: charset-normalizer<4,>=2 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from requests==2.31.0->nvidia-tao) (3.4.2)
Requirement already satisfied: markdown-it-py>=2.2.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from rich<14.0,>=13.6.0->nvidia-tao) (3.0.0)
Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from rich<14.0,>=13.6.0->nvidia-tao) (2.19.2)
Requirement already satisfied: typing-extensions<5.0,>=4.0.0 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from rich<14.0,>=13.6.0->nvidia-tao) (4.14.1)
Requirement already satisfied: mdurl~=0.1 in /root/miniconda3/envs/tao-py310/lib/python3.10/site-packages (from markdown-it-py>=2.2.0->rich<14.0,>=13.6.0->nvidia-tao) (0.1.2)
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. Use the --root-user-action option if you know what you are doing and want to suppress this warning.
INFO: Configuration of the TAO Toolkit Instance

task_group:
    model:
        dockers:
            nvidia/tao/tao-toolkit:
                6.0.0-tf2:
                    docker_registry: nvcr.io
                    tasks:
                        1. classification_tf2
                        2. efficientdet_tf2
                6.0.0-pyt:
                    docker_registry: nvcr.io
                    tasks:
                        1. action_recognition
                        2. centerpose
                        3. classification_pyt
                        4. deformable_detr
                        5. dino
                        6. grounding_dino
                        7. mask_grounding_dino
                        8. mask2former
                        9. mal
                        10. mae
                        11. ml_recog
                        12. nvdinov2
                        13. ocdnet
                        14. ocrnet
                        15. optical_inspection
                        16. pointpillars
                        17. pose_classification
                        18. re_identification
                        19. rtdetr
                        20. segformer
                        21. stylegan_xl
                        22. visual_changenet
                5.5.0-pyt:
                    docker_registry: nvcr.io
                    tasks:
                        1. bevfusion
                5.0.0-tf1.15.5:
                    docker_registry: nvcr.io
                    tasks:
                        1. bpnet
                        2. classification_tf1
                        3. converter
                        4. detectnet_v2
                        5. dssd
                        6. efficientdet_tf1
                        7. faster_rcnn
                        8. fpenet
                        9. lprnet
                        10. mask_rcnn
                        11. multitask_classification
                        12. retinanet
                        13. ssd
                        14. unet
                        15. yolo_v3
                        16. yolo_v4
                        17. yolo_v4_tiny
    dataset:
        dockers:
            nvidia/tao/tao-toolkit:
                6.0.0-data-services:
                    docker_registry: nvcr.io
                    tasks:
                        1. augmentation
                        2. auto_label
                        3. annotations
                        4. analytics
    deploy:
        dockers:
            nvidia/tao/tao-toolkit:
                6.0.0-deploy:
                    docker_registry: nvcr.io
                    tasks:
                        1. centerpose
                        2. classification_pyt
                        3. classification_tf1
                        4. classification_tf2
                        5. deformable_detr
                        6. detectnet_v2
                        7. dino
                        8. dssd
                        9. efficientdet_tf1
                        10. efficientdet_tf2
                        11. faster_rcnn
                        12. grounding_dino
                        13. lprnet
                        14. mask2former
                        15. mask_grounding_dino
                        16. mask_rcnn
                        17. mae
                        18. model_agnostic
                        19. ml_recog
                        20. multitask_classification
                        21. ocdnet
                        22. ocrnet
                        23. optical_inspection
                        24. retinanet
                        25. rtdetr
                        26. segformer
                        27. ssd
                        28. trtexec
                        29. unet
                        30. visual_changenet
                        31. yolo_v3
                        32. yolo_v4
                        33. yolo_v4_tiny
format_version: 3.0
toolkit_version: 6.0.0
published_date: 07/11/2025

The “ERROR: nvidia-docker not found.” message is still there.

How do I fix “ERROR: nvidia-docker not found.”?

I wonder if it has to do with the environment I’m running quickstart_launcher.sh from.
I’m running it from: “([my conda env with python 3.10]) root@[mypc]:/mnt/c/Users/[myuser]/tao_tutorials/setup#”
Or do I need to run “docker run --runtime=nvidia -it --rm nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt /bin/bash” and then download miniconda, Python 3.10, and quickstart_launcher.sh inside it and run everything from in there?

I’m having a hard time piecing together what I need to do inside each environment: my Ubuntu env vs. the conda env vs. the NVIDIA docker env, etc.

You can ignore this error. From the log, you have already installed the tao-launcher.

There are two separate ways to run TAO. One is using the tao-launcher. The other is using docker run. If you run with the above docker run command, there is no need to install miniconda, Python, quickstart_launcher.sh, etc. Just run everything inside the docker container.
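
To make the two options concrete, here is a minimal sketch (the -v mount path is a hypothetical placeholder; point it at wherever your tao_tutorials or palletjack_data folders live):

# Option 1: use the tao-launcher you already installed
# (should print the launcher usage if it is on your PATH)
tao --help

# Option 2: work directly inside the TAO container; no miniconda, Python,
# or quickstart_launcher.sh installation is needed inside it
docker run --runtime=nvidia -it --rm \
    -v /mnt/c/Users/[myuser]/tao_tutorials:/workspace/tao_tutorials \
    nvcr.io/nvidia/tao/tao-toolkit:5.5.0-pyt /bin/bash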

OK, thanks for letting me know. It would be helpful to remove this error message in the next update, or to make it a warning if it is not critical.

The local_train.ipynb instructions tell us to “Setup TAO via Docker container”, so I guess we must run TAO via Docker for this exercise? (Synthetic Data Generation for Perception Model Training in Isaac Sim > Fine-Tuning and Validating an AI Perception Model > Lecture: Training a Model With Synthetic Data)

Also, in local_train.ipynb, the prerequisite link “instructions” is broken.

This is not the official notebook from TAO. The official TAO notebooks are released in GitHub - NVIDIA/tao_tutorials: Quick start scripts and tutorial notebooks to get started with TAO Toolkit (see the clone sketch below). Thanks!
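
If you want the official notebooks, a minimal way to fetch them (repository name taken from the link above):

# Clone the official TAO quick start scripts and tutorial notebooks
git clone https://github.com/NVIDIA/tao_tutorials.git
cd tao_tutorials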