GPU Memory Usage Not Visible in nvidia-smi

Hi everyone,

I’m running NVIDIA Thor with Ubuntu 24.04.3, JetPack 7, and CUDA 13.0. When I run GPU-intensive Python applications (inference scripts), the nvidia-smi output lists the Python process under GPU processes, but the GPU memory column consistently shows 0 MiB, even though the GPU utilization percentage shows activity.
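
A stripped-down version of the kind of workload involved looks roughly like this (sketched with PyTorch here; the real scripts are larger inference pipelines):

import torch

# Allocate a sizeable tensor on the GPU; this memory should be attributable to the process.
x = torch.randn(4096, 4096, device="cuda")
print(f"allocated by this process: {torch.cuda.memory_allocated() / 2**20:.0f} MiB")

# Keep the GPU busy so the utilization column shows activity while nvidia-smi is watched.
for _ in range(10000):
    y = x @ x
torch.cuda.synchronize()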

System Info:

  • Device: NVIDIA Thor

  • OS: Ubuntu 24.04.3

  • JetPack: 7

  • CUDA: 13.0

Has anyone encountered this issue? Is this expected behavior for Thor, or is there a configuration step I’m missing? (see the attachment for reference)

Any guidance would be appreciated.

Thor uses coherent (unified) memory: $ free shows everything you have, shared between the CPU and GPU, rather than being partitioned off for the GPU the way it is on a PC with a discrete card.
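
For example, the totals should roughly line up when queried from both sides (a quick sketch, assuming a CUDA-enabled PyTorch build; any CUDA free/total memory query behaves the same way):

import torch

# With coherent (unified) memory, the "device" total seen by CUDA is backed by
# the same pool the OS reports, not a separate GPU partition.
free_b, total_b = torch.cuda.mem_get_info()
print(f"CUDA: {total_b / 2**30:.1f} GiB total, {free_b / 2**30:.1f} GiB free")

with open("/proc/meminfo") as f:
    mem_total_kb = int(next(line for line in f if line.startswith("MemTotal")).split()[1])
print(f"OS:   {mem_total_kb / 2**20:.1f} GiB total")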

Hi,

The memory usage in your log shows “Not Supported”.
Since nvidia-smi has limited functionality on Jetson, please try tegrastats instead to monitor the system memory.

$ sudo tegrastats
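
If you need the numbers programmatically, the RAM field of the tegrastats output can be parsed, for example (a rough sketch; the assumed "RAM used/totalMB" layout may differ between releases, and tegrastats may need to be run with sudo):

import re
import subprocess

# Stream tegrastats and extract the RAM used/total figures from each line.
pattern = re.compile(r"RAM (\d+)/(\d+)MB")
proc = subprocess.Popen(["tegrastats"], stdout=subprocess.PIPE, text=True)
try:
    for line in proc.stdout:
        match = pattern.search(line)
        if match:
            used_mb, total_mb = map(int, match.groups())
            print(f"RAM: {used_mb} / {total_mb} MB")
except KeyboardInterrupt:
    proc.terminate()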

Thanks.

I really hope this issue can be resolved in an upcoming Jetson Linux (JetPack) release. Many standalone AI applications rely on nvidia-smi for GPU information, and a broken nvidia-smi would make porting them to Thor quite challenging.

There has been no update from you for a while, so we assume this is no longer an issue.
Hence, we are closing this topic. If you need further support, please open a new one.
Thanks.

Hi,

Would you mind sharing what kind of error you encountered?
nvidia-smi is expected to work, but with limited functionality compared to the desktop version.

Do you mean that a non-supported function causes an issue when porting an application to Thor?
If so, could you share which function/command you need so we can discuss this internally?
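
For reference, when an NVML query is not implemented on a platform it surfaces as an error rather than data, so it can be caught and handled during porting. A rough sketch, assuming the pynvml (nvidia-ml-py) bindings:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Queries the platform's NVML backend does not implement raise
# NVMLError_NotSupported instead of returning data.
try:
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"used {info.used / 2**20:.0f} MiB of {info.total / 2**20:.0f} MiB")
except pynvml.NVMLError_NotSupported:
    print("device memory info is not supported on this platform")

pynvml.nvmlShutdown()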

Thanks.