transformers 4.22.0 causes integration issues #5150
Closed
Description
The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0%| | 0/1 [00:00<?, ?it/s]
100%|██████████| 1/1 [00:00<00:00, 3.10it/s]
Moving 1 files to the new cache system
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "/opt/conda/lib/python3.8/site-packages/torch/cuda/__init__.py", line 217, in _lazy_init
torch._C._cuda_init()
RuntimeError: No CUDA GPUs are available
1.13.0a0+d321be6
0 of GPUs available
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/opt/conda/lib/python3.8/site-packages/torch/cuda/__init__.py", line 217, in _lazy_init
torch._C._cuda_init()
RuntimeError: No CUDA GPUs are available
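The traceback above comes from calling into `torch._C._cuda_init()` on a machine where no GPU is exposed, which raises `RuntimeError` instead of failing softly. A minimal sketch of a workaround, assuming PyTorch is installed: probe availability with `torch.cuda.is_available()` (which returns `False` rather than raising) and fall back to CPU.

```python
# Sketch of a defensive device check, assuming PyTorch is installed.
# torch.cuda.is_available() returns False on GPU-less hosts instead of
# raising the RuntimeError seen in the traceback above.
import torch

def pick_device() -> torch.device:
    # Only touch the CUDA runtime when a device is actually visible.
    if torch.cuda.is_available() and torch.cuda.device_count() > 0:
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
print(f"Using device: {device}")
```

On the container from this report (0 GPUs visible), this would select `cpu` instead of crashing during import-time initialization.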