runtime error

9uYXdzLmNvbSIsImlhdCI6MTcxNTM3ODk0MCwiZXhwIjoxNzE1NDY1MzQwLCJpc3MiOiJodHRwczovL2h1Z2dpbmdmYWNlLmNvIn0.tRLWAD_fdw-cqp5W8yFfXZ-YXz8oewdlFVH0_Bh41rI"), fragment: None }, source: hyper::Error(IncompleteMessage) }'), traceback: None } (NoPermits)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 54, in <module>
    main(args)
  File "/home/user/app/demo_watermark.py", line 859, in main
    model, tokenizer, device = load_model(args)
  File "/home/user/app/demo_watermark.py", line 209, in load_model
    model = AutoModelForCausalLM.from_pretrained(args.model_name_or_path, torch_dtype=torch.bfloat16, device_map='auto')
  File "/home/user/.pyenv/versions/3.10.6/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 561, in from_pretrained
    return model_class.from_pretrained(
  File "/home/user/.pyenv/versions/3.10.6/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3264, in from_pretrained
    resolved_archive_file, sharded_metadata = get_checkpoint_shard_files(
  File "/home/user/.pyenv/versions/3.10.6/lib/python3.10/site-packages/transformers/utils/hub.py", line 1038, in get_checkpoint_shard_files
    cached_filename = cached_file(
  File "/home/user/.pyenv/versions/3.10.6/lib/python3.10/site-packages/transformers/utils/hub.py", line 398, in cached_file
    resolved_file = hf_hub_download(
  File "/home/user/.pyenv/versions/3.10.6/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/user/.pyenv/versions/3.10.6/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1492, in hf_hub_download
    http_get(
  File "/home/user/.pyenv/versions/3.10.6/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 532, in http_get
    raise RuntimeError(
RuntimeError: An error occurred while downloading using `hf_********`. Consider disabling HF_HUB_ENABLE_HF_TRANSFER for better error handling.
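The final RuntimeError itself suggests the workaround: disable the `hf_transfer` fast-download path by unsetting `HF_HUB_ENABLE_HF_TRANSFER`. A minimal sketch, assuming the variable is read when `huggingface_hub` is first imported (so it must be set before any `transformers`/`huggingface_hub` import, or exported in the Space's environment settings):

```python
import os

# Disable the accelerated hf_transfer downloader so huggingface_hub falls
# back to its plain HTTP download path, which surfaces clearer errors.
# Must run before importing transformers or huggingface_hub.
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "0"

# Any imports below this point will see the variable already set, e.g.:
# from transformers import AutoModelForCausalLM
```

The download will be slower without `hf_transfer`, but the retry/error handling of the plain HTTP path usually makes transient failures like the `IncompleteMessage` above recoverable or at least diagnosable.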
