runtime error

model-00001-of-00002.safetensors: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‰| 5.00G/5.00G [00:34<00:00, 145MB/s]
model-00002-of-00002.safetensors: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‰| 564M/564M [00:02<00:00, 253MB/s]
Downloading shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:37<00:00, 18.55s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 40, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32, device_map="auto", trust_remote_code=True)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 566, in from_pretrained
    return model_class.from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3606, in from_pretrained
    no_split_modules = model._get_no_split_modules(device_map)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1690, in _get_no_split_modules
    raise ValueError(
ValueError: PhiForCausalLM does not support `device_map='auto'`. To implement support, the model class needs to implement the `_no_split_modules` attribute.
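The traceback says the installed transformers release raises whenever `device_map="auto"` is passed for a model class (here `PhiForCausalLM`) that does not define `_no_split_modules`, which accelerate needs to decide where it may split the model across devices. A minimal sketch of one possible fix, assuming the app controls its own `from_pretrained` call: drop `device_map="auto"` and move the loaded model to a device manually. (Upgrading transformers may also resolve this, since later releases add `_no_split_modules` to the Phi model class, but that is not shown in this log.)

```python
def build_load_kwargs():
    # device_map="auto" requires the model class to define `_no_split_modules`,
    # which PhiForCausalLM lacks in this transformers version -- so omit it.
    # transformers also accepts torch_dtype as a string, avoiding a torch
    # import at this point.
    return dict(torch_dtype="float32", trust_remote_code=True)


def load_model(model_id):
    # Imports kept local so the kwargs helper above can be used without
    # transformers installed.
    import torch
    from transformers import AutoModelForCausalLM

    model = AutoModelForCausalLM.from_pretrained(model_id, **build_load_kwargs())
    # Place the whole model on one device manually instead of letting
    # accelerate shard it across devices.
    return model.to("cuda" if torch.cuda.is_available() else "cpu")
```

The trade-off is that without `device_map="auto"` the whole model must fit on the single target device; for a ~5.5 GB float32 checkpoint that usually means enough free CPU RAM or a GPU with sufficient memory.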
