Runtime error

None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

tokenizer_config.json: 100%|██████████| 1.23k/1.23k [00:00<00:00, 7.44MB/s]
tokenizer.model: 100%|██████████| 968k/968k [00:00<00:00, 21.6MB/s]
tokenizer.json: 100%|██████████| 2.85M/2.85M [00:00<00:00, 32.2MB/s]
added_tokens.json: 100%|██████████| 21.0/21.0 [00:00<00:00, 171kB/s]
special_tokens_map.json: 100%|██████████| 552/552 [00:00<00:00, 4.79MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 6, in <module>
    model = AutoModelForCausalLM.from_pretrained("ai4bharat/Airavata")
  File "/home/user/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1304, in __getattribute__
    requires_backends(cls, cls._backends)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1292, in requires_backends
    raise ImportError("".join(failed))
ImportError: AutoModelForCausalLM requires the PyTorch library but it was not found in your environment. Checkout the instructions on the installation page: https://pytorch.org/get-started/locally/ and follow the ones that match your environment.
Please note that you may need to restart your runtime after installation.
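The log shows that transformers itself is installed and the tokenizer files download fine, but no deep-learning backend (PyTorch, TensorFlow, or Flax) is present in the Space's environment, so AutoModelForCausalLM cannot be instantiated. On a Hugging Face Space the usual remedy is to list torch (alongside transformers) in the Space's requirements.txt and let the Space rebuild. Below is a minimal sketch of what app.py around line 6 of the traceback presumably looks like once PyTorch is available; the file's actual contents are not visible here, so everything other than the model id and the from_pretrained call is an assumption for illustration.

    # Hypothetical reconstruction of the failing app.py, assuming torch has
    # been added to requirements.txt so a backend is available at import time.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "ai4bharat/Airavata"

    # The tokenizer downloads in the log above already succeed without a backend.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

    # This is the call that raised ImportError while PyTorch was missing.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    model = model.to("cuda" if torch.cuda.is_available() else "cpu")

After editing requirements.txt, the Space has to be rebuilt (or the runtime restarted, as the error message itself notes) for the new dependency to take effect.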
