runtime error
Exit code: 1. Reason: Loading tokenizer and model...
tokenizer_config.json: 100%|██████████| 1.29k/1.29k [00:00<00:00, 10.5MB/s]
vocab.json: 100%|██████████| 2.78M/2.78M [00:00<00:00, 3.84MB/s]
merges.txt: 100%|██████████| 1.67M/1.67M [00:00<00:00, 49.9MB/s]
tokenizer.json: 100%|██████████| 7.03M/7.03M [00:00<00:00, 67.1MB/s]
config.json: 100%|██████████| 661/661 [00:00<00:00, 3.38MB/s]
model.safetensors: 100%|█████████▉| 1.24G/1.24G [00:02<00:00, 422MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 16, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_name).to(device)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4333, in from_pretrained
    model_init_context = cls.get_init_context(is_quantized, _is_ds_init_called)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3736, in get_init_context
    init_contexts = [no_init_weights(), init_empty_weights()]
NameError: name 'init_empty_weights' is not defined
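For context (this note is not part of the original log): `init_empty_weights` is provided by the `accelerate` package, and `transformers` only binds that name when `accelerate` is importable, so a `NameError` at this point usually means `accelerate` is missing (or too old) in the container's environment. A minimal pre-flight check, assuming a pip-managed environment such as a Hugging Face Space with a `requirements.txt`:

```python
import importlib.util
import sys


def has_accelerate() -> bool:
    # transformers imports init_empty_weights from accelerate only when
    # accelerate is installed; without it, from_pretrained can raise the
    # NameError shown in the traceback above.
    return importlib.util.find_spec("accelerate") is not None


if not has_accelerate():
    # Hypothetical remedy for a Space: add `accelerate` to
    # requirements.txt (or run `pip install accelerate`) and restart.
    print("accelerate is not installed; add it to requirements.txt",
          file=sys.stderr)
```

Running this check before calling `AutoModelForCausalLM.from_pretrained` surfaces the missing dependency with a clear message instead of a deep traceback.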