Instructions to use zai-org/chatglm2-6b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use zai-org/chatglm2-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("zai-org/chatglm2-6b", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
You may consider adding `ignore_mismatched_sizes=True` in the model `from_pretrained` method.
#56
by EthanMiao - opened
```
ERROR 2023-07-17 11:38:27,049-1d: Error(s) in loading state_dict for BertModel:
size mismatch for embeddings.word_embeddings.weight: copying a param with shape torch.Size([21128, 1024]) from checkpoint, the shape in current model is torch.Size([21128, 768]).
```
Has anyone else run into this problem?
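The log says the checkpoint stores the word-embedding weight with hidden size 1024 while the instantiated model expects 768, so `load_state_dict` refuses to copy it (note it names `BertModel`, not ChatGLM, which suggests a mismatched model class or config rather than a corrupt file). Below is a minimal sketch in plain PyTorch — not the Transformers internals — that reproduces the same failure on a toy embedding and then shows roughly what `ignore_mismatched_sizes=True` does: drop checkpoint entries whose shapes disagree and leave those parameters at their fresh initialization. The helper `load_ignoring_mismatches` is a hypothetical name for illustration.

```python
import torch
import torch.nn as nn

# Toy stand-in for the layer in the error message:
# checkpoint saved with hidden size 1024, current model built with 768.
model = nn.Embedding(21128, 768)
checkpoint = {"weight": torch.zeros(21128, 1024)}

try:
    model.load_state_dict(checkpoint)
except RuntimeError as err:
    # Same failure mode as the quoted log: shape mismatches always raise,
    # even with strict=False.
    print("size mismatch" in str(err))  # True

# Roughly what ignore_mismatched_sizes=True does inside Transformers:
# filter out entries whose shape disagrees with the current model, load
# the rest non-strictly, and report what was skipped.
def load_ignoring_mismatches(module, state_dict):
    own = module.state_dict()
    filtered = {k: v for k, v in state_dict.items()
                if k in own and v.shape == own[k].shape}
    module.load_state_dict(filtered, strict=False)
    return sorted(set(own) - set(filtered))  # params left at random init

skipped = load_ignoring_mismatches(model, checkpoint)
print(skipped)  # ['weight'] — the mismatched embedding was skipped
```

Keep in mind that any parameter skipped this way stays randomly initialized, so the flag silences the error but does not recover the checkpoint weights; if the shapes should have matched, the real fix is loading the checkpoint with the model class and config it was saved from.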