deepgrove-team phh committed
Commit 5d836f3 · verified · 1 Parent(s): 94a69f3

Fix model path in the transformers example code (#1)

- Fix model path in the transformers example code (d9fa61dd6411ef9e8c55d2bb929a5db441fd3f97)


Co-authored-by: Pierre-Hugues Husson <phh@users.noreply.huggingface.co>

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -32,8 +32,8 @@ Bonsai can be easily used through the Huggingface Transformers library. However,
 ```{python}
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
-tokenizer = AutoTokenizer.from_pretrained("hespere-ai/Bonsai", trust_remote_code=True)
-model = AutoModelForCausalLM.from_pretrained("hespere-ai/Bonsai", trust_remote_code=True)
+tokenizer = AutoTokenizer.from_pretrained("deepgrove/Bonsai", trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained("deepgrove/Bonsai", trust_remote_code=True)
 text = "What is the capital of France?"
 inputs = tokenizer(text, return_tensors="pt")
 outputs = model.generate(**inputs, max_length=100)