How to use from the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Symbol-LLM/Symbol-LLM-13B-Instruct")
```

```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Symbol-LLM/Symbol-LLM-13B-Instruct")
model = AutoModelForCausalLM.from_pretrained("Symbol-LLM/Symbol-LLM-13B-Instruct")
```
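Once the tokenizer and model are loaded, text can be generated as in the minimal sketch below. The prompt and generation settings are illustrative assumptions, not an official prompt template or recommended decoding configuration for Symbol-LLM.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Symbol-LLM/Symbol-LLM-13B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative prompt; adapt to your symbolic task of interest
prompt = "Translate the following sentence into first-order logic: All birds can fly."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding, capped at 128 new tokens
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For a 13B model, loading in half precision (e.g. `torch_dtype=torch.float16` with a GPU) substantially reduces memory use.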
Quick Links

Symbol-LLM: Towards Foundational Symbol-centric Interface for Large Language Models

Paper Link: https://arxiv.org/abs/2311.09278

Project Page: https://xufangzhi.github.io/symbol-llm-page/

πŸ”₯ News

  • 🔥🔥🔥 Symbol-LLM has been accepted to ACL 2024!

  • 🔥🔥🔥 The Symbol-LLM series models (7B / 13B) are now public.

Note

The work is under review.

The symbolic data collection will be made public soon.

Citation

If you find this work helpful, please cite the paper.

```bibtex
@article{xu2023symbol,
  title={Symbol-LLM: Towards Foundational Symbol-centric Interface For Large Language Models},
  author={Xu, Fangzhi and Wu, Zhiyong and Sun, Qiushi and Ren, Siyu and Yuan, Fei and Yuan, Shuai and Lin, Qika and Qiao, Yu and Liu, Jun},
  journal={arXiv preprint arXiv:2311.09278},
  year={2023}
}
```