Instructions to use guan-wang/leta with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use guan-wang/leta with Transformers:
```python
# Load model directly (class name as shown on the model page)
from transformers import GenericExplainer

model = GenericExplainer.from_pretrained("guan-wang/leta", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
- Xet hash: 8bc7e958cbb3a0c88198a4314f9381dfe1895600da396dd197f580cce1806a42
- Size of remote file: 1.33 GB
- SHA256: 926569704edead3507819ad112842b84d186cec1878a8b38597ed3e30072dab0