chess-littletestmodel

A chess model trained for the LLM Course Challenge.

  • By: MDaytek
  • Params: 790,560
  • Architecture: GPT-2 with custom tokenizer

Usage

The model uses a custom tokenizer. Load it with:

from transformers import GPT2LMHeadModel, AutoConfig
from huggingface_hub import hf_hub_download
import json

config = AutoConfig.from_pretrained("LLM-course/chess-littletestmodel")
model = GPT2LMHeadModel.from_pretrained("LLM-course/chess-littletestmodel", config=config)

# Download the custom vocabulary from the model repo and load it
vocab_path = hf_hub_download("LLM-course/chess-littletestmodel", "vocab.json")
with open(vocab_path) as f:
    vocab = json.load(f)
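Since the tokenizer is custom, you encode and decode by looking tokens up in `vocab` directly. The snippet below is a minimal sketch: the toy `vocab` dict and the assumption that one chess move maps to one token are illustrative, not the model's actual vocabulary (inspect the downloaded `vocab.json` for the real token set).

```python
# Illustrative vocabulary only; the real one comes from vocab.json
vocab = {"<pad>": 0, "<s>": 1, "e4": 2, "e5": 3, "Nf3": 4}
id_to_token = {i: t for t, i in vocab.items()}

def encode(moves, vocab):
    """Map a list of move strings to token ids."""
    return [vocab[m] for m in moves]

def decode(ids, id_to_token):
    """Map token ids back to move strings."""
    return [id_to_token[i] for i in ids]

ids = encode(["e4", "e5", "Nf3"], vocab)
print(ids)                          # [2, 3, 4]
print(decode(ids, id_to_token))     # ['e4', 'e5', 'Nf3']
```

The resulting id list can be wrapped in a tensor and passed to `model.generate` to sample continuation moves, then decoded the same way.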
  • Format: Safetensors
  • Tensor type: F32