How to use harshit36/NOVA-Verse with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="harshit36/NOVA-Verse")

# Load model directly
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("harshit36/NOVA-Verse", dtype="auto")

How to use harshit36/NOVA-Verse with vLLM:
# Install vLLM from pip:
pip install vllm
# Start the vLLM server:
vllm serve "harshit36/NOVA-Verse"
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "harshit36/NOVA-Verse",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
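The same OpenAI-compatible endpoint can also be called from Python. A minimal sketch using only the standard library, assuming the vLLM server above is running on localhost:8000 (the helper names are illustrative, not part of any library):

```python
import json
from urllib import request

def build_completion_request(prompt: str) -> dict:
    # Mirrors the curl payload shown above.
    return {
        "model": "harshit36/NOVA-Verse",
        "prompt": prompt,
        "max_tokens": 512,
        "temperature": 0.5,
    }

def complete(prompt: str, base_url: str = "http://localhost:8000") -> str:
    payload = json.dumps(build_completion_request(prompt)).encode("utf-8")
    req = request.Request(
        f"{base_url}/v1/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return the generated text in choices[0].text
    return body["choices"][0]["text"]
```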
How to use harshit36/NOVA-Verse with SGLang:
# Install SGLang from pip:
pip install sglang
# Start the SGLang server:
python3 -m sglang.launch_server \
--model-path "harshit36/NOVA-Verse" \
--host 0.0.0.0 \
--port 30000
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "harshit36/NOVA-Verse",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
# Or, run the SGLang server with Docker:
docker run --gpus all \
--shm-size 32g \
-p 30000:30000 \
-v ~/.cache/huggingface:/root/.cache/huggingface \
--env "HF_TOKEN=<secret>" \
--ipc=host \
lmsysorg/sglang:latest \
python3 -m sglang.launch_server \
--model-path "harshit36/NOVA-Verse" \
--host 0.0.0.0 \
--port 30000
How to use harshit36/NOVA-Verse with Docker Model Runner:
docker model run hf.co/harshit36/NOVA-Verse
NOVA-Verse is a fine-tuned NOVA model trained specifically for decision-driven story generation.
NOVA-Verse is a fine-tuned version of NOVA (model type `nova`). It uses a PreTrainedTokenizerFast tokenizer and integrates with transformers via custom AutoModel and AutoConfig registration.

| File | Description |
|---|---|
| config.json | Configuration of model hyperparameters |
| model.safetensors | Serialized model weights (efficient format) |
| nova_modelling.py | Custom model and config class definitions |
| tokenizer.json | Serialized tokenizer |
| tokenizer_config.json | Tokenizer configuration metadata |
| special_tokens_map.json | Mapping for special tokens (e.g., BOS, EOS) |
| README.md | Model card (you’re reading it!) |
NovaForCausalLM

The model is built around the custom NovaForCausalLM class and its configuration class (NovaConfig):
{
"model_type": "nova",
"vocab_size": 6000,
"block_size": 256,
"n_embd": 640,
"n_layer": 4,
"n_head": 8
}
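The configuration describes a very small model. As a rough sanity check, a back-of-the-envelope parameter count can be derived from it, assuming a standard GPT-style decoder block (attention plus an MLP with 4x expansion; biases and LayerNorm weights omitted, and these structural assumptions are not confirmed by the model card):

```python
# Rough parameter estimate from the config above, assuming a standard
# GPT-style decoder block (not confirmed by the model card).
vocab_size, block_size, n_embd, n_layer = 6000, 256, 640, 4

token_emb = vocab_size * n_embd           # token embedding table
pos_emb = block_size * n_embd             # learned positional table (if used)
attn = 4 * n_embd * n_embd                # Q, K, V, and output projections
mlp = 2 * n_embd * (4 * n_embd)          # up- and down-projections (4x expansion)
per_layer = attn + mlp
total = token_emb + pos_emb + n_layer * per_layer

print(f"~{total / 1e6:.1f}M parameters")  # → ~23.7M parameters
```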
To use the custom classes defined in nova_modelling.py, first clone the repository:
git clone https://huggingface.co/harshit36/Nova-Verse
cd Nova-Verse
import sys
sys.path.append("./Nova-Verse/")  # make nova_modelling importable
from transformers import PreTrainedTokenizerFast
from nova_modelling import NovaConfig, NovaForCausalLM
# Load tokenizer
tokenizer = PreTrainedTokenizerFast.from_pretrained("harshit36/Nova-Verse")
# Load config
config = NovaConfig.from_pretrained("harshit36/Nova-Verse")
# Load the pretrained weights using the custom class
model = NovaForCausalLM.from_pretrained("harshit36/Nova-Verse", config=config)
# Use the model
input_ids = tokenizer("Hello world", return_tensors="pt").input_ids
output = model.generate(input_ids)
# Undo byte-level BPE markers (Ġ = leading space, Ċ = newline) in the decoded text
print(tokenizer.decode(output[0], skip_special_tokens=True).replace(" ", "").replace("Ġ", " ").replace("Ċ", "\n"))
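The replace chain in the decode step above undoes byte-level BPE markers (Ġ, U+0120, marks a leading space; Ċ, U+010A, marks a newline) when the tokenizer has no decoder configured and emits the markers verbatim. Pulled out as a standalone helper (the function name is illustrative):

```python
def clean_bpe_output(text: str) -> str:
    # Strip the literal spaces between tokens, then map the byte-level
    # BPE markers back: "Ġ" to a space, "Ċ" to a newline.
    return text.replace(" ", "").replace("Ġ", " ").replace("Ċ", "\n")

print(clean_bpe_output("Once Ġupon Ġa Ġtime,"))  # → Once upon a time,
```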
Intended uses:
- Story text generation
- Hybrid positional encoding research (combination of sinusoidal and learnable encodings)
- Educational demonstrations of custom HF model integration
- Rapid prototyping of transformer models
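The hybrid positional encoding mentioned above can be sketched as a fixed sinusoidal table summed with a learnable offset table. This is a minimal NumPy illustration of the idea under that assumption, not the model's actual implementation:

```python
import numpy as np

def sinusoidal_table(block_size: int, n_embd: int) -> np.ndarray:
    # Fixed sinusoidal encodings: sin on even dims, cos on odd dims.
    pos = np.arange(block_size)[:, None]
    i = np.arange(n_embd // 2)[None, :]
    angles = pos / (10000 ** (2 * i / n_embd))
    table = np.zeros((block_size, n_embd))
    table[:, 0::2] = np.sin(angles)
    table[:, 1::2] = np.cos(angles)
    return table

block_size, n_embd = 256, 640              # values from the config above
fixed = sinusoidal_table(block_size, n_embd)
learned = np.zeros((block_size, n_embd))   # trainable offsets, zero-initialised
hybrid = fixed + learned                   # hybrid: fixed base + learnable delta
print(hybrid.shape)                        # → (256, 640)
```

Zero-initialising the learnable table means the model starts from pure sinusoidal encodings and learns only the corrections it needs.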
Base model: harshit36/Nova-Casual-LLM