# 🐙 theoracle/hplovecraft

Gemma-2-2B-IT finetuned on Lovecraft’s cosmic-horror corpus.

## Overview

`theoracle/hplovecraft` is a LoRA-finetuned version of `google/gemma-2-2b-it`, trained on the `TristanBehrens/lovecraftcorpus` dataset using AutoTrain Advanced.

The objective of this model is to reproduce the literary tone and thematic patterns typical of H. P. Lovecraft, including:

- dense atmospheric descriptions
- archaic vocabulary and formal cadence
- cosmic dread and metaphysical terror
- first-person “confessional” narration
- references to forbidden knowledge, ancient cults, and non-Euclidean horrors

This model is intended for creative writing, fiction generation, and experimentation with stylistic conditioning.

## Usage

Minimal working example:

```python
from transformers import pipeline

# Loading the adapter repo directly requires the `peft` package, which
# resolves the LoRA weights against the Gemma base model.
pipe = pipeline(
    "text-generation",
    model="theoracle/hplovecraft",
    max_new_tokens=300,
    do_sample=True,  # enable sampling so temperature/top_p take effect
    temperature=0.9,
    top_p=0.9,
)

prompt = "At dusk, I heard the distant cry of something not meant for human ears..."
print(pipe(prompt)[0]["generated_text"])
```

## Training Details

- Base model: `google/gemma-2-2b-it`
- Method: LoRA (PEFT)
- Trainer: AutoTrain Advanced
- Dataset: `TristanBehrens/lovecraftcorpus`
- Task: Supervised fine-tuning for causal LM
- Block size: 1024
- Epochs: 2
- Precision: FP16
- Quantization: INT4 during training (bitsandbytes)
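For reference, these settings map roughly onto a standard QLoRA-style setup with `transformers`, `peft`, and `bitsandbytes`. The sketch below is illustrative only: the LoRA rank, alpha, target modules, and dropout are assumed values, not the exact hyperparameters of the AutoTrain run.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# INT4 quantization during training, per the bitsandbytes setting above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # FP16 compute, as listed above
)

model = AutoModelForCausalLM.from_pretrained(
    "google/gemma-2-2b-it",
    quantization_config=bnb_config,
    device_map="auto",
)

# LoRA adapter configuration (assumed values, for illustration only).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Training then proceeds as causal-LM supervised fine-tuning over the
# corpus, packed into blocks of 1024 tokens, for 2 epochs.
```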

## Strengths

- Strong stylistic fidelity to Lovecraft’s prose
- Produces long, immersive horror passages
- Good at evoking dread, ancient mythos, and cosmic insignificance
- Maintains archaic tone without collapsing into incoherence

## Limitations

- May generate dark or disturbing content (intended for horror writing)
- Not tuned for factual or instructional tasks
- May overuse specific Lovecraft tropes when prompted repeatedly

## Acknowledgements

- Google for the Gemma family
- Tristan Behrens for the dataset
- Hugging Face AutoTrain for the training framework
