
The experimental aether_x method (version 81/29) is extremely compute-heavy, with 71 global YAML parameters. This is a massive pipeline that would benefit from further refinement across all of its functions. Some of the long-awaited "brain surgery" features have been added.

This merge took 26 hours despite using only max_iter: 32. Given such extreme processing, it may not perform as well as DarkArtsForge/Morbid-Miasma-12B, but it should produce interesting results. Same donors, but with total weights = 1.5.

```yaml
architecture: MistralForCausalLM
merge_method: aether_x
base_model: B:/12B/mistralai--Mistral-Nemo-Instruct-2407
models:
  - model: B:/12B/Sorihon--Moonlit-Mirage-12B-Heretic
    parameters:
      weight: 0.25
  - model: B:/12B/MuXodious--Irix-12B-Model_Stock-absolute-heresy
    parameters:
      weight: 0.25
  - model: B:/12B/EldritchLabs--KrakenSakura-Maelstrom-12B-v1
    parameters:
      weight: 0.25
  - model: B:/12B/Sorihon--KansenSakura-Erosion-RP-12B-heretic
    parameters:
      weight: 0.25
  - model: B:/12B/MuXodious--QuasiStarSynth-12B-absolute-heresy
    parameters:
      weight: 0.25
  - model: B:/12B/Naphula--Ancient-Awakening-12B-MPOA
    parameters:
      weight: 0.25
```
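As a quick sanity check on the "total weights = 1.5" note above, the six donor weights in the config can be summed. This is a minimal illustration only; the variable names are hypothetical and the snippet is not part of mergekit itself:

```python
# Hypothetical sanity check: sum the per-donor weights from the
# config above to confirm the stated total of 1.5.
donor_weights = [0.25] * 6  # six donor models, each with weight 0.25

total = sum(donor_weights)
print(total)  # 1.5
```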
Downloads last month: 81
Model size: 12B params · Tensor type: BF16 (Safetensors)

Model tree for DarkArtsForge/Morbid-Aether-X-12B

Quantizations: 2 models