# Neuro-Orchestrator-8B
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with miromind-ai/MiroThinker-v1.0-8B as the base.
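For intuition about the `density` and `weight` values in the configuration below, the sketch that follows illustrates the TIES recipe per tensor: trim each task vector (fine-tuned minus base) to its highest-magnitude entries, elect a majority sign, and merge only the entries that agree with it. This is a simplified NumPy illustration, not mergekit's implementation; the `ties_merge` helper, its arguments, and the normalization handling are assumptions made for the example.

```python
import numpy as np

def ties_merge(base, deltas, densities, weights, normalize=True):
    """Illustrative TIES-style merge for one flattened tensor.

    base:      flat array of base-model parameters
    deltas:    list of flat arrays, each (fine-tuned - base)
    densities: fraction of highest-magnitude entries kept per delta
    weights:   per-model scaling factors
    """
    trimmed = []
    for delta, density in zip(deltas, densities):
        k = int(round(delta.size * density))
        keep = np.zeros_like(delta, dtype=bool)
        if k > 0:
            # "Trim": keep only the top-k entries by magnitude.
            idx = np.argpartition(np.abs(delta), -k)[-k:]
            keep[idx] = True
        trimmed.append(np.where(keep, delta, 0.0))

    # "Elect sign": majority sign of the weighted, trimmed deltas.
    sign = np.sign(sum(w * d for w, d in zip(weights, trimmed)))

    merged = np.zeros_like(base)
    total_weight = np.zeros_like(base)
    for w, d in zip(weights, trimmed):
        # "Disjoint merge": only entries agreeing with the elected sign contribute.
        agree = (np.sign(d) == sign) & (d != 0)
        merged += np.where(agree, w * d, 0.0)
        total_weight += np.where(agree, w, 0.0)

    if normalize:
        merged = np.where(total_weight > 0,
                          merged / np.maximum(total_weight, 1e-12), 0.0)
    return base + merged
```

In the configuration below, this is why the base model keeps full density and weight, while the Nemotron and HiPO deltas are trimmed more aggressively and down-weighted.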
### Models Merged
The following models were included in the merge:

* nvidia/Nemotron-Orchestrator-8B
* Kwaipilot/HiPO-8B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
name: Neuro-Orchestrator-8B
merge_method: ties
base_model: miromind-ai/MiroThinker-v1.0-8B
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
tokenizer_source: union  # Critical: merges special tokens from all models
models:
  # Base model: MiroThinker (the Body/Executor)
  # We use this as the base so the resulting model defaults to its
  # superior 256k context and tool-use templates.
  - model: miromind-ai/MiroThinker-v1.0-8B
    parameters:
      density: 1.0
      weight: 1.0
  # Model 2: Nvidia Nemotron Orchestrator (the Manager)
  # We want high density (coverage) but lower weight to guide
  # logic without breaking the tool syntax.
  - model: nvidia/Nemotron-Orchestrator-8B
    parameters:
      density: 0.8
      weight: 0.4
  # Model 3: Kwaipilot HiPO (the Controller)
  # We mainly want its 'gating' logic (when to think).
  # A lower density focuses on the most critical policy changes.
  - model: Kwaipilot/HiPO-8B
    parameters:
      density: 0.5
      weight: 0.3
```
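The YAML above can be passed to mergekit's `mergekit-yaml` command-line entry point to produce the merged checkpoint. Once published, the result loads like any other causal LM; the snippet below is a minimal usage sketch with transformers, assuming the checkpoint is hosted as yasserrmd/Neuro-Orchestrator-8B and ships a chat template (both assumptions, not guarantees).

```python
# Usage sketch: load the merged model with transformers.
# The repo id and prompt below are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yasserrmd/Neuro-Orchestrator-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

messages = [
    {"role": "user",
     "content": "Plan the steps to book a flight using the available tools."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Per the `tokenizer_source: union` setting, the saved tokenizer is intended to carry the special tokens from all three source models.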