# BAP-Labs-M1
A LoRA adapter for Hermes-2-Pro-Mistral-7B, fine-tuned to generate Serum synthesizer preset parameters from natural-language descriptions.
## Model Details
- Base Model: NousResearch/Hermes-2-Pro-Mistral-7B
- Fine-tuning Method: LoRA (Low-Rank Adaptation)
- Training Framework: MLX-LM (Apple Silicon optimized)
- Task: Text-to-Synthesizer-Parameters generation
## Training Configuration
- Dataset: 897 examples of natural language descriptions paired with Serum preset parameters
- Training Split: 90/10 (807 train / 90 validation)
- Iterations: 300
- Batch Size: 8
- Learning Rate: 3e-4
- LoRA Layers: 16
- Trainable Parameters: 1.704M (0.024% of base model)
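For reproducibility, the configuration above corresponds to an `mlx_lm.lora` training run. The sketch below is an assumption about how such a run is invoked, not a record of the exact command; flag names vary across mlx-lm versions, and the data and adapter paths are hypothetical:

```bash
# Hedged sketch of a LoRA training run matching the configuration above.
# Newer mlx-lm releases rename --lora-layers to --num-layers; the data
# directory is assumed to contain train.jsonl and valid.jsonl files.
mlx_lm.lora \
  --model NousResearch/Hermes-2-Pro-Mistral-7B \
  --train \
  --data ./data \
  --batch-size 8 \
  --iters 300 \
  --learning-rate 3e-4 \
  --lora-layers 16 \
  --adapter-path ./adapters
```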
## Training Results
- Final Train Loss: 0.782
- Final Validation Loss: 0.757
- Training Time: ~5.5 hours on M4 Max
- Peak Memory: 62.8 GB
## Usage

### With MLX-LM
```python
from mlx_lm import load, generate

# Load the base model with the LoRA adapter
model, tokenizer = load(
    "NousResearch/Hermes-2-Pro-Mistral-7B",
    adapter_path="bapinero/BAP-Labs-M1"
)

# Generate Serum preset parameters
prompt = """<|im_start|>system
You are a Serum synthesizer preset designer. Generate JSON parameter changes for Serum presets based on natural language descriptions.<|im_end|>
<|im_start|>user
Create a deep dubstep bass with lots of wobble<|im_end|>
<|im_start|>assistant
"""

response = generate(model, tokenizer, prompt=prompt, max_tokens=500, temp=0.7)
print(response)
```
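The prompt above hand-writes the ChatML tags. Because the tokenizer returned by `load` wraps the underlying Hugging Face tokenizer, the same prompt can usually be built with its chat template instead; a minimal sketch, assuming the adapter keeps the base model's ChatML template:

```python
# Build the same ChatML prompt via the tokenizer's chat template
# (assumes the base model's template is ChatML, as Hermes-2-Pro uses).
messages = [
    {"role": "system", "content": "You are a Serum synthesizer preset designer. "
        "Generate JSON parameter changes for Serum presets based on "
        "natural language descriptions."},
    {"role": "user", "content": "Create a deep dubstep bass with lots of wobble"},
]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
```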
### Command Line

```bash
mlx_lm.generate \
  --model NousResearch/Hermes-2-Pro-Mistral-7B \
  --adapter-path bapinero/BAP-Labs-M1 \
  --prompt "Create a bright future bass lead" \
  --max-tokens 500 \
  --temp 0.7
```
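If you prefer a standalone model over loading the adapter at runtime, mlx-lm can fuse the LoRA weights into the base model. A sketch, with an illustrative output path:

```bash
# Fuse the LoRA adapter into the base weights for standalone use
# (./fused-model is an illustrative output path).
mlx_lm.fuse \
  --model NousResearch/Hermes-2-Pro-Mistral-7B \
  --adapter-path bapinero/BAP-Labs-M1 \
  --save-path ./fused-model
```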
## Output Format

The model generates a JSON object listing Serum parameter changes:
```json
{
  "parameter_changes": [
    {"parameter_index": 22, "new_value": 0.85},
    {"parameter_index": 77, "new_value": 0.12},
    ...
  ]
}
```
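Since a generation may include text around the JSON, downstream code typically extracts and validates the object before applying it. A minimal sketch, not part of this repository, assuming values are normalized to the 0–1 range and indices fall within the 448 mapped parameters:

```python
import json
import re

def parse_preset_response(response: str, num_params: int = 448) -> dict[int, float]:
    """Extract parameter changes from model output; hypothetical helper."""
    # Pull the first {...} block out of the raw generation
    match = re.search(r"\{.*\}", response, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in response")
    data = json.loads(match.group(0))

    changes: dict[int, float] = {}
    for change in data.get("parameter_changes", []):
        idx = int(change["parameter_index"])
        value = float(change["new_value"])
        if 0 <= idx < num_params:
            # Clamp to the normalized 0-1 range (assumed convention)
            changes[idx] = min(max(value, 0.0), 1.0)
    return changes
```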
## Limitations
- Trained specifically for Serum synthesizer (448 mapped parameters)
- Best results with genre-specific descriptions (dubstep, house, bass music, etc.)
- Optimized for MLX framework (Apple Silicon)
## License

This adapter is built on Hermes-2-Pro-Mistral-7B; refer to the base model's license for usage terms.
## Citation
```bibtex
@misc{bap-labs-m1,
  author = {BAP Labs},
  title = {BAP-Labs-M1: Serum Synthesizer Control via LLM},
  year = {2025},
  publisher = {HuggingFace},
  url = {https://huggingface.co/bapinero/BAP-Labs-M1}
}
```