Paper: Resolving Interference When Merging Models (arXiv:2306.01708)
Models Merged:
1. DoppelReflEx/MN-12B-FoxFrame-Miyuri
2. pot99rta/Patricide-12B-Forgottenslop-Mell
Preset:
Use the ChatML or Mistral chat template.
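As a minimal usage sketch, the merged model can be loaded with transformers and prompted through whichever chat template ships with the tokenizer; the repository ID below is a placeholder, not this model's actual ID:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/merged-model"  # placeholder for this repo's ID
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.float16, device_map="auto"
)

messages = [{"role": "user", "content": "Write a short greeting."}]
# apply_chat_template renders the messages with the template stored in the
# tokenizer config (ChatML- or Mistral-style, per the preset note above)
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```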
This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method, with pot99rta/Patricide-12B-Forgottenslop-Mell as the base.
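For intuition, TIES merging trims each fine-tuned model's task vector (its delta from the base) down to the highest-magnitude entries, elects a per-parameter sign, and merges only the deltas that agree with that sign. A minimal NumPy sketch of those steps follows; it is an illustration of the technique from the paper above, not mergekit's actual implementation:

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights):
    """base: (d,) base weights; finetuned: list of (d,) fine-tuned weights."""
    deltas = []
    for ft, density, w in zip(finetuned, densities, weights):
        tau = ft - base                          # task vector
        k = max(1, int(density * tau.size))      # keep top-k entries by magnitude
        thresh = np.sort(np.abs(tau))[-k]
        tau = np.where(np.abs(tau) >= thresh, tau, 0.0)  # trim step
        deltas.append(w * tau)
    deltas = np.stack(deltas)
    # elect a sign per parameter from the total signed mass
    elected = np.sign(deltas.sum(axis=0))
    # disjoint merge: average only the (weighted) deltas agreeing with that sign
    agree = np.sign(deltas) == elected
    merged = np.where(agree, deltas, 0.0).sum(axis=0)
    merged /= np.maximum(agree.sum(axis=0), 1)
    return base + merged
```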
The following models were included in the merge:
- DoppelReflEx/MN-12B-FoxFrame-Miyuri
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: pot99rta/Patricide-12B-Forgottenslop-Mell
    # no parameters necessary for base model
  - model: pot99rta/Patricide-12B-Forgottenslop-Mell
    parameters:
      density: 0.5
      weight: 0.5
  - model: DoppelReflEx/MN-12B-FoxFrame-Miyuri
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: pot99rta/Patricide-12B-Forgottenslop-Mell
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
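A config like this is typically run with mergekit's `mergekit-yaml` CLI or its Python API. A sketch of the latter, assuming the YAML above is saved as `config.yaml` (the output path is a placeholder):

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# load the merge recipe shown above
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# execute the merge and write the result to a local directory
run_merge(
    merge_config,
    out_path="./merged-model",  # placeholder output directory
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```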