⚠️ Note: This model requires the ChatML chat template.

🦁 MagMalion Twilight 12B v1

This is a merge of pre-trained language models created using mergekit.

The model is partially censored but can be jailbroken or ablated if needed.
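
For reference, here is a minimal sketch of loading the model and applying its ChatML template via 🤗 Transformers. The repository id matches this card (EldritchLabs/MagMalion-Twilight-12B-v1); the dtype, sampling settings, and example messages are illustrative rather than recommendations.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EldritchLabs/MagMalion-Twilight-12B-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a creative roleplay assistant."},
    {"role": "user", "content": "Introduce yourself in one sentence."},
]

# apply_chat_template uses the bundled ChatML template
# (<|im_start|> ... <|im_end|> turn markers).
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))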


Merge Details

Merge Method

This model was merged using the DELLA merge method, with IntervitensInc/Mistral-Nemo-Base-2407-chatml as the base.
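
DELLA prunes each donor model's delta from the base with magnitude-aware random dropping (larger deltas are kept more often), rescales the survivors, and fuses them into the base. The sketch below is a toy illustration only, not mergekit's implementation; it is meant to show how the density and epsilon values in the configuration can be read: density is roughly the average keep probability for a delta entry, and epsilon is the magnitude-dependent spread around it.

import torch

def toy_della_prune(delta: torch.Tensor, density: float, epsilon: float) -> torch.Tensor:
    """Toy illustration of magnitude-aware delta pruning (not mergekit's code).

    Higher-magnitude delta entries get a keep probability slightly above
    `density`, lower-magnitude entries slightly below (spread = epsilon);
    survivors are rescaled by the inverse keep probability.
    """
    ranks = delta.abs().flatten().argsort().argsort().float()
    ranks = ranks / max(ranks.numel() - 1, 1)              # 0 = smallest, 1 = largest magnitude
    keep_prob = (density + epsilon * (2.0 * ranks - 1.0)).clamp(0.0, 1.0)
    keep_prob = keep_prob.reshape(delta.shape)
    mask = torch.bernoulli(keep_prob)
    return delta * mask / keep_prob.clamp_min(1e-8)

In the configuration below, every donor is merged with density 0.9, weight 0.2, and epsilon 0.099; normalize: false means the 0.2 weights are applied as given rather than rescaled to sum to 1.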

Models Merged

The following models were included in the merge:

  • IntervitensInc/Mistral-Nemo-Base-2407-chatml
  • GreenerPastures/Golden-Curry-12B
  • inflatebot/MN-12B-Mag-Mell-R1
  • Sao10K/MN-12B-Lyra-v2a1
  • ChaoticNeutrals/Mag-Mell-Reasoner-12B
  • Epiculous/Crimson_Dawn-v0.2
  • Epiculous/Azure_Dusk-v0.2
  • Epiculous/Violet_Twilight-v0.2
  • PygmalionAI/Pygmalion-3-12B
  • PygmalionAI/Eleusis-12B

Audit

[Audit image: MagMalionTwilight_Audit]

Configuration

The following YAML configuration was used to produce this model:

base_model: B:/12B/models--IntervitensInc--Mistral-Nemo-Base-2407-chatml
models:
  - model: B:/12B/models--IntervitensInc--Mistral-Nemo-Base-2407-chatml
  - model: B:/12B/models--ChaoticNeutrals--Mag-Mell-Reasoner-12B
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--Epiculous--Azure_Dusk-v0.2
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--Epiculous--Crimson_Dawn-v0.2
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--Epiculous--Violet_Twilight-v0.2
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--GreenerPastures--Golden-Curry-12B
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--inflatebot--MN-12B-Mag-Mell-R1
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--PygmalionAI--Eleusis-12B
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--PygmalionAI--Pygmalion-3-12B
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
  - model: B:/12B/models--Sao10K--MN-12B-Lyra-v2a1
    parameters:
      density: 0.9
      weight: 0.2
      epsilon: 0.099
merge_method: della
parameters:
  lambda: 1.0
  normalize: false
  int8_mask: false
dtype: bfloat16
tokenizer:  
  source: "union"  
  tokens:  
    # Force ChatML special tokens
    "<|im_start|>":  
      source: "B:/12B/models--IntervitensInc--Mistral-Nemo-Base-2407-chatml"  
      force: true  
    "<|im_end|>":  
      source: "B:/12B/models--IntervitensInc--Mistral-Nemo-Base-2407-chatml"  
      force: true  
chat_template: "chatml"
name: 🦁 MagMalion-Twilight-12B-v1
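
To reproduce the merge, save the configuration above to a file (replacing the local B:/12B/... paths with the corresponding Hugging Face repository ids) and run it through mergekit. Below is a minimal sketch using mergekit's Python entry point, assuming a recent mergekit release; the file name, output path, and option values are illustrative.

import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Hypothetical filename for the YAML configuration above.
with open("magmalion-twilight.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./MagMalion-Twilight-12B-v1",
    options=MergeOptions(
        cuda=False,            # set True to run the merge on GPU
        copy_tokenizer=True,   # keep the union tokenizer defined above
        lazy_unpickle=True,    # stream weights to reduce peak RAM
    ),
)

The mergekit-yaml command-line tool accepts the same configuration file if you prefer not to use the Python API.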