🐟 G4 Runic Oarfish 26B A4B v1.2


This is a creative RP merge which combines Musica with the full LoRA of MeroMero. v1.2 also adds Darkhn/Gemma-4-26B-A4B-Animus-V14.1-FFT, another high-quality RP finetune.

It uses a custom merge method, moe_karcher, which adapts the standard karcher method to support mixture-of-experts models. A few changes were made to the script to support the new Gemma4 architecture. Note: there were some issues setting up the merge, so vision mode might be disabled.
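The merge script itself isn't included here, but the core of a Karcher (geodesic) mean is an iterative averaging of weight directions on the unit sphere. Below is a minimal numpy sketch; the function name, the magnitude-rescaling step, and all details are assumptions for illustration, not the actual moe_karcher implementation:

```python
import numpy as np

def karcher_mean(tensors, max_iter=100, tol=1e-9):
    """Sketch of a Karcher mean over same-shaped weight tensors.

    Each tensor is flattened and normalized to a direction on the unit
    sphere; the mean direction is found by iterating log/exp maps.
    """
    dirs = [t.ravel() / np.linalg.norm(t.ravel()) for t in tensors]
    mean = np.mean(dirs, axis=0)
    mean /= np.linalg.norm(mean)
    for _ in range(max_iter):
        tangents = []
        for d in dirs:
            # Log map: project each direction into the tangent space at `mean`.
            cos = np.clip(mean @ d, -1.0, 1.0)
            theta = np.arccos(cos)
            if theta < 1e-12:
                tangents.append(np.zeros_like(d))
            else:
                tangents.append(theta / np.sin(theta) * (d - cos * mean))
        step = np.mean(tangents, axis=0)
        norm = np.linalg.norm(step)
        if norm < tol:  # converged
            break
        # Exp map: move along the geodesic in the averaged tangent direction.
        mean = np.cos(norm) * mean + np.sin(norm) * step / norm
        mean /= np.linalg.norm(mean)
    # Rescale by the average input magnitude (an assumed choice here).
    scale = np.mean([np.linalg.norm(t.ravel()) for t in tensors])
    return (scale * mean).reshape(tensors[0].shape)
```

The `max_iter` and `tol` values mirror the parameters in the merge config below; in practice the iteration converges long before the cap.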

Runic Oarfish has some refusals but can be jailbroken or ablated as needed.

This is a moe_karcher merge of three models. In testing, it produces noticeably different output from v1 and v1.1.

An improvement over v1

There is still slop with the "not x, but y" prose, though it writes better otherwise. It wrote about a lighthouse / cursed island instead of the clockmaker shop.

I think v1.1 isn't as good as the original: it has a lot more subtle refusals than v1, shorter replies, and more negative, Gemini-like behavior. It seems that moe_karcher is better than moe_slerp.

A magnitude scan reveals that MeroMero had the highest L2 norm, followed by Animus, then Musica. This means that MeroMero had the "strongest pull" on the karcher direction.
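A magnitude scan of this kind can be as simple as an aggregate L2 norm over each checkpoint's tensors. A minimal sketch (the function name is hypothetical, and loading the actual state dicts from safetensors is left out):

```python
import numpy as np

def checkpoint_l2_norm(state_dict):
    """Aggregate L2 norm over all parameter tensors in a checkpoint.

    state_dict: mapping of parameter name -> numpy array.
    Accumulate squared sums in float64 to avoid overflow/precision loss.
    """
    total = sum(float(np.square(t.astype(np.float64)).sum())
                for t in state_dict.values())
    return float(np.sqrt(total))
```

Comparing this number across the three source models shows which one contributes the largest-magnitude weights and therefore pulls hardest on the Karcher direction.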

100 iterations are enough to produce about the same fidelity as 1000.

The base model gemma-4-26B-A4B-it was again excluded for this version, but it may be added in v1.3.

```yaml
architecture: Gemma4ForConditionalGeneration
merge_method: moe_karcher
# base_model: B:\26B\google--gemma-4-26B-A4B-it
models:
  - model: B:\26B\AuriAetherwiing--G4-26B-A4B-Musica-v1
  - model: B:\26B\ApocalypseParty--G4-26B-SFT-6 # zerofata/G4-MeroMero-26B-A4B
  - model: B:\26B\Darkhn--Gemma-4-26B-A4B-Animus-V14.1-FFT
parameters:
  max_iter: 100
  tol: 1.0e-9
  router_strategy: karcher  # Options: karcher, average, first, random_init
  blend_experts: true  # Blend corresponding experts (expert[0] + expert[0], etc.)
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: union
# chat_template: auto
trust_remote_code: true
name: G4-Runic-Oarfish-26B-A4B-v1.2
```
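The `blend_experts: true` option pairs experts by index across the source models (expert[0] with expert[0], and so on) and merges each pair. A minimal sketch of that pairing, assuming per-expert weight lists and a pluggable merge function (names hypothetical):

```python
def blend_experts(models_experts, merge_fn):
    """Merge corresponding experts across models by index.

    models_experts: list over models, each a list of per-expert weight arrays.
    merge_fn: any reduction over a list of same-shaped arrays
              (e.g. a Karcher mean, or a plain average).
    """
    n_experts = len(models_experts[0])
    merged = []
    for i in range(n_experts):
        # Gather expert i from every model and merge them together.
        merged.append(merge_fn([m[i] for m in models_experts]))
    return merged
```

With `router_strategy: karcher`, the router weights would be merged the same way as ordinary tensors rather than copied from one model.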

See v1 for more details of how to merge Gemma 4 MoE models.
