<img src="https://w.forfun.com/fetch/cb/cba2205390e517bea1ea60ca0b491af4.jpeg" style="width: 70%; min-width: 300px; display: block; margin: auto;">

An experimental merge of several models using two different methods: Ties-Merge and BlockMerge_Gradient.

I plan for this to be the base of my model, with my own [Stheno: ERP-Based LORA] merged in sometime in the future.

Stheno: <br>Gradient Merge of Stheno-P1 & Stheno-P2.

SISTER MODEL HERE: Stheno-Inverted-L2-13B

Quants courtesy of TheBloke! <br>GPTQ <br>GGUF <br>GGML

Test Checklist: <br>Censorship - Fairly Uncensored <br>Writing - Good Prose, Fairly Descriptive <br>NSFW - Yes <br>IQ Level - Pretty Smart <br>Formatting - Proper Formatting with Examples

Stheno-P1 [Ties-Merge] <br>-----elinas/chronos-13b-v2 <br>-----jondurbin/airoboros-l2-13b-2.1 <br>-----NousResearch/Nous-Hermes-Llama2-13b+nRuaif/Kimiko-v2 LORA

Stheno-P2 [Ties-Merge] <br>-----CalderaAI/13B-Legerdemain-L2+lemonilia/limarp-llama2-v2 LORA <br>-----ehartford/WizardLM-1.0-Uncensored-Llama2-13b <br>-----Henk717/spring-dragon
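
The two recipes above were produced with chargoddard's ties-merge script. As a rough conceptual sketch of what TIES-merging does (this is not the actual script; the tensor-dict layout and the `density` value are illustrative assumptions): each fine-tune's delta from the shared base is trimmed to its largest-magnitude entries, a per-parameter sign is elected, and only the deltas that agree with that sign are averaged back onto the base.

```python
# Conceptual sketch of TIES-merging, NOT chargoddard's actual ties-merge script.
# Assumes each model is a dict of parameter tensors with the base model's shapes.
import torch

def ties_merge(base: dict, finetunes: list, density: float = 0.3) -> dict:
    """Merge fine-tunes of one base model via trim / elect-sign / disjoint-mean."""
    merged = {}
    for name, base_w in base.items():
        # Task vectors: how each fine-tune moved away from the shared base weights.
        deltas = [ft[name] - base_w for ft in finetunes]

        # Trim: keep only the top `density` fraction of each delta by magnitude.
        trimmed = []
        for d in deltas:
            k = max(1, int(density * d.numel()))
            cutoff = d.abs().flatten().kthvalue(d.numel() - k + 1).values
            trimmed.append(torch.where(d.abs() >= cutoff, d, torch.zeros_like(d)))

        # Elect a sign per parameter: the direction with more total mass wins.
        stacked = torch.stack(trimmed)
        sign = torch.sign(stacked.sum(dim=0))

        # Disjoint mean: average only the trimmed deltas that agree with the elected sign.
        agree = (torch.sign(stacked) == sign) & (stacked != 0)
        merged[name] = base_w + (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return merged
```

A real merge also has to load and save full checkpoints shard by shard; this sketch only shows the core tensor arithmetic.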

Most prompt formats could work, but all of my testing has been done in the Alpaca format, which works well:

### Instruction:
Your instruction or question here.
For roleplay purposes, I suggest the following - Write <CHAR NAME>'s next reply in a chat between <YOUR NAME> and <CHAR NAME>. Write a single reply only.

### Response:
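
As a small illustration, here is one way to assemble that prompt in Python; the helper name and example strings are made up for this sketch and are not part of any existing API.

```python
# Minimal sketch: building the Alpaca-style prompt shown above.
# `build_alpaca_prompt` and the example strings are illustrative only.
def build_alpaca_prompt(instruction: str) -> str:
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

# Roleplay-style instruction, following the suggestion above.
roleplay = (
    "Write Stheno's next reply in a chat between User and Stheno. "
    "Write a single reply only."
)
print(build_alpaca_prompt(roleplay))
```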

Below is an illustration of the final merge:

[Final merge illustration image]
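
For a rough idea of what the final gradient merge does (this is not Gryphe's actual BlockMerge_Gradient script, and the linear ratio schedule and 40-layer count below are illustrative assumptions rather than the ratios used for Stheno): each layer of the merged model is a linear interpolation of the matching Stheno-P1 and Stheno-P2 layers, with the blend ratio changing gradually across the layer stack.

```python
# Conceptual sketch of a gradient block merge, NOT the actual BlockMerge_Gradient script.
# Assumes both models are dicts of identically shaped parameter tensors.
import re

def layer_index(name: str):
    """Parse the layer number out of a name like 'model.layers.17.self_attn.q_proj.weight'."""
    m = re.search(r"layers\.(\d+)\.", name)
    return int(m.group(1)) if m else None

def gradient_block_merge(model_a: dict, model_b: dict, num_layers: int = 40) -> dict:
    """Interpolate two models layer by layer, sweeping the blend ratio across the stack."""
    merged = {}
    for name, w_a in model_a.items():
        idx = layer_index(name)
        # Embeddings and the LM head get a flat 50/50 average here (an illustrative choice).
        t = 0.5 if idx is None else idx / max(num_layers - 1, 1)
        # t = 0 keeps model_a's weights for this layer, t = 1 keeps model_b's.
        merged[name] = (1.0 - t) * w_a + t * model_b[name]
    return merged
```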

Once again, thanks to Chargoddard for his amazing and simple ties-merge script, and to Gryphe for their great BlockMerge_Gradient script. Thanks to the original model creators too!

support me here :)

Art by wada_kazu / わだかず (pixiv page private?)