THEBLOKE HAS QUANTS! <br>https://huggingface.co/TheBloke/Mythical-Destroyer-L2-13B-GPTQ <br>https://huggingface.co/TheBloke/Mythical-Destroyer-L2-13B-GGUF

A merge done for @dampf

FULL FP16 Model

Base model: TheBloke/Llama-2-13B-fp16, merged with:
- Gryphe/MythoMax-L2-13b
- totally-not-an-llm/PuddleJumper-13b
- TheBloke/Llama-2-13B-Chat-fp16
- rombodawg/LosslessMegaCoder-llama2-13b-mini
- The-Face-Of-Goonery/Chronos-Beluga-v2-13bfp16

using ties-merge.
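For readers unfamiliar with ties-merge: the idea is to take each fine-tune's delta from the base model, trim each delta to its largest-magnitude entries, elect a per-parameter sign, and average only the deltas that agree with that sign. Below is a toy NumPy sketch of that idea on plain arrays; the function name, the `density` parameter, and the example values are illustrative only, not the actual script used for this model (which operates tensor-by-tensor on full model checkpoints):

```python
import numpy as np

def ties_merge(base, finetuned, density=0.2):
    """Toy TIES-style merge: trim, elect sign, disjoint-average deltas."""
    # Task vectors: how far each fine-tune moved from the base weights
    deltas = [ft - base for ft in finetuned]

    trimmed = []
    for d in deltas:
        # Trim: keep only the top-`density` fraction of entries by magnitude
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stacked = np.stack(trimmed)

    # Elect a sign per parameter from the summed trimmed deltas
    sign = np.sign(stacked.sum(axis=0))
    sign[sign == 0] = 1.0

    # Average only the nonzero deltas that agree with the elected sign
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    numer = np.where(agree, stacked, 0.0).sum(axis=0)
    denom = np.maximum(agree.sum(axis=0), 1)
    return base + numer / denom
```

In the real merge, the same three steps run over every weight tensor of the six checkpoints, so conflicting updates (e.g. a creativity model and a code model pulling a weight in opposite directions) don't simply cancel out.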

Dampf's Rationale:
If you think about it, the merges kind of act as experts in my Destroyer:
MythoMax and Chronos-Beluga for creativity,
Llama 2 13B Chat and PuddleJumper for instruct, and LosslessMegaCoder for logic/code.
If this works well...
it should be really, really good.
---
Mythical Destroyer will be used for RP, instruct, and coding tasks alike,
and it should be good at everything.
---

Script used to merge is here. <br>Thank you for the easy-to-set-up script, Chargoddard!

Command:

```sh
python ties_merge.py TheBloke/Llama-2-13B-fp16 ./Mythical-Destroyer-13B \
  --merge Gryphe/MythoMax-L2-13b \
  --merge totally-not-an-llm/PuddleJumper-13b \
  --merge TheBloke/Llama-2-13B-Chat-fp16 \
  --merge rombodawg/LosslessMegaCoder-llama2-13b-mini \
  --merge The-Face-Of-Goonery/Chronos-Beluga-v2-13bfp16 \
  --cuda
```