This is a 1:1 merge of the models/LoRAs listed below:

- LLongMA-2-13b-16k
- Kimiko_13B

A GPTQ quantization is available in a separate repo.
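
For illustration, a 1:1 merge of a LoRA into a base model can be reproduced with the PEFT library's `merge_and_unload`, which folds the adapter weights into the base model at their trained scaling. This is only a sketch: the repo IDs below are placeholders, not the exact repositories used for this merge.

```python
# Sketch of a 1:1 LoRA merge using Hugging Face Transformers + PEFT.
# The repo IDs are placeholders -- substitute the actual LLongMA-2-13b-16k
# base model and Kimiko_13B LoRA repositories.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_REPO = "LLongMA-2-13b-16k"   # placeholder: base model repo
LORA_REPO = "Kimiko_13B"          # placeholder: LoRA adapter repo

# Load the 16k-context base model in fp16.
base = AutoModelForCausalLM.from_pretrained(
    BASE_REPO, torch_dtype=torch.float16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(BASE_REPO)

# Attach the LoRA adapter, then fold its weights into the base model.
# merge_and_unload() applies the adapter at its trained scaling,
# i.e. a straight 1:1 merge with no extra weighting.
model = PeftModel.from_pretrained(base, LORA_REPO)
merged = model.merge_and_unload()

# Save the merged full-precision model; GPTQ quantization is a separate step.
merged.save_pretrained("llongma2-13b-16k-kimiko-merged")
tokenizer.save_pretrained("llongma2-13b-16k-kimiko-merged")
```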