Merged https://huggingface.co/upstage/llama-65b-instruct as the donor model and https://huggingface.co/Gryphe/MythoMax-L2-13b as the primary model, using frankenllama_22b.py from https://huggingface.co/chargoddard/llama2-22b.

The resulting model has 32.905B parameters.
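
The core idea of this kind of frankenmerge is to widen the primary model by splicing weights taken from the donor checkpoint. The sketch below illustrates that idea on a single weight matrix; it is not the actual frankenllama_22b.py script, and the function name, slicing scheme, and toy dimensions are all assumptions for illustration only.

```python
# Hypothetical sketch of the frankenmerge idea: growing a base weight
# matrix by appending rows sliced from a donor checkpoint. This is NOT
# the real frankenllama_22b.py; its internals may differ substantially.
import torch

def widen_with_donor(base_w: torch.Tensor,
                     donor_w: torch.Tensor,
                     target_out: int) -> torch.Tensor:
    """Grow base_w (out_features x in_features) to target_out rows
    using rows sliced from donor_w."""
    extra = target_out - base_w.shape[0]
    if extra <= 0:
        return base_w[:target_out]
    # Truncate donor rows to the base input width so shapes line up.
    in_dim = base_w.shape[1]
    donor_rows = donor_w[:extra, :in_dim]
    return torch.cat([base_w, donor_rows], dim=0)

# Toy example with random tensors standing in for real checkpoints.
base = torch.randn(5120, 5120)    # 13B-class hidden size
donor = torch.randn(8192, 8192)   # 65B-class hidden size
merged = widen_with_donor(base, donor, target_out=6656)
print(merged.shape)  # torch.Size([6656, 5120])
```

In an actual merge this splicing would be applied consistently across every attention and MLP projection so that the widened dimensions stay compatible from layer to layer, which is what drives the parameter count up from the 13B base.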