
Similar to llama2-22b, but merged with BLOCK_DIAGONAL=false and fine-tuned on twice as many tokens.
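The exact semantics of the BLOCK_DIAGONAL flag in the merge script are not documented here. As a rough illustration only: a block-diagonal merge embeds two weight matrices as diagonal blocks of a larger matrix, while a non-block-diagonal merge combines same-shaped weights directly (e.g. by interpolation). A minimal NumPy sketch of that distinction (the function names are hypothetical, not from the actual merge script):

```python
import numpy as np

def block_diagonal_merge(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Place a and b as diagonal blocks of a larger zero-padded matrix."""
    out = np.zeros((a.shape[0] + b.shape[0], a.shape[1] + b.shape[1]), dtype=a.dtype)
    out[: a.shape[0], : a.shape[1]] = a
    out[a.shape[0] :, a.shape[1] :] = b
    return out

def interpolated_merge(a: np.ndarray, b: np.ndarray, weight: float = 0.5) -> np.ndarray:
    """Linearly interpolate two same-shaped weight matrices."""
    return weight * a + (1.0 - weight) * b

a = np.ones((2, 3))
b = 2.0 * np.ones((2, 3))
print(block_diagonal_merge(a, b).shape)  # (4, 6): dimensions grow
print(interpolated_merge(a, b))          # all 1.5: dimensions unchanged
```

The block-diagonal form grows the resulting matrix, which is one way to expand a model's width when stitching two checkpoints together; interpolation keeps the original shape.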

Again, this model is not intended for direct use; it is meant as a base for further fine-tuning and merging.