This model is a simple extension of [mosaicml/mpt-30b](https://huggingface.co/mosaicml/mpt-30b). The ALiBi positional bias is first manually interpolated by 2x, then extrapolated by another 2x, giving 4x the original context length in total.
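As a rough illustration of the idea, the sketch below shows one common way ALiBi interpolation is implemented: dividing each head's slope by the interpolation factor stretches the linear distance penalty, so a token twice as far away receives the original bias, while any further length increase leans on ALiBi's native ability to extrapolate. The function names here are hypothetical, and the slope-halving interpretation of "interpolated by 2x" is an assumption, not the model authors' confirmed implementation.

```python
import numpy as np

def alibi_slopes(n_heads: int) -> np.ndarray:
    # Standard ALiBi per-head slopes: 2^(-8*1/n), 2^(-8*2/n), ..., 2^(-8)
    return np.array([2.0 ** (-8.0 * (h + 1) / n_heads) for h in range(n_heads)])

def alibi_bias(seq_len: int, n_heads: int, interpolation: float = 2.0) -> np.ndarray:
    # Assumed scheme: dividing slopes by the interpolation factor
    # stretches the bias so 2x the distance gets the original penalty;
    # context beyond that relies on ALiBi's built-in extrapolation.
    slopes = alibi_slopes(n_heads) / interpolation
    rel = np.arange(seq_len)[None, :] - np.arange(seq_len)[:, None]  # k - q
    rel = np.minimum(rel, 0)  # causal: only past positions are penalized
    return slopes[:, None, None] * rel[None, :, :]  # shape (heads, q_len, k_len)

bias = alibi_bias(seq_len=16, n_heads=8)
```

The resulting tensor is added to the pre-softmax attention scores; no weights change, which is why this kind of context extension can be applied to an already-trained checkpoint.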