EXL2 Quantization of l2-13b-thespurral-v1.
GGUF here
Model details
Quantized at 5.33bpw and 6.13bpw
In my experience this model is very good for role-playing; I liked it a lot. Visit the original model repo for more details.
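As a quick-start, here is a minimal sketch of loading one of these EXL2 quants with the exllamav2 library, assuming you have downloaded the files locally (the model path and generation settings below are placeholders; adjust them to your setup):

```python
# Minimal sketch: load an EXL2 quant and run a short generation with exllamav2.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Placeholder path to the downloaded 5.33bpw (or 6.13bpw) quant directory.
config = ExLlamaV2Config()
config.model_dir = "models/l2-13b-thespurral-v1-exl2-5.33bpw"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache lets load_autosplit split across GPUs
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

# Example sampling settings; tune to taste for role-play.
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

output = generator.generate_simple("Describe your character in one paragraph:", settings, 200)
print(output)
```

This only covers basic loading and sampling; for prompt formats and recommended settings, refer to the original model repo.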