Taiwan LLM, based on LLaMA 2 7B

Continually pretrained on 20 billion tokens of Traditional Chinese text and instruction fine-tuned on millions of conversations.

This version does NOT include Common Crawl data.

🌟 Check out the new Taiwan-LLM UI 🌟

Collaboration with Ubitus K.K. 💪💪💪

ๆœฌ้ …็›ฎ่ˆ‡ Ubitus K.K. ๅˆไฝœ้€ฒ่กŒใ€‚Ubitus ็‚บๆœฌ้ …็›ฎๆไพ›ๅฏถ่ฒด็š„ๆŠ€่ก“ๆ”ฏๆŒๅ’Œ่จˆ็ฎ—่ณ‡ๆบใ€‚

Taiwan LLM v2 is conducted in collaboration with Ubitus K.K.. Ubitus provides valuable technical support and compute resources for the project.