Taiwan LLM based on LLaMA-2-7b
Continually pretrained on 20 billion tokens of Traditional Chinese text, then instruction fine-tuned on millions of conversations.
This version does NOT include Common Crawl data.
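As a rough sketch of how an instruction-tuned chat checkpoint like this is typically prompted, the snippet below builds a single-turn prompt in the LLaMA-2 chat convention (`[INST]`/`<<SYS>>` markers). Note this is an assumption for illustration: Taiwan LLM's actual prompt template is not specified in this card and may differ, and the system/user strings are placeholders.

```python
# Hypothetical prompt formatting following Meta's LLaMA-2 chat convention.
# Taiwan LLM's real template may differ (assumption for illustration only).

def build_prompt(system: str, user: str) -> str:
    """Wrap a system message and one user turn in LLaMA-2 chat markers."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

# Example (placeholder strings, not the model's actual system prompt):
prompt = build_prompt(
    "你是一個來自台灣的AI助理。",  # "You are an AI assistant from Taiwan."
    "台灣最高的山是哪座?",        # "What is the tallest mountain in Taiwan?"
)
```

The resulting string would then be tokenized and passed to the model for generation; the model's reply is expected after the closing `[/INST]` marker.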
Check out the new Taiwan-LLM UI
Collaboration with Ubitus K.K.
Taiwan LLM v2 was developed in collaboration with Ubitus K.K., which provided valuable technical support and compute resources for the project.