# HawkLM-Chat-demo

<p align="center"> <a href="https://huggingface.co/Rexopia/HawkLM-demo">HawkLM-demo 🤗</a>&nbsp;|&nbsp;<a href="https://huggingface.co/Rexopia/HawkLM-Chat-demo">HawkLM-Chat-demo 🤗</a> </p>

## Model Details

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code=True is required because the checkpoint ships custom modeling code.
tokenizer = AutoTokenizer.from_pretrained("Rexopia/HawkLM-Chat-demo", trust_remote_code=True)
# device_map="auto" places the weights on the available GPU(s), falling back to CPU.
model = AutoModelForCausalLM.from_pretrained("Rexopia/HawkLM-Chat-demo", device_map="auto", trust_remote_code=True)
```
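
Once the model is loaded, a quick greedy generation works as a sanity check. The plain-text prompt below is an assumption for this sketch; the demo checkpoint may expect a specific chat prompt format, so adapt it to your use case.

```python
# Minimal generation sketch; the plain-text prompt is an assumption, as the
# chat template for this demo checkpoint is not documented here.
prompt = "Explain in one sentence what a hawk is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```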

## Training Data

We sampled an English-only corpus from the RedPajama-1T dataset, excluding the ArXiv and GitHub subsets. For this demo release, the model was trained on only 3.3 billion tokens.
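
For illustration only, the sketch below shows one way to pull comparable data with the 🤗 `datasets` library. The Hub path `togethercomputer/RedPajama-Data-1T` and the subset names are assumptions about how RedPajama-1T is hosted, not a record of our actual sampling pipeline (which also applied English-only filtering).

```python
# Illustrative sketch only: assumes RedPajama-1T is hosted on the Hub as
# "togethercomputer/RedPajama-Data-1T" with per-source configs; the actual
# sampling and English-only filtering used for HawkLM is not reproduced here.
from datasets import load_dataset

subsets = ["common_crawl", "c4", "wikipedia", "book", "stackexchange"]  # ArXiv and GitHub left out

for name in subsets:
    stream = load_dataset("togethercomputer/RedPajama-Data-1T", name, split="train", streaming=True)
    example = next(iter(stream))       # peek at one document from each source
    print(name, example["text"][:80])  # RedPajama rows expose the raw document under "text"
```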

## Evaluation

[More Information Needed]

## Citation

[More Information Needed]

## Model Card Contact

[More Information Needed]