<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1592314373317558274/kWBIBveR_400x400.jpg')"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1569305276343369729/9tyrIoYq_400x400.jpg')"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1423875044598456321/SVjwd6Bb_400x400.jpg')"> </div> </div> <div style="text-align: center; font-size: 16px; font-weight: 800">Layah Heilpern & Dovey "Rug The Fiat" Wan & Irene Zhao</div> <div style="text-align: center; font-size: 14px;">@doveywan-irenezhao_-layahheilpern</div> </div>
## How does it work?
The model uses the following pipeline: tweets are downloaded and filtered to build the training data, then a pre-trained GPT-2 is fine-tuned on them, with every step tracked in W&B.
## Training data
The model was trained on tweets from Layah Heilpern & Dovey "Rug The Fiat" Wan & Irene Zhao.
| Data | Layah Heilpern | Dovey "Rug The Fiat" Wan | Irene Zhao |
| --- | --- | --- | --- |
| Tweets downloaded | 3249 | 3247 | 1945 |
| Retweets | 115 | 310 | 223 |
| Short tweets | 1453 | 269 | 417 |
| Tweets kept | 1681 | 2668 | 1305 |
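The "Tweets kept" row is what remains after retweets and very short tweets are filtered out (e.g. 3249 - 115 - 1453 = 1681 for Layah Heilpern). Below is a minimal sketch of that filtering, assuming tweets are plain strings, that retweets start with `RT @`, and a hypothetical `min_words` threshold; the exact huggingtweets rules may differ:

```python
# Sketch of the filtering step; the retweet check and word threshold are
# assumptions for illustration, not the exact huggingtweets implementation.
def keep_tweet(text: str, min_words: int = 4) -> bool:
    if text.startswith("RT @"):        # drop retweets
        return False
    if len(text.split()) < min_words:  # drop short tweets
        return False
    return True

tweets = [
    "RT @someone: gm",                                  # retweet -> dropped
    "wagmi",                                            # short tweet -> dropped
    "In crypto, conviction matters more than timing",   # kept
]
kept = [t for t in tweets if keep_tweet(t)]
print(len(kept))  # 1
```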
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
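If you want to pull those artifacts programmatically, the W&B client can download them. A minimal sketch, with a placeholder entity/project/artifact path that you would replace with the one from the linked run:

```python
import wandb

# "huggingtweets/huggingtweets/tweets:latest" is a hypothetical path;
# substitute the entity/project/artifact names from the actual W&B run.
api = wandb.Api()
artifact = api.artifact("huggingtweets/huggingtweets/tweets:latest")
data_dir = artifact.download()  # fetches the versioned tweet files locally
print(data_dir)
```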
## Training procedure
The model is based on a pre-trained GPT-2, which is fine-tuned on @doveywan-irenezhao_-layahheilpern's tweets.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
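For illustration, here is a minimal sketch of what that fine-tuning step can look like with the `transformers` Trainer; the corpus, epochs, and batch size below are stand-ins, not the values from the actual run:

```python
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Stand-in corpus: in the real pipeline this is the "Tweets kept" set above.
texts = ["gm", "In crypto, conviction matters more than timing"]

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

train_dataset = Dataset.from_dict(dict(tokenizer(texts, truncation=True)))

args = TrainingArguments(
    output_dir="out",
    num_train_epochs=1,             # illustrative values only
    per_device_train_batch_size=2,
    report_to="wandb",              # record hyperparameters and metrics in W&B
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    # Causal LM objective: the collator derives labels from input_ids.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("out/final")  # the final model that gets logged and versioned
```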
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/doveywan-irenezhao_-layahheilpern')
generator("In crypto, ", num_return_sequences=5)
```
## Limitations and bias
The model suffers from the same limitations and biases as GPT-2. In addition, the content of these users' tweets further shapes the text the model generates.
## About
Built by Gigabrain