Text Generation

Software

| Name | Description | Status | Language | License |
|------|-------------|--------|----------|---------|
| TTG | Thai Text Generator | active | Python 3.X | Apache License 2.0 |

Pretrained

| Name | Detail | Owner | Download |
|------|--------|-------|----------|
| Flax's GPT-2 base | GPT-2 Base Thai is a causal language model based on the OpenAI GPT-2 architecture. It was trained from scratch on the unshuffled_deduplicated_th subset of the OSCAR dataset, reaching an evaluation loss of 1.708 and an evaluation perplexity of 5.516. | Flax Community | Hugging Face |
| GPT-Neo | GPT-Neo 1.3B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 1.3B is the number of parameters of this particular pretrained model. (It was not trained on Thai specifically, but it can still work with Thai text.) | EleutherAI | Hugging Face |
| Thai GPT Next | A GPT-Neo model fine-tuned for the Thai language. | Wannaphong Phatthiyaphaibun | GitHub |
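The causal language models above can be loaded with the Hugging Face `transformers` library. A minimal sketch, assuming the Flax GPT-2 Base Thai checkpoint is published under the repo ID `flax-community/gpt2-base-thai` (this ID is an assumption, not confirmed by the table; check the model card on Hugging Face for the exact name):

```python
# Hedged sketch: generating Thai text with a pretrained causal LM from
# Hugging Face. Requires the `transformers` and `torch` packages.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repo name for Flax's GPT-2 Base Thai; verify
# against the actual model card before use.
MODEL_ID = "flax-community/gpt2-base-thai"


def generate(prompt: str, max_new_tokens: int = 30) -> str:
    """Continue `prompt` with up to `max_new_tokens` sampled tokens."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Downloads the checkpoint on first run.
    print(generate("ประเทศไทย"))
```

The same pattern works for GPT-Neo and its Thai fine-tunes: only `MODEL_ID` changes, since `AutoModelForCausalLM` resolves the correct architecture from the model's config.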