| Model | Description | Paper | Link |
| --- | --- | --- | --- |
| mT5: Multilingual T5 | Multilingual T5 (mT5) is a massively multilingual pre-trained text-to-text transformer, trained with a recipe similar to T5. | mT5: A massively multilingual pre-trained text-to-text transformer | GitHub |
| BertSum | Trained model by Nakhun Chumpolsathien & Tanachat Arayachutinan | Using Knowledge Distillation from Keyword Extraction to Improve the Informativeness of Neural Cross-lingual Summarization | GitHub |
| ARedSum | Trained model by Nakhun Chumpolsathien & Tanachat Arayachutinan | Using Knowledge Distillation from Keyword Extraction to Improve the Informativeness of Neural Cross-lingual Summarization | GitHub |
| TNCLS | Trained model from the ThaiCrossSum Corpora by Nakhun Chumpolsathien | | GitHub |
| CLS+MS | Trained model from the ThaiCrossSum Corpora by Nakhun Chumpolsathien | | GitHub |
| CLS+MT | Trained model from the ThaiCrossSum Corpora by Nakhun Chumpolsathien | | GitHub |
| XLS – RL-ROUGE | Trained model from the ThaiCrossSum Corpora by Nakhun Chumpolsathien | | GitHub |
| mt5-cpe-kmutt-thai-sentence-sum | Fine-tuned mT5-base model for Thai sentence summarization (see the usage sketch below the table) | | Hugging Face |
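
For the Hugging Face entry, the model can be loaded with the `transformers` library in the usual text-to-text way. The snippet below is a minimal sketch, not official usage: the Hub repository ID, input prefix, and generation settings are assumptions and should be verified against the model card.

```python
# Minimal sketch of running the mt5-cpe-kmutt-thai-sentence-sum summarizer with transformers.
# The Hub repository ID and generation settings are assumptions; check the model card for the
# correct ID and any recommended input prefix before use.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "thanathorn/mt5-cpe-kmutt-thai-sentence-sum"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "ข้อความภาษาไทยที่ต้องการสรุป"  # placeholder Thai sentence to summarize

inputs = tokenizer(text, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_length=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

The same `AutoTokenizer` / `AutoModelForSeq2SeqLM` pattern applies to other mT5-based checkpoints in the table, since they all follow the text-to-text generation interface.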