1. DistilBert for Chinese: a DistilBert model distilled from a BERT pre-trained on a large-scale Chinese corpus
Target release date: Dec 16.
Planned contents:
1.1 A downloadable pre-trained Chinese DistilBert model, which can be used directly or further trained on your own corpus;
1.2 Fine-tuning examples and code for downstream tasks, covering three ChineseGLUE (CLUE) tasks;
1.3 A small-model benchmark: performance comparison with albert_tiny and ERNIE.
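For readers unfamiliar with distillation, the core idea behind DistilBert-style training is to combine a soft-target loss (KL divergence between teacher and student distributions at a temperature T) with the usual hard-label cross-entropy. The sketch below is illustrative only, written in NumPy under that standard formulation; it is not the actual training code of this release, and all function names and parameters are our own.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; T > 1 softens the distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Illustrative distillation objective (names are hypothetical).

    alpha weights the soft (teacher-matching) term against the
    hard (ground-truth) cross-entropy term.
    """
    # Soft loss: KL(teacher || student) at temperature T, scaled by T^2
    # so gradients keep a comparable magnitude across temperatures.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean() * T * T

    # Hard loss: cross-entropy with the ground-truth labels at T = 1.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

    return alpha * soft + (1 - alpha) * hard
```

When the student exactly matches the teacher, the soft term vanishes and only the hard cross-entropy remains, which is one quick sanity check for an implementation like this.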