Status: Archive (code is provided as-is, no updates expected)
BERT-keras
Keras implementation of Google BERT (Bidirectional Encoder Representations from Transformers) and OpenAI's Transformer LM, capable of loading pretrained models with a fine-tuning API.
Update: TPU support is available for both inference and training.
How to use it?
```python
# this is pseudo code; you can read an actual working example in tutorial.ipynb or the colab notebook
text_encoder = MyTextEncoder(**my_text_encoder_params)  # you create a text encoder (sentence piece and openai's bpe are included)
lm_generator = lm_generator(text_encoder, **lm_generator_params)  # this is essentially your data reader (single sentence and double sentence readers with masking are included)
```
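The data reader above yields masked training examples for the language model. As a rough illustration of what BERT-style masking means (not this repo's actual implementation; `MASK_ID` and the 15% rate are illustrative assumptions), a minimal self-contained sketch might look like:

```python
import random

MASK_ID = 103  # hypothetical [MASK] token id; the real value depends on the vocabulary


def mask_for_lm(token_ids, mask_rate=0.15, seed=0):
    """BERT-style masking sketch: hide a fraction of tokens and keep the
    originals as prediction targets (-1 marks positions the loss ignores)."""
    rng = random.Random(seed)
    n_mask = max(1, int(len(token_ids) * mask_rate))
    positions = set(rng.sample(range(len(token_ids)), n_mask))
    inputs, targets = [], []
    for i, tok in enumerate(token_ids):
        if i in positions:
            inputs.append(MASK_ID)  # replace the token with [MASK]
            targets.append(tok)     # the model must predict the original id
        else:
            inputs.append(tok)
            targets.append(-1)      # not a prediction target
    return inputs, targets


inputs, targets = mask_for_lm(list(range(20)))
```

The real generator also handles double-sentence inputs and next-sentence labels; this sketch only shows the token-masking idea.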