Pre-train
About Pre-trained Models
1. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
2. Language Models are Unsupervised Multitask Learners
3. ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
4. XLNet: Generalized Autoregressive Pretraining for Language Understanding
5. Cross-lingual Language Model Pretraining
6. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
7. DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation
8. PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization
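The papers above all describe ways of pre-training Transformer language models on unlabeled text and then reusing the learned weights. As a minimal illustration of what a pre-trained masked language model such as BERT (paper 1) provides, the sketch below loads a public checkpoint and fills in a masked token. It assumes the Hugging Face `transformers` package and the `bert-base-uncased` model name, neither of which is mentioned in the list; they are illustrative choices only.

```python
# Minimal sketch, assuming the Hugging Face `transformers` package is installed.
# The library and checkpoint name are demonstration choices, not part of the reading list.
from transformers import pipeline

# Load a publicly released BERT checkpoint pre-trained with masked language modeling.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Ask the model to predict the token hidden behind [MASK]; each prediction is a
# dict containing the candidate token string and its probability score.
for prediction in fill_mask("Pre-training helps models learn general [MASK] representations."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The same loading pattern applies to the other models in the list (GPT-2, ELECTRA, XLNet, XLM, BART, DialoGPT, PEGASUS); only the task head and checkpoint name change.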