Pre-train
小鹤 · July 2021
About Pre-trained Models
1. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
2. Language Models are Unsupervised Multitask Learners
3. ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
4. XLNet: Generalized Autoregressive Pretraining for Language Understanding
5. Cross-lingual Language Model Pretraining
6. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
7. DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation
8. PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization
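Several of the papers above build on the masked-language-modeling objective that BERT introduced: hide some tokens, then train a bidirectional encoder to recover them. As a quick illustration (not part of the original notes), here is a minimal sketch using the Hugging Face transformers fill-mask pipeline with the public bert-base-uncased checkpoint; both the library and the checkpoint are illustrative choices, not something these notes prescribe.

```python
# Minimal sketch of BERT-style masked-language-model inference.
# Assumes the Hugging Face `transformers` library and the public
# `bert-base-uncased` checkpoint (illustrative choices).
from transformers import pipeline

# The fill-mask pipeline loads a pre-trained MLM head plus its tokenizer.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT was pre-trained to recover tokens hidden behind [MASK],
# so it can rank plausible completions for the masked position.
for prediction in unmasker("Pre-training on [MASK] text improves downstream tasks."):
    print(f"{prediction['token_str']!r}  score={prediction['score']:.3f}")
```

Each entry returned by the pipeline carries the predicted token string and its probability, which makes the bidirectional nature of the objective easy to see: the model uses context on both sides of the mask, unlike the left-to-right language models of the GPT-2 paper listed above.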