0. Surveys, Progress, and Summaries

Fudan's Xipeng Qiu team: the latest Transformer survey!
Professor Xipeng Qiu's team at Fudan University: the latest Transformer survey
An NLP hands-on handbook: a guide to applications of Transformer-based deep learning architectures (survey)

A must-read: methods for reducing the complexity of the Transformer
A Transformer summary, 2022 edition
An excellent year-in-review of the Transformer
A one-article rundown of the Visual Transformer: where does ViT beat CNNs?

1. Transformer Basics

Transformer:Attention Is All You Need

  • Ashish Vaswani, Noam Shazeer, Niki Parmar, et al. Attention Is All You Need[C]. In NeurIPS 2017.
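
The core operation of the paper above is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. The snippet below is a minimal sketch for illustration only; the function and variable names are assumptions, not the authors' code.

```python
# A minimal NumPy sketch of scaled dot-product attention from "Attention Is All You Need".
# Shapes and names are illustrative assumptions, not the paper's released code.
import numpy as np

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise similarity, scaled by sqrt(d_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)     # block disallowed positions before softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key dimension
    return weights @ V                            # attention-weighted sum of values

# Toy self-attention: 4 tokens with d_model = 8, so Q = K = V = the token embeddings.
x = np.random.randn(4, 8)
print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)
```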

A complete walkthrough of the Transformer code!
The Illustrated Transformer (full version)!
Revisiting the Transformer
[Practical Foundations] The Transformer
The Transformer explained in detail from a matrix perspective (with code)
[The Class Rep Is Here] Reading papers with Mu Li: the Transformer
Deep Learning Basics | An extremely detailed, step-by-step illustrated Transformer

2. Transformer Q&A

The Transformer in ten questions and answers
How interviewers ask about the Transformer
An overview of the Transformer in 10 key questions

3. Transformers for Recommendation

AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks

  • Weiping Song, Chence Shi, Zhiping Xiao, et al. AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks[C]. In CIKM 2019.
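
As a rough sketch of the idea behind AutoInt above (treat each feature field as a token and let multi-head self-attention learn feature interactions), the PyTorch snippet below is illustrative only: the class name, sizes, and the single attention block are assumptions, not the authors' implementation.

```python
# An illustrative, simplified AutoInt-style block: self-attention over feature-field
# embeddings followed by a residual connection and a CTR-style prediction head.
import torch
import torch.nn as nn

class AutoIntBlock(nn.Module):
    def __init__(self, num_fields, field_vocab_size, d_model=16, nhead=2):
        super().__init__()
        self.field_emb = nn.Embedding(field_vocab_size, d_model)
        self.attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.out = nn.Linear(num_fields * d_model, 1)   # score head over all fields

    def forward(self, field_ids):
        # field_ids: (batch, num_fields) of categorical feature ids
        e = self.field_emb(field_ids)                 # (B, F, d) one embedding per field
        h, _ = self.attn(e, e, e)                     # learn interactions across fields
        h = torch.relu(h + e)                         # residual connection (simplified)
        return torch.sigmoid(self.out(h.flatten(1)))  # (B, 1) predicted click probability

model = AutoIntBlock(num_fields=5, field_vocab_size=100)
p = model(torch.randint(0, 100, (4, 5)))
print(p.shape)  # torch.Size([4, 1])
```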

Personalized Re-ranking for Recommendation

  • Fei Sun, Xiao Lin, Hanxiao Sun, et al. Personalized Re-ranking for Recommendation[C]. In RecSys 2019.

SDM: Sequential Deep Matching Model for Online Large-scale Recommender System

  • Fuyu Lv, Taiwei Jin, Changlong Yu, et al. SDM: Sequential Deep Matching Model for Online Large-scale Recommender System[C]. In CIKM 2019.

BST: Behavior Sequence Transformer for E-commerce Recommendation in Alibaba

  • Qiwei Chen, Huan Zhao, Wei Li, et al. Behavior Sequence Transformer for E-commerce Recommendation in Alibaba[C]. In DLP-KDD 2019.

BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer

  • Fei Sun, Jun Liu, Jian Wu, et al. BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer[C]. In CIKM 2019.
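
BST and BERT4Rec above share a common pattern: embed the user's item-interaction sequence, add positional information, encode it with a Transformer, and score candidate items. The PyTorch sketch below illustrates only that shared pattern, under assumed names and sizes; it is not either paper's implementation (BERT4Rec, for example, additionally trains with a Cloze-style masked-item objective).

```python
# A minimal sketch (assumptions, not the authors' code) of a BST / BERT4Rec-style
# sequential recommender: item + position embeddings -> Transformer encoder -> item scores.
import torch
import torch.nn as nn

class SequenceRecommender(nn.Module):
    def __init__(self, num_items, d_model=64, nhead=2, num_layers=2, max_len=50):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)  # id 0 = padding
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead,
                                           dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, item_seq):
        # item_seq: (batch, seq_len) of item ids, 0 where padded
        positions = torch.arange(item_seq.size(1), device=item_seq.device)
        h = self.item_emb(item_seq) + self.pos_emb(positions)        # (B, L, d)
        h = self.encoder(h, src_key_padding_mask=item_seq.eq(0))     # ignore padded steps
        user_repr = h[:, -1]                                         # last position as user state
        return user_repr @ self.item_emb.weight.T                    # scores over all items

model = SequenceRecommender(num_items=1000)
scores = model(torch.randint(1, 1001, (8, 20)))  # batch of 8 users, 20 interactions each
print(scores.shape)  # torch.Size([8, 1001])
```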

SSE-PT: Sequential Recommendation Via Personalized Transformer

  • Liwei Wu, Shuqing Li, Cho-Jui Hsieh, et al. SSE-PT: Sequential Recommendation Via Personalized Transformer[C]. In RecSys 2020.

DMT: Deep Multifaceted Transformers for Multi-objective Ranking in Large-Scale E-commerce Recommender Systems

  • Yulong Gu, Zhuoye Ding, Shuaiqiang Wang, et al. Deep Multifaceted Transformers for Multi-objective Ranking in Large-Scale E-commerce Recommender Systems[C]. In CIKM 2020.

Multiplex Behavioral Relation Learning for Recommendation via Memory Augmented Transformer Network

  • Lianghao Xia, Chao Huang, Yong Xu, et al. Multiplex Behavioral Relation Learning for Recommendation via Memory Augmented Transformer Network[C]. In SIGIR 2020.

Knowledge-Enhanced Hierarchical Graph Transformer Network for Multi-Behavior Recommendation

  • Lianghao Xia, Chao Huang, Yong Xu, et al. Knowledge-Enhanced Hierarchical Graph Transformer Network for Multi-Behavior Recommendation[C]. In AAAI 2021.

Personalized Transformer for Explainable Recommendation

  • Lei Li, Yongfeng Zhang, Li Chen. Personalized Transformer for Explainable Recommendation[C]. In ACL 2021.

Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting

  • Haoyi Zhou, Shanghang Zhang, Jieqi Peng, et al. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting[C]. In AAAI 2021.

Pre-training Graph Transformer with Multimodal Side Information for Recommendation

  • Yong Liu, Susen Yang, Chenyi Lei, et al. Pre-training Graph Transformer with Multimodal Side Information for Recommendation[C]. In ACM Multimedia 2021.

Transformers4Rec: Bridging the Gap between NLP and Sequential / Session-Based Recommendation

  • Gabriel de Souza P. Moreira, Sara Rabhi, Jeong Min Lee, et al. Transformers4Rec: Bridging the Gap between NLP and Sequential / Session-Based Recommendation[C]. In RecSys 2021.

Continuous-Time Sequential Recommendation with Temporal Graph Collaborative Transformer

  • Ziwei Fan, Zhiwei Liu, Jiawei Zhang, et al. Continuous-Time Sequential Recommendation with Temporal Graph Collaborative Transformer[C]. In CIKM 2021.

Augmenting Sequential Recommendation with Pseudo-Prior Items via Reversely Pre-training Transformer

  • Zhiwei Liu, Ziwei Fan, Yu Wang, et al. Augmenting Sequential Recommendation with Pseudo-Prior Items via Reversely Pre-training Transformer[C]. In SIGIR 2021.