Deep Learning Model Compression

Brief Summary

    I recently read a number of papers on knowledge distillation; here is a brief summary:

    The summary mind map (exported image): 蒸馏小结.png

    The original mind-map document: http://naotu.baidu.com/file/f60fea22a9ed0ea7236ca9a70ff1b667?token=dab31b70fffa034a (kdxj)

    Below are the slides I made, converted here to PDF:

    蒸馏小结.pdf

    Google Slides: https://docs.google.com/presentation/d/e/2PACX-1vSsa5X_zfuJUPgxUL7vu8MHbkj3JnUzIlKbf-eXkYivhwiFZRVx_NqhSxBbYDu-1c2D7ucBX_Rlf9kD/pub?start=false&loop=false&delayms=3000
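
    For reference, most of the papers listed under Knowledge Distillation below build on or extend the classic soft-target loss from Hinton et al., "Distilling the Knowledge in a Neural Network". The sketch below is a minimal PyTorch-style version of that loss, assuming a standard single-label classification setup; the function name and the temperature/weight values are illustrative choices, not taken from any of the listed papers.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style knowledge distillation loss: a weighted sum of the
    temperature-softened KL term (teacher -> student) and the ordinary
    hard-label cross-entropy. T and alpha are illustrative values."""
    # Soften both distributions with temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    student_log_probs = F.log_softmax(student_logits / T, dim=1)
    # KL divergence between the softened distributions; the T^2 factor
    # keeps gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(student_log_probs, soft_targets,
                         reduction="batchmean") * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage: distill a 10-class "teacher" into a "student" on one random batch.
if __name__ == "__main__":
    student_logits = torch.randn(8, 10, requires_grad=True)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(loss.item())
```

    In practice T and alpha are tuned per task: a larger T gives softer teacher distributions, and alpha trades off imitating the teacher against fitting the hard labels.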

    • Related Materials
      • A Survey of Model Compression and Acceleration for Deep Neural Networks
    • Knowledge Distillation
      • Brief Summary (this page)
      • (ICLR 2015) FitNets: Hints for Thin Deep Nets
      • (ICLR 2017) Paying More Attention to Attention
      • (CVPR 2017) Mimicking Very Efficient Network for Object Detection
      • (CVPR 2017) A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning
      • (arXiv 2017) Like What You Like - Knowledge Distill via Neuron Selectivity Transfer
      • (arXiv 2017) Data Distillation: Towards Omni-Supervised Learning
      • (NIPS 2017) Learning Efficient Object Detection Models with Knowledge Distillation
      • (CVPR 2018) Deep Mutual Learning
      • (NeurIPS 2018) Paraphrasing Complex Network: Network Compression via Factor Transfer
      • (ICCV 2019) Learning Lightweight Lane Detection CNNs by Self Attention Distillation
      • (BMVC 2019) Graph-based Knowledge Distillation by Multi-head Attention Network
      • (ICCV 2019) Distilling Knowledge From a Deep Pose Regressor Network
      • (ICCV 2019) Similarity-Preserving Knowledge Distillation
      • (CVPR 2019) Knowledge Adaptation for Efficient Semantic Segmentation
      • (CVPR 2019) Relational Knowledge Distillation
      • (CVPR 2019) Structured Knowledge Distillation for Semantic Segmentation
      • (TIP 2021) Double Similarity Distillation for Semantic Image Segmentation
      • (TIP 2022) Spot-adaptive Knowledge Distillation
      • (CVPR 2022) Cross-Image Relational Knowledge Distillation for Semantic Segmentation
    • Pruning
      • CondenseNet: An Efficient DenseNet using Learned Group Convolutions
    • Compact Model
      • ENet