BERT and Knowledge Distillation
有组织在! (https://uzzz.org/article/3892/), Sun, 23 Feb 2020

Knowledge Distillation

Knowledge distillation (KD) aims to transfer the "dark knowledge" of a complex model (the teacher network) into a simpler model (the student network). In general, the teacher network has high capacity and strong performance, while the student network is far more compact. Through distillation, the hope is that the student network approaches, or even surpasses, the teacher, so that a lower-complexity model achieves similar predictive quality. Hinton, in Distilling…
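The idea above can be sketched as a loss function: the student is trained against the teacher's temperature-softened output distribution in addition to the ground-truth labels. This is a minimal NumPy illustration; the function names, the temperature `T`, and the mixing weight `alpha` are illustrative choices, not values taken from the post.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's relative preferences among wrong classes.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft term: cross-entropy between teacher and student at temperature T.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft_ce = -np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1).mean()
    # Hard term: standard cross-entropy against the true labels (T = 1).
    p_hard = softmax(student_logits)
    hard_ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    # T^2 rescales soft-target gradients so the two terms stay comparable.
    return alpha * (T ** 2) * soft_ce + (1 - alpha) * hard_ce
```

In practice only the `soft_ce` term depends on the teacher; setting `alpha` closer to 1 leans on the teacher's dark knowledge, while `alpha = 0` recovers ordinary supervised training of the student.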
