Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher

Oct 20, 2020
Guangda Ji, Zhanxing Zhu
