
Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization

Apr 08, 2019
Yangyang Shi, Mei-Yuh Hwang, Xin Lei, Haoyu Sheng
