Sunghwan Ahn

HILCodec: High Fidelity and Lightweight Neural Audio Codec
May 08, 2024
Sunghwan Ahn, Beom Jun Woo, Min Hyun Han, Chanyeong Moon, Nam Soo Kim

EM-Network: Oracle Guided Self-distillation for Sequence Learning
Jun 14, 2023
Ji Won Yoon, Sunghwan Ahn, Hyeonseung Lee, Minchan Kim, Seok Min Kim, Nam Soo Kim


Inter-KD: Intermediate Knowledge Distillation for CTC-Based Automatic Speech Recognition
Nov 28, 2022
Ji Won Yoon, Beom Jun Woo, Sunghwan Ahn, Hyeonseung Lee, Nam Soo Kim


Transfer Learning Framework for Low-Resource Text-to-Speech using a Large-Scale Unlabeled Speech Corpus
Mar 29, 2022
Minchan Kim, Myeonghun Jeong, Byoung Jin Choi, Sunghwan Ahn, Joun Yeop Lee, Nam Soo Kim


Oracle Teacher: Towards Better Knowledge Distillation
Nov 05, 2021
Ji Won Yoon, Hyung Yong Kim, Hyeonseung Lee, Sunghwan Ahn, Nam Soo Kim
