AdaKD: Dynamic Knowledge Distillation of ASR models using Adaptive Loss Weighting https://arxiv.org/abs/2405.08019 #cs.LG #cs.AI