Zhou H, Zheng X, Wang Y, et al., GhostRNN: Reducing State Redundancy in RNN with Cheap Operations, INTERSPEECH 2023. (top-tier international conference)

Abstract: Recurrent neural networks (RNNs) that are capable of capturing long-range dependencies are widely utilized in numerous speech-related tasks, such as keyword spotting (KWS) and speech enhancement (SE). Given the constraints on power and memory in low-resource devices, there is an urgent need for efficient RNN models suitable for practical applications. This paper introduces an efficient RNN architecture known as GhostRNN, which mitigates hidden state redundancy through minimal computational operations. Specifically, it is observed that certain dimensions within the hidden states of trained RNN models are similar to others, indicating the presence of redundancy. To address this redundancy and consequently reduce computational expenses, the authors propose a method that initially generates a limited number of intrinsic states, followed by the application of inexpensive operations to derive ghost states from these intrinsic states. Experimental results from tasks in keyword spotting and speech enhancement demonstrate that the proposed GhostRNN architecture achieves significant reductions in memory usage (approximately 40%) and computational cost, all while maintaining comparable performance levels.
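The following is a minimal sketch of the ghost-state idea as described in the abstract, assuming a GRU-style cell. The class name GhostGRUCell, the ratio parameter, and the choice of a single linear layer with tanh as the "cheap operation" are illustrative assumptions, not the authors' exact implementation: only a small "intrinsic" slice of the state is updated by the full recurrent cell, and the remaining "ghost" dimensions are derived from it cheaply.

```python
import torch
import torch.nn as nn


class GhostGRUCell(nn.Module):
    """Sketch: GRU cell whose hidden state is split into intrinsic and ghost parts."""

    def __init__(self, input_size, hidden_size, ratio=0.5):
        super().__init__()
        # Only the intrinsic part of the state is produced by a full GRU update.
        self.intrinsic_size = int(hidden_size * ratio)
        self.hidden_size = hidden_size
        self.cell = nn.GRUCell(input_size, self.intrinsic_size)
        # Cheap operation (assumed here to be a small linear map) that derives
        # the ghost states from the intrinsic states.
        self.cheap = nn.Linear(self.intrinsic_size, hidden_size - self.intrinsic_size)

    def forward(self, x, h):
        # The recurrent update only reads and writes the intrinsic slice of the state.
        h_intrinsic = self.cell(x, h[:, : self.intrinsic_size])
        # Ghost states are generated from the intrinsic states with the cheap transform.
        h_ghost = torch.tanh(self.cheap(h_intrinsic))
        return torch.cat([h_intrinsic, h_ghost], dim=-1)


# Example: 8 frames of 40-dim features, 128-dim state (64 intrinsic + 64 ghost).
cell = GhostGRUCell(input_size=40, hidden_size=128, ratio=0.5)
h = torch.zeros(8, 128)
h = cell(torch.randn(8, 40), h)
print(h.shape)  # torch.Size([8, 128])
```

Because the full recurrent weight matrices only act on the intrinsic slice, both the parameter count and per-step computation shrink roughly with the ratio of intrinsic to total dimensions, which is consistent with the memory and compute savings reported in the abstract.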

Summary: This paper introduces an efficient recurrent neural network (RNN) architecture named GhostRNN, which reduces hidden-state redundancy with cheap operations. The authors observe that in trained RNN models, some hidden-state dimensions are similar to others, indicating redundancy in the RNN. To remove this redundancy and lower computational cost, the paper proposes first generating a small number of intrinsic states and then producing ghost states from these intrinsic states with cheap operations. Experiments on keyword spotting (KWS) and speech enhancement (SE) tasks show that GhostRNN significantly reduces memory usage (by about 40%) and computational cost while maintaining comparable performance.
