The recurrent neural networks introduced so far in this chapter all have a single, unidirectional hidden layer. In deep learning applications, we often use recurrent neural networks with multiple hidden layers, also called deep recurrent neural networks. Figure 6.11 illustrates a deep recurrent neural network with $L$ hidden layers: each hidden state is continually passed to the next time step of the current layer and to the next layer of the current time step.
Specifically, at time step $t$, let the minibatch input be $\boldsymbol{X}_t \in \mathbb{R}^{n \times d}$ (number of examples: $n$, number of inputs: $d$), let the hidden state of the $\ell$-th hidden layer ($\ell = 1, \ldots, L$) be $\boldsymbol{H}_t^{(\ell)} \in \mathbb{R}^{n \times h}$ (number of hidden units: $h$), let the output layer variable be $\boldsymbol{O}_t \in \mathbb{R}^{n \times q}$ (number of outputs: $q$), and let the activation function of the hidden layers be $\phi$. The hidden state of the first hidden layer is computed as before:
$$\boldsymbol{H}_t^{(1)} = \phi(\boldsymbol{X}_t \boldsymbol{W}_{xh}^{(1)} + \boldsymbol{H}_{t-1}^{(1)} \boldsymbol{W}_{hh}^{(1)} + \boldsymbol{b}_h^{(1)}),$$
where the weights $\boldsymbol{W}_{xh}^{(1)} \in \mathbb{R}^{d \times h}$, $\boldsymbol{W}_{hh}^{(1)} \in \mathbb{R}^{h \times h}$ and the bias $\boldsymbol{b}_h^{(1)} \in \mathbb{R}^{1 \times h}$ are the model parameters of the first hidden layer.
When $1 < \ell \leq L$, the hidden state of the $\ell$-th hidden layer is given by
$$\boldsymbol{H}_t^{(\ell)} = \phi(\boldsymbol{H}_t^{(\ell-1)} \boldsymbol{W}_{xh}^{(\ell)} + \boldsymbol{H}_{t-1}^{(\ell)} \boldsymbol{W}_{hh}^{(\ell)} + \boldsymbol{b}_h^{(\ell)}),$$
where the weights $\boldsymbol{W}_{xh}^{(\ell)} \in \mathbb{R}^{h \times h}$, $\boldsymbol{W}_{hh}^{(\ell)} \in \mathbb{R}^{h \times h}$ and the bias $\boldsymbol{b}_h^{(\ell)} \in \mathbb{R}^{1 \times h}$ are the model parameters of the $\ell$-th hidden layer.
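To make the recursion concrete, here is a minimal sketch of the hidden-layer computations for a single time step, written with plain tensor operations. The function name, the list-based parameter layout (`Ws_xh`, `Ws_hh`, `bs_h`), and the choice of `tanh` for $\phi$ are illustrative assumptions, not code from the book.

```python
import torch

def deep_rnn_hidden_step(X_t, Hs_prev, Ws_xh, Ws_hh, bs_h, phi=torch.tanh):
    """Compute the hidden states of all L layers at one time step t.

    X_t:     (n, d) minibatch input at time step t
    Hs_prev: list of L tensors of shape (n, h), hidden states at time step t-1
    Ws_xh:   list of L weights; Ws_xh[0] has shape (d, h), the rest (h, h)
    Ws_hh:   list of L weights, each of shape (h, h)
    bs_h:    list of L biases, each of shape (1, h)
    Returns a list of L tensors of shape (n, h): the hidden states at time step t.
    """
    Hs_t = []
    inp = X_t  # layer 1 reads the input X_t; layer l > 1 reads H_t^{(l-1)}
    for W_xh, W_hh, b_h, H_prev in zip(Ws_xh, Ws_hh, bs_h, Hs_prev):
        H = phi(inp @ W_xh + H_prev @ W_hh + b_h)
        Hs_t.append(H)
        inp = H  # becomes the "input" of the next layer at the same time step
    return Hs_t
```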
Finally, the output of the output layer is based only on the hidden state of the $L$-th hidden layer:
$$\boldsymbol{O}_t = \boldsymbol{H}_t^{(L)} \boldsymbol{W}_{hq} + \boldsymbol{b}_q,$$
where the weight $\boldsymbol{W}_{hq} \in \mathbb{R}^{h \times q}$ and the bias $\boldsymbol{b}_q \in \mathbb{R}^{1 \times q}$ are the model parameters of the output layer.
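Continuing the sketch above, the output at time step $t$ only needs the hidden state of the top layer; `W_hq` and `b_q` are again placeholder names for the output-layer parameters.

```python
def deep_rnn_output(Hs_t, W_hq, b_q):
    """Project the top (L-th) layer's hidden state to the output.

    Hs_t: list of L hidden states returned by deep_rnn_hidden_step
    W_hq: (h, q) output weight,  b_q: (1, q) output bias
    Returns O_t of shape (n, q).
    """
    return Hs_t[-1] @ W_hq + b_q
```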
As with multilayer perceptrons, the number of hidden layers $L$ and the number of hidden units $h$ are both hyperparameters. In addition, if we replace the hidden-state computation above with that of a gated recurrent unit or of long short-term memory, we obtain a deep gated recurrent neural network.
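As one possible illustration (assumed here, not shown in this section), PyTorch's built-in recurrent layers expose the depth directly through the `num_layers` argument, so a deep gated recurrent network is obtained simply by stacking gated layers:

```python
import torch
from torch import nn

n, d, h, L, num_steps = 2, 10, 256, 3, 35  # batch size, inputs, hidden units, layers, time steps

# A deep GRU: num_layers stacks L gated recurrent layers on top of each other.
deep_gru = nn.GRU(input_size=d, hidden_size=h, num_layers=L)

X = torch.randn(num_steps, n, d)   # inputs, shape (num_steps, batch, inputs)
H0 = torch.zeros(L, n, h)          # one initial hidden state per layer
outputs, Hn = deep_gru(X, H0)
print(outputs.shape)  # torch.Size([35, 2, 256]): top-layer hidden states at every time step
print(Hn.shape)       # torch.Size([3, 2, 256]): final hidden state of each of the L layers
```

Note that `nn.GRU` only produces the hidden states; the output layer with $\boldsymbol{W}_{hq}$ and $\boldsymbol{b}_q$ would still be applied on top, for example as a separate `nn.Linear(h, q)`.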
Summary
- In a deep recurrent neural network, the information in the hidden states is continually passed to the next time step of the current layer and to the next layer of the current time step.
Note: this section is essentially the same as the corresponding section of the original book (link to the original book).