
Where to apply dropout in recurrent neural networks for handwriting recognition?

2016-12-08 11:58
Bluche T., Kermorvant C., Louradour J. Where to apply dropout in recurrent neural networks for handwriting recognition? In: Proceedings of the 13th International Conference on Document Analysis and Recognition (ICDAR 2015), IEEE Computer Society, 2015, pp. 681-685.

Papers to read: Improving neural networks by preventing co-adaptation of feature detectors

Recurrent neural network regularization

Investigation of recurrent neural network architectures and learning methods for spoken language understanding

Long short-term memory

Dropout applied before the LSTM performs better than dropout between stacked LSTM layers, which in turn performs better than dropout after the LSTM (see the sketch after these notes).

Adding dropout at every layer is worse than adding it only at the single best position.

With dropout, the learned weight distributions become sharper.

Dropout after the LSTM layer pushes different units to extract more information and reduces the dependencies between neighboring units.

Dropout before the LSTM layer reduces the dependencies between neighboring pixels.
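To make the three placements concrete, here is a minimal PyTorch sketch (my own illustration, not the paper's code): one dropout module is applied either to the LSTM inputs, between two stacked LSTM layers, or to the LSTM outputs. All layer sizes, the class count, and the dropout rate are illustrative assumptions, not the paper's exact handwriting-recognition setup.

```python
import torch
import torch.nn as nn

class DropoutPlacementModel(nn.Module):
    """Toy sequence labeller showing the three dropout positions the
    paper compares: before, between, and after the LSTM layers.
    Hyperparameters are illustrative, not the paper's exact setup."""

    def __init__(self, n_features=64, hidden=128, n_classes=80,
                 p=0.5, place="before"):
        super().__init__()
        self.place = place
        self.dropout = nn.Dropout(p)
        self.lstm1 = nn.LSTM(n_features, hidden, batch_first=True,
                             bidirectional=True)
        self.lstm2 = nn.LSTM(2 * hidden, hidden, batch_first=True,
                             bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                # x: (batch, time, n_features)
        if self.place == "before":       # dropout on the LSTM inputs
            x = self.dropout(x)
        x, _ = self.lstm1(x)
        if self.place == "between":      # dropout between stacked LSTMs
            x = self.dropout(x)
        x, _ = self.lstm2(x)
        if self.place == "after":        # dropout on the LSTM outputs
            x = self.dropout(x)
        return self.classifier(x)        # per-frame class scores

# Smoke test on random data: 2 sequences of 50 feature frames.
model = DropoutPlacementModel(place="before")
logits = model(torch.randn(2, 50, 64))
print(logits.shape)                      # torch.Size([2, 50, 80])
```

Training one copy of this model per `place` value and comparing validation error roughly mirrors the comparison the paper runs; nn.Dropout is automatically disabled in eval() mode, so no extra handling is needed at test time.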