
Deep Learning: Assuming a deep neural network is properly regularized, can adding more layers actually make the performance degrade?

2015-08-04 19:55

I find this really puzzling. A deeper NN is supposed to be at least as powerful as a shallower one. I have already used dropout to prevent overfitting. How can the performance degrade?

Yoshua's Answer
Yoshua Bengio, My lab has been one of the three that started the deep learning approach, bac...

If you do not change the size of the layers and just add more layers, capacity should increase, so you could be overfitting. However, you should check whether training error increases or decreases. If it increases (which is also very plausible), it means that adding the layer made the optimization harder, with the optimization methods and initialization that you are using. That could also explain your problem. However, if training error decreases and test error increases, you are overfitting.
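The diagnostic in the answer, comparing how training and test error move after adding layers, can be sketched as a small helper. The function name, argument names, and return labels are illustrative assumptions, not anything from the original answer:

```python
def diagnose_added_layers(train_err_before, train_err_after,
                          test_err_before, test_err_after):
    """Classify what happened after adding layers, following the
    train-vs-test-error reasoning above. Labels are illustrative."""
    if train_err_after > train_err_before:
        # Training error went UP: the deeper net became harder to
        # optimize with the current method and initialization.
        return "optimization difficulty"
    if test_err_after > test_err_before:
        # Training error fell but test error rose: classic overfitting
        # from the extra capacity.
        return "overfitting"
    # Both errors improved (or held): the extra depth helped.
    return "improved"
```

For example, measuring the two networks on the same data splits and passing the four error rates in tells you which of the two failure modes (if either) you are in, which is exactly the check the answer recommends before reaching for more regularization.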