
dropout、dropconnect、maxout、batch normalization

2016-06-26 10:04
The success of CNNs is partly due to the availability of large datasets and high-performance computing systems, and partly due to recent technical advances in learning methods and regularization techniques such as dropout [27], dropconnect [31], maxout [5], and batch normalization [8].
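As a rough illustration of what each of the four cited techniques does, here is a minimal NumPy sketch of their training-time forward passes. All function names, shapes, and hyperparameter defaults are my own for illustration, not taken from the cited papers, and the sketch omits backpropagation and the running statistics that batch normalization uses at inference time.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, train=True):
    """Inverted dropout [27]: zero each activation with probability p
    and rescale the survivors so the expected value is unchanged."""
    if not train:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def dropconnect(x, W, p=0.5, train=True):
    """DropConnect [31]: drop individual *weights* instead of
    activations, again rescaling to preserve the expectation."""
    if not train:
        return x @ W
    mask = rng.random(W.shape) >= p
    return x @ (W * mask / (1.0 - p))

def maxout(x, W, b):
    """Maxout [5]: compute k linear pieces per output unit and take
    the elementwise max. W: (d_in, d_out, k), b: (d_out, k)."""
    z = np.einsum('nd,dok->nok', x, W) + b
    return z.max(axis=-1)

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization [8], training mode: normalize each feature
    over the mini-batch, then apply a learned scale and shift."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```

Note that dropout and dropconnect are stochastic regularizers applied only during training, maxout is an activation function that is always active, and batch normalization changes behavior between training (batch statistics, as above) and inference (running averages).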

[27] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov. Dropout: A simple way to prevent neural networks from overfitting. The Journal of Machine Learning Research, 15(1):1929–1958, 2014.

[31] L. Wan, M. Zeiler, S. Zhang, Y. L. Cun, and R. Fergus. Regularization of neural networks using DropConnect. In Proceedings of the 30th International Conference on Machine Learning (ICML-13), pages 1058–1066, 2013.

[5] I. J. Goodfellow, D. Warde-Farley, M. Mirza, A. Courville, and Y. Bengio. Maxout networks. arXiv preprint arXiv:1302.4389, 2013.

[8] S. Ioffe and C. Szegedy. Batch normalization: Accelerating deep network training by reducing internal covariate shift. arXiv preprint arXiv:1502.03167, 2015.