[Study Notes] WEEK3_Shallow Neural Network_Random Initialization
2018-04-08 16:03
1. The bias vector b may be initialized to all zeros, but the weight matrix w must not be.
If the weight matrix w is initialized to all zeros, then no matter how long you train, every unit in the same hidden layer keeps computing exactly the same function (and receives exactly the same gradient update), so having multiple hidden units is pointless.
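The symmetry problem above can be seen directly. A minimal NumPy sketch (layer sizes are made up for illustration): with an all-zero w, every hidden unit produces the same activation, so by symmetry every unit also gets the same gradient and they stay identical forever.

```python
import numpy as np

# Hypothetical tiny layer: 3 inputs, 4 hidden units, all-zero weights.
n_x, n_h = 3, 4
W1 = np.zeros((n_h, n_x))
b1 = np.zeros((n_h, 1))

x = np.array([[1.0], [2.0], [3.0]])
a1 = np.tanh(W1 @ x + b1)  # every hidden unit computes the same value

# All rows of W1 are identical, so all hidden activations are identical;
# the backward pass then gives every row the same gradient, and the rows
# remain identical after every gradient-descent step.
print(np.allclose(a1, a1[0]))  # True
```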
2. Random initialization
The reason for multiplying by 0.01 is to keep w small, which keeps z small, which places z in the region where the sigmoid or tanh activation has a large gradient (slope), so the network learns faster.
If the network uses neither sigmoid nor tanh activations, keeping the weights this small matters much less (random initialization itself is still needed to break symmetry).
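The initialization described above can be sketched as follows (layer sizes are hypothetical): small random values for each weight matrix break symmetry, the 0.01 factor keeps z near 0 where tanh/sigmoid slopes are largest, and the biases can safely start at zero.

```python
import numpy as np

np.random.seed(0)  # fixed seed only for reproducibility of this sketch
n_x, n_h, n_y = 3, 4, 1  # input features, hidden units, outputs (made up)

# Small random weights break symmetry; scaling by 0.01 keeps z close to 0,
# where sigmoid/tanh have their steepest slope, so learning is fast.
W1 = np.random.randn(n_h, n_x) * 0.01
b1 = np.zeros((n_h, 1))  # biases can start at zero: symmetry is already broken by W1
W2 = np.random.randn(n_y, n_h) * 0.01
b2 = np.zeros((n_y, 1))
```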
3. (No in-video quiz)