[Study Notes] WEEK3_Shallow Neural Network_Computing a Neural Network's Output
2018-04-03 16:32
1. Each neuron performs two steps of computation:
1) z = w^T x + b
2) a = sigmoid(z)
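The two steps above can be sketched in NumPy. This is a minimal illustration; the weight vector `w`, input `x`, and bias `b` are made-up example values, not from the lecture:

```python
import numpy as np

def sigmoid(z):
    # logistic activation: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# one neuron with 3 inputs (values are illustrative)
w = np.array([[0.1], [0.2], [-0.3]])  # weights, shape (3, 1)
x = np.array([[1.0], [2.0], [3.0]])   # one input example, shape (3, 1)
b = 0.5                               # bias (scalar)

z = np.dot(w.T, x) + b  # step 1: linear combination, shape (1, 1)
a = sigmoid(z)          # step 2: nonlinear activation
```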
2. Vectorization
Z^[1] = W^[1] X + b^[1]  # np.transpose(), np.dot(); each row of W^[1] is one neuron's w^T
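A sketch of this vectorized layer computation, under assumed sizes (3 input features, 4 hidden units, 5 examples; weights are random placeholders, not course values):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_x, n_h, m = 3, 4, 5  # input size, hidden units, examples (illustrative)
rng = np.random.default_rng(0)

X  = rng.standard_normal((n_x, m))    # examples stacked column-wise: (n_x, m)
W1 = rng.standard_normal((n_h, n_x))  # each row is one neuron's w^T: (n_h, n_x)
b1 = np.zeros((n_h, 1))               # bias, broadcast across the m columns

Z1 = np.dot(W1, X) + b1  # (n_h, m): one column of z-values per example
A1 = sigmoid(Z1)         # element-wise activation, same shape
```

One matrix product replaces the per-neuron, per-example loops.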
3. The computation flow of a 2-layer neural network and the matrix dimensions at each layer
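The full 2-layer forward pass can be sketched as follows, with the shape of every matrix noted in the comments. The layer sizes (n_x = 3, n_h = 4, n_y = 1, m = 5) are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# illustrative sizes: n_x inputs, n_h hidden units, n_y output units, m examples
n_x, n_h, n_y, m = 3, 4, 1, 5
rng = np.random.default_rng(1)

X  = rng.standard_normal((n_x, m))                          # (n_x, m)
W1 = rng.standard_normal((n_h, n_x)); b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((n_y, n_h)); b2 = np.zeros((n_y, 1))

# layer 1: (n_h, n_x) @ (n_x, m) + (n_h, 1) -> (n_h, m)
Z1 = np.dot(W1, X) + b1
A1 = sigmoid(Z1)
# layer 2: (n_y, n_h) @ (n_h, m) + (n_y, 1) -> (n_y, m)
Z2 = np.dot(W2, A1) + b2
A2 = sigmoid(Z2)
```

The pattern generalizes: W^[l] has shape (units in layer l, units in layer l-1), and the bias b^[l] broadcasts across the m example columns.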
4. (No in-lecture quiz)