Artificial Neural Network Notes
2015-01-22 12:12
1. Selection of activation function:
Why a sigmoid function rather than a threshold function?
The sigmoid function is continuous and differentiable everywhere, whereas the threshold (step) function is discontinuous at the origin and has zero derivative elsewhere, so it provides no gradient for learning.
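The contrast above can be sketched in a few lines of Python (the function names are illustrative, not from any particular library):

```python
import math

def sigmoid(x):
    # Smooth, continuous, and differentiable everywhere.
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # Closed-form derivative sigma'(x) = sigma(x) * (1 - sigma(x)),
    # which is exactly what backpropagation needs for gradient updates.
    s = sigmoid(x)
    return s * (1.0 - s)

def threshold(x):
    # Step function: discontinuous at 0, derivative zero everywhere else,
    # so gradient-based training receives no useful signal.
    return 1.0 if x >= 0 else 0.0

print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.0))  # 0.25 (the maximum slope)
print(threshold(0.0), threshold(-0.001))  # 1.0 0.0
```

Because the sigmoid's derivative is available in closed form from its own output, a network can propagate error gradients through every layer, which the step function cannot support.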