Networks with Linear Activation Function
2018-01-03 11:11
```python
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
```
For a Linearly Separable Problem
```python
# input
X = np.array([[1,0,0],
              [1,0,1],
              [1,1,0],
              [1,1,1]])
# labels
Y = np.array([-1,1,1,1])
# weight vector
W = np.random.random(3)
# learning rate
lr = 0.01

# single neuron with a linear activation, trained with the delta rule
def update():
    global X,Y,W,lr
    A = np.dot(X,W)
    W += lr*np.dot(Y-A,X)

# run
for _ in range(1000):
    update()

# positive samples
x1 = [0,1,1]
y1 = [1,0,1]
# negative samples
x2 = [0]
y2 = [0]

# decision boundary: W[0] + W[1]*x + W[2]*y = 0
xdata = np.linspace(-0.5,1.5)
slope = -W[1]/W[2]
intercept = -W[0]/W[2]

plt.figure()
plt.plot(xdata, slope*xdata+intercept, 'k-')
plt.plot(x1,y1,'yo')
plt.plot(x2,y2,'go')
plt.show()
```
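As a quick sanity check (not in the original post), we can verify that the learned weights classify all four points correctly by thresholding the linear output with `np.sign`. This is a minimal sketch reproducing the training loop above, with a fixed seed added for reproducibility:

```python
import numpy as np

# Same data, delta-rule update, and hyperparameters as in the post;
# the seed is an added assumption so the run is deterministic.
np.random.seed(0)
X = np.array([[1, 0, 0],
              [1, 0, 1],
              [1, 1, 0],
              [1, 1, 1]])
Y = np.array([-1, 1, 1, 1])
W = np.random.random(3)
lr = 0.01

for _ in range(1000):
    A = X @ W                # linear activation
    W += lr * (Y - A) @ X    # delta rule (gradient step on squared error)

# Threshold the linear outputs to get class predictions.
print(np.sign(X @ W))  # → [-1.  1.  1.  1.]
```

With these targets the weights converge toward roughly `[-0.5, 1, 1]`, i.e. the boundary `x + y = 0.5`, which separates `(0,0)` from the other three corners.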
For a Linearly Inseparable Problem (XOR)
```python
# input: introduce non-linear features such as x1^2, x1*x2, x2^2
# columns are [1, x1, x2, x1^2, x1*x2, x2^2]
X = np.array([[1,0,0,0,0,0],
              [1,0,1,0,0,1],
              [1,1,0,1,0,0],
              [1,1,1,1,1,1]])
# labels
Y = np.array([-1,1,1,-1])
# weight vector
W = np.random.random(6)
# learning rate
lr = 0.01

# single neuron with a linear activation, trained with the delta rule
def update():
    global X,Y,W,lr
    A = np.dot(X,W)
    W += lr*np.dot(Y-A,X)

# run
for _ in range(1000):
    update()

# positive samples
x1 = [0,1]
y1 = [1,0]
# negative samples
x2 = [0,1]
y2 = [0,1]

xdata = np.linspace(-0.5,1.5)

# the boundary W0 + W1*x + W2*y + W3*x^2 + W4*x*y + W5*y^2 = 0
# is quadratic in y, so solve for y with the quadratic formula
def get_root(W,x):
    a = W[5]
    b = W[2]+W[4]*x
    c = W[0]+W[1]*x+W[3]*x*x
    d = np.sqrt(b*b-4*a*c)
    return ((-b+d)/(2*a), (-b-d)/(2*a))

plt.figure()
plt.plot(xdata, get_root(W,xdata)[0], 'k-')
plt.plot(xdata, get_root(W,xdata)[1], 'k-')
plt.plot(x1,y1,'yo')
plt.plot(x2,y2,'go')
plt.show()
```
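A side note (not in the original post): the quadratic feature map is what makes XOR solvable here. Even a closed-form least-squares fit in the lifted 6-dimensional space separates the four points exactly, with no iterative training at all. A minimal sketch, swapping the delta rule for `np.linalg.lstsq`:

```python
import numpy as np

# XOR inputs lifted with quadratic features: [1, x1, x2, x1^2, x1*x2, x2^2]
X = np.array([[1, 0, 0, 0, 0, 0],
              [1, 0, 1, 0, 0, 1],
              [1, 1, 0, 1, 0, 0],
              [1, 1, 1, 1, 1, 1]], dtype=float)
Y = np.array([-1, 1, 1, -1], dtype=float)

# Closed-form minimum-norm least-squares solution; the system is
# underdetermined (4 equations, 6 unknowns) but has full row rank,
# so the four points are fit exactly.
W_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)

print(np.sign(X @ W_ls))  # → [-1.  1.  1. -1.]
```

Because an exact solution exists in the lifted space, the delta-rule iteration above is converging toward a zero-error fit of the same four points.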