Andrew Ng's Machine Learning on Coursera (II)
2015-08-24 21:09
Logistic Regression
1. Classification / Hypothesis Representation
In the previous class we talked about linear regression; today we move into a new field: classification. Suppose we have tumor sizes and need to predict whether each tumor is malignant or benign.
We fit a linear regression line and define a threshold of 0.5: if the hypothesis value is greater than 0.5 we predict malignant, otherwise benign.
But what if the training set contains an extreme example far to the right? The fitted line shifts, and the 0.5 threshold no longer separates the classes correctly. Apparently, linear regression is not a good classifier, so we need a new model: logistic regression.
Its hypothesis uses the so-called sigmoid function (also called the logistic function):
h_θ(x) = g(θᵀx), where g(z) = 1 / (1 + e^(−z))
The graph of g is an S-shaped curve with 0 ≤ g(z) ≤ 1; when z ≥ 0, g(z) ≥ 0.5, and when z < 0, g(z) < 0.5.
From the picture above, h_θ(x) is interpreted as the probability that y = 1 given input x, parameterized by θ: h_θ(x) = P(y = 1 | x; θ). Since y is always 0 or 1, the probabilities satisfy P(y = 0 | x; θ) + P(y = 1 | x; θ) = 1.
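As a quick sketch, the hypothesis can be written in a few lines of NumPy (the function names here are my own, not from the course):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z)); squashes any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    """h_theta(x) = g(theta^T x): the estimated probability that y = 1."""
    return sigmoid(np.dot(theta, x))

print(sigmoid(0.0))        # 0.5: the threshold point z = 0
print(sigmoid(4.0) > 0.5)  # True: z >= 0 gives g(z) >= 0.5
```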
2. Decision Boundary
The decision boundary is the line (or surface) that separates the region where the hypothesis predicts y = 1 from the region where it predicts y = 0. For example, when h_θ(x) = g(θ0 + θ1·x1 + θ2·x2)
and the parameter θ = [-3, 1, 1]ᵀ, we
predict y = 1 if −3 + x1 + x2 ≥ 0 (i.e. x1 + x2 ≥ 3)
predict y = 0 if −3 + x1 + x2 < 0
Apart from linear boundaries, we can also get non-linear boundaries by adding polynomial features. For example, h_θ(x) = g(θ0 + θ1·x1 + θ2·x2 + θ3·x1² + θ4·x2²) with θ = [−1, 0, 0, 1, 1]ᵀ predicts y = 1 exactly when x1² + x2² ≥ 1, a circular boundary.
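To make the linear boundary concrete, here is a small sketch of the θ = [-3, 1, 1]ᵀ example (variable names are mine; x0 = 1 is the usual intercept feature):

```python
import numpy as np

def predict(theta, x):
    """Predict y = 1 exactly when theta^T x >= 0, i.e. when h_theta(x) >= 0.5."""
    return int(np.dot(theta, x) >= 0)

theta = np.array([-3.0, 1.0, 1.0])  # the example parameters from above

print(predict(theta, np.array([1.0, 2.0, 2.0])))  # x1 + x2 = 4 >= 3, so 1
print(predict(theta, np.array([1.0, 1.0, 1.0])))  # x1 + x2 = 2 <  3, so 0
```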
3. Cost Function
For a logistic regression model we cannot reuse the squared-error cost, because with the sigmoid inside it becomes non-convex. Instead, the cost for a single example is:
Cost(h_θ(x), y) = −log(h_θ(x)) if y = 1
Cost(h_θ(x), y) = −log(1 − h_θ(x)) if y = 0
Since y = 0 or y = 1, we can rewrite the cost function over all m examples as one expression:
J(θ) = −(1/m) · Σ [ y(i)·log h_θ(x(i)) + (1 − y(i))·log(1 − h_θ(x(i))) ]
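A minimal NumPy sketch of this cost (the tiny dataset below is made up for illustration; the first column of X is the intercept feature x0 = 1):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """J(theta) = -(1/m) * sum(y*log(h) + (1 - y)*log(1 - h))."""
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

X = np.array([[1.0, 0.5], [1.0, 2.3], [1.0, 2.9]])  # hypothetical examples
y = np.array([0.0, 1.0, 1.0])

# With theta = 0, h = 0.5 for every example, so the cost is exactly log(2)
print(cost(np.zeros(2), X, y))  # ≈ 0.6931
```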
We use gradient descent to minimize the cost function, repeating until convergence (simultaneously for every j):
θj := θj − α · (1/m) · Σ (h_θ(x(i)) − y(i)) · xj(i)
Do you notice that this is the same update rule we derived for linear regression in week 2? The difference is hidden in the hypothesis: h_θ(x) is now g(θᵀx), the sigmoid, rather than θᵀx itself.
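The update rule above can be sketched as a batch gradient descent loop (the data and hyper-parameters below are made-up illustrations, not from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.1, iters=5000):
    """Repeat theta_j := theta_j - alpha * (1/m) * sum((h - y) * x_j) for all j."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        h = sigmoid(X @ theta)                  # vector of h_theta(x^(i))
        theta -= (alpha / m) * (X.T @ (h - y))  # simultaneous update of every theta_j
    return theta

X = np.array([[1.0, 0.5], [1.0, 1.0], [1.0, 3.0], [1.0, 3.5]])  # intercept + one feature
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = gradient_descent(X, y)
print((sigmoid(X @ theta) >= 0.5).astype(float))  # matches y on this easy separable set
```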
4. Advanced Optimization
Besides gradient descent, there are more advanced algorithms such as conjugate gradient, BFGS, and L-BFGS. They typically converge faster and need no hand-picked learning rate α, but they are more complex, so in practice we call an existing library implementation rather than writing them ourselves.
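In Python these are available off the shelf; for example, SciPy's `minimize` can run BFGS for us if we supply the cost and its gradient (the small dataset here is made up, and intentionally non-separable so the optimum is finite):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    h = sigmoid(X @ theta)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def grad(theta, X, y):
    """Gradient of the cost: (1/m) * X^T (h - y)."""
    return X.T @ (sigmoid(X @ theta) - y) / len(y)

X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 1.0, 0.0, 1.0])

# BFGS picks its own step sizes; there is no learning rate alpha to tune
res = minimize(cost, np.zeros(2), args=(X, y), jac=grad, method="BFGS")
print(res.x, cost(res.x, X, y) < np.log(2))  # the found theta beats the all-zero start
```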