### *Machine Learning* by Andrew NG
2014-11-28 17:01
#@author: gr
#@date: 2014-10-17
#@email: forgerui@gmail.com
Fundamentals
1. Matrix Trace and Rank
Rank of a matrix: the maximal number of linearly independent rows (or columns) of A; equivalently, the number of nonzero rows after row reduction. Trace of a matrix: the sum of the elements on the main diagonal.
# Trace identity: tr(AB) = tr(BA)
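The definitions and the trace identity above can be checked numerically. A minimal sketch (my own illustration with made-up matrices, not from the notes), assuming NumPy is available:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1: second row = 2 * first row
B = np.array([[0.0, 1.0], [1.0, 0.0]])

rank_A = np.linalg.matrix_rank(A)   # maximal number of independent rows -> 1
trace_A = np.trace(A)               # sum of diagonal entries: 1 + 4 = 5.0

# tr(AB) = tr(BA) holds even though AB != BA in general
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```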
2. Non-parametric Methods
Non-parametric methods are a branch of mathematical statistics. In a *parametric* inference problem, the population distribution is given or assumed to have a specific form containing a finite number of unknown parameters, and the task is to estimate those parameters, or test hypotheses about them, from a sample drawn from the population. *Non-parametric* methods, by contrast, make such inferences without assuming a specific form for the population distribution.
3. Least Squares
The method of least squares is a mathematical optimization technique: it finds the best-fitting function for the data by minimizing the sum of squared errors. Least squares gives a simple way to estimate unknown quantities such that the squared error between the estimates and the actual data is minimized. It is also used for curve fitting, and some other optimization problems (for example, minimizing an energy or maximizing an entropy) can be expressed in least-squares form.
4. Central Limit Theorem
The central limit theorem is a family of results in probability theory about the distribution of partial sums of a sequence of random variables converging to the normal distribution. These theorems are the theoretical foundation of mathematical statistics and error analysis: they give conditions under which the cumulative distribution function of the sum of many random variables converges pointwise to the cumulative distribution function of a normal distribution.
5. Independent and Identically Distributed (i.i.d.)
In probability and statistics, a sequence of random variables is independent and identically distributed (i.i.d.) if all the variables have the same probability distribution and are mutually independent.
Content
1. Motivation and Applications of Machine Learning
Various applications.
2. Gradient Descent
- Batch Gradient Descent
- Stochastic Gradient Descent
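The two variants above differ in how much data each parameter update sees. A minimal sketch (my own illustration with made-up data, not from the lecture), assuming NumPy and a least-squares linear model h(x) = w·x + b:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=50)
y = 3.0 * X + 1.0 + rng.normal(0, 0.1, size=50)   # true w = 3, b = 1

def batch_gd(X, y, lr=0.1, steps=500):
    w = b = 0.0
    for _ in range(steps):
        err = w * X + b - y            # residuals over the WHOLE training set
        w -= lr * np.mean(err * X)     # gradient of the mean squared error / 2
        b -= lr * np.mean(err)
    return w, b

def stochastic_gd(X, y, lr=0.02, epochs=50):
    w = b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):   # update on ONE example at a time
            err = w * X[i] + b - y[i]
            w -= lr * err * X[i]
            b -= lr * err
    return w, b

w1, b1 = batch_gd(X, y)
w2, b2 = stochastic_gd(X, y)
```

Both runs should land near (w, b) = (3, 1); stochastic updates are noisier per step but start making progress after a single example, which is why they scale to large datasets.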
3. Underfitting and Overfitting
Linear regression
    |--> Locally weighted regression
    |--> Probabilistic interpretation
    |--> Logistic regression
    |--> Perceptron
    |--> Newton's method
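Locally weighted regression from the outline above addresses underfitting by fitting a new weighted line at every query point instead of one global line. A sketch (my own toy example, not from the notes), assuming NumPy and a Gaussian weighting kernel with bandwidth tau:

```python
import numpy as np

def lwr_predict(x_query, X, y, tau=0.3):
    # training points near x_query get weight exp(-(x - x_q)^2 / (2 tau^2))
    w = np.exp(-(X - x_query) ** 2 / (2 * tau ** 2))
    A = np.stack([np.ones_like(X), X], axis=1)   # design matrix [1, x]
    W = np.diag(w)
    # solve the weighted normal equations (A^T W A) theta = A^T W y
    theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return theta[0] + theta[1] * x_query

X = np.linspace(0, 2 * np.pi, 40)
y = np.sin(X)                      # nonlinear target that a global line underfits
pred = lwr_predict(np.pi / 2, X, y)   # local fit near the peak of sin
```

A single global least-squares line through this data would be nearly flat, while the local fit tracks sin(π/2) = 1 closely.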
4. Newton's Method
Logistic regression
    |--> Newton's method
    |--> Exponential Family
    |--> Generalized Linear Models (GLMs)
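Newton's method in the lecture maximizes the logistic log-likelihood by finding a zero of its derivative; the underlying update is the same one used for root finding. A tiny sketch (my own example, not from the notes) of that update, theta := theta − f(theta)/f'(theta):

```python
def newton(f, fprime, theta, steps=20):
    """Iterate the Newton update until (in practice) machine precision."""
    for _ in range(steps):
        theta = theta - f(theta) / fprime(theta)
    return theta

# Example: solve f(x) = x^2 - 2 = 0, i.e. compute sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, theta=1.0)
# converges quadratically: each step roughly doubles the correct digits
```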
5. Generative Learning Algorithms
Generative learning algorithms
    |--> GDA (Gaussian Discriminant Analysis)
    |--> Gaussian Distribution
    |--> Generative & Discriminative comparison
    |--> Naive Bayes
    |--> Laplace Smoothing
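GDA from the outline above models each class-conditional density p(x|y) as a Gaussian and classifies with Bayes' rule. A 1-D sketch (my own toy data, not from the course; the shared-variance choice mirrors the usual GDA setup), assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
x0 = rng.normal(-2.0, 1.0, 200)           # samples from class 0
x1 = rng.normal(+2.0, 1.0, 200)           # samples from class 1
x = np.concatenate([x0, x1])
y = np.concatenate([np.zeros(200), np.ones(200)])

# Maximum-likelihood parameter estimates
phi = y.mean()                             # prior p(y = 1)
mu0, mu1 = x[y == 0].mean(), x[y == 1].mean()
var = (((x[y == 0] - mu0) ** 2).sum()
       + ((x[y == 1] - mu1) ** 2).sum()) / len(x)   # pooled shared variance

def predict(xq):
    # log p(x|y=k) + log p(y=k), dropping terms common to both classes
    s0 = -(xq - mu0) ** 2 / (2 * var) + np.log(1 - phi)
    s1 = -(xq - mu1) ** 2 / (2 * var) + np.log(phi)
    return int(s1 > s0)
```

With equal priors and a shared variance the decision boundary falls midway between the two class means, near x = 0.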
6. Naive Bayes
Naive Bayes
    |--> Event Models
    |--> Neural Networks
    |--> Support Vector Machines
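Naive Bayes with Laplace smoothing, covered across the two sections above, can be sketched in a few lines. A minimal multinomial-event-model example (my own made-up two-class corpus, not from the lecture), standard library only:

```python
from collections import Counter
import math

spam = ["buy cheap pills", "cheap pills now", "buy now"]
ham  = ["meeting at noon", "lunch at noon", "project meeting"]

vocab = set(w for doc in spam + ham for w in doc.split())

def word_log_probs(docs):
    counts = Counter(w for doc in docs for w in doc.split())
    total = sum(counts.values())
    # Laplace (add-one) smoothing: unseen words get probability
    # 1 / (total + |vocab|) instead of 0, which would zero out the product
    return {w: math.log((counts[w] + 1) / (total + len(vocab))) for w in vocab}

log_p_spam, log_p_ham = word_log_probs(spam), word_log_probs(ham)

def classify(doc):
    words = [w for w in doc.split() if w in vocab]
    s = sum(log_p_spam[w] for w in words) + math.log(0.5)  # equal class priors
    h = sum(log_p_ham[w] for w in words) + math.log(0.5)
    return "spam" if s > h else "ham"
```

Summing log-probabilities instead of multiplying probabilities is the standard trick to avoid floating-point underflow on long documents.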