Factor Analysis
Reposted from: http://www.cnblogs.com/jerrylead/archive/2011/05/11/2043317.html
1 Problem

In the training data considered so far, the number of examples $m$ was far larger than the number of features $n$, so regression, clustering, and so on posed no great difficulty. But when $m$ is small, even $m \ll n$, regression by gradient descent yields parameter estimates that vary widely with the initial values (there are fewer equations than parameters). Fitting a multivariate Gaussian distribution to the data is also problematic. Let us work through the calculation and see what goes wrong.

The parameter estimates for a multivariate Gaussian are

$$\mu = \frac{1}{m}\sum_{i=1}^{m} x^{(i)}$$

$$\Sigma = \frac{1}{m}\sum_{i=1}^{m}\left(x^{(i)}-\mu\right)\left(x^{(i)}-\mu\right)^T$$

the formulas for the mean and the covariance, respectively. Here $x^{(i)}$ denotes an example; there are $m$ of them, each with $n$ features, so each $x^{(i)}$ is an $n$-dimensional vector and $\Sigma$ is an $n \times n$ covariance matrix.

When $m \ll n$, we find that $\Sigma$ is singular ($|\Sigma| = 0$): each term $(x^{(i)}-\mu)(x^{(i)}-\mu)^T$ has rank one, so the sum has rank at most $m < n$. That is, $\Sigma^{-1}$ does not exist, and there is no way to fit the multivariate Gaussian; more precisely, we cannot estimate a usable $\Sigma$.

What if we still want to model the samples with a multivariate Gaussian?
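The singularity is easy to check numerically. A minimal sketch with NumPy, using made-up sizes $m = 5$, $n = 20$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 20                      # far fewer examples than features
X = rng.normal(size=(m, n))

mu = X.mean(axis=0)
# maximum-likelihood covariance: (1/m) * sum of outer products
Sigma = (X - mu).T @ (X - mu) / m

# each centered example contributes rank 1, and centering removes one
# more degree of freedom, so rank(Sigma) <= m - 1 < n
print(np.linalg.matrix_rank(Sigma))   # 4
print(Sigma.shape)                    # (20, 20), but numerically singular
```

Any attempt to invert this `Sigma` (or to take `|Sigma|^{1/2}` in the Gaussian density) fails, which is exactly the problem described above.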
2 Constraining the covariance matrix

When there is not enough data to estimate $\Sigma$, we have to place assumptions on the model parameters. Before, we wanted to estimate the full $\Sigma$ (every entry of the matrix); now suppose $\Sigma$ is diagonal (the features are mutually independent). Then we only need to compute the variance of each feature, and the resulting $\Sigma$ has nonzero entries only on the diagonal:

$$\Sigma_{jj} = \frac{1}{m}\sum_{i=1}^{m}\left(x_j^{(i)}-\mu_j\right)^2$$

Recall the geometry of the two-dimensional multivariate Gaussian discussed earlier: its projection onto the plane is an ellipse, centered at $\mu$, with shape determined by $\Sigma$.

![](https://oscdn.geek-share.com/Uploads/Images/Content/201909/24/f2b524b4780d156ef45394b525be55ae.png)

If $\Sigma$ becomes diagonal, both axes of the ellipse are parallel to the coordinate axes.

![](https://oscdn.geek-share.com/Uploads/Images/Content/201909/24/1d3bc8be0dfdd2719ca32645091ddd8c.jpg)

To restrict $\Sigma$ further, we can assume the diagonal entries are all equal:

$$\Sigma = \sigma^2 I$$

where

$$\sigma^2 = \frac{1}{mn}\sum_{j=1}^{n}\sum_{i=1}^{m}\left(x_j^{(i)}-\mu_j\right)^2$$

that is, $\sigma^2$ is the mean of the diagonal entries from the previous step; in the two-dimensional Gaussian picture, the ellipse becomes a circle.

To estimate the full $\Sigma$, we need $m \geq n+1$ to guarantee that the maximum-likelihood estimate of $\Sigma$ is nonsingular. Under either of the constraints above, however, $m \geq 2$ suffices to estimate the constrained $\Sigma$.

The drawback is obvious: assuming the features are independent is too strong an assumption. Next we present a method called factor analysis, which uses more parameters to capture relationships among the features, yet does not require estimating a full $\Sigma$.
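The three levels of constraint (full, diagonal, spherical) can be sketched side by side; a minimal example with NumPy and made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))          # m=50 examples, n=3 features
mu = X.mean(axis=0)
Xc = X - mu

# full ML covariance: needs m >= n+1 to be nonsingular in general
Sigma_full = Xc.T @ Xc / len(X)

# diagonal constraint: per-feature variances only (features independent)
Sigma_diag = np.diag(Xc.var(axis=0))

# spherical constraint: one shared variance, the mean of the diagonal
sigma2 = Xc.var(axis=0).mean()
Sigma_sph = sigma2 * np.eye(X.shape[1])
```

Each constrained estimate only reads off (part of) the diagonal of the full estimate, which is why so few examples suffice.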
3 Marginal and conditional Gaussian distributions

Before discussing factor analysis, let us see how to obtain the conditional and marginal distributions of a multivariate Gaussian; this will be needed in the EM derivation for factor analysis later.

Suppose $x$ consists of two random vectors (think of the earlier $x^{(i)}$ split into two parts):

$$x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$$

where $x_1 \in \mathbb{R}^r$ and $x_2 \in \mathbb{R}^s$, so $x \in \mathbb{R}^{r+s}$. Suppose $x$ follows a multivariate Gaussian, $x \sim \mathcal{N}(\mu, \Sigma)$, with

$$\mu = \begin{bmatrix} \mu_1 \\ \mu_2 \end{bmatrix},\qquad \Sigma = \begin{bmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{bmatrix}$$

where $\mu_1 \in \mathbb{R}^r$ and $\mu_2 \in \mathbb{R}^s$, so that $\Sigma_{11} \in \mathbb{R}^{r\times r}$, $\Sigma_{12} \in \mathbb{R}^{r\times s}$, and so on. Since covariance matrices are symmetric, $\Sigma_{12} = \Sigma_{21}^T$.

Taken together, $x_1$ and $x_2$ are jointly multivariate Gaussian.

Given only the joint distribution, how do we obtain the marginal distribution of $x_1$? From the $\mu$ and $\Sigma$ above we can read off $E[x_1] = \mu_1$ and $\mathrm{Cov}(x_1) = \Sigma_{11}$. Let us verify the second result by expanding the full covariance:

$$\mathrm{Cov}(x) = \Sigma = \begin{bmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{bmatrix} = E\!\left[\begin{bmatrix} x_1-\mu_1 \\ x_2-\mu_2 \end{bmatrix}\begin{bmatrix} x_1-\mu_1 \\ x_2-\mu_2 \end{bmatrix}^T\right]$$

whose upper-left block is $E[(x_1-\mu_1)(x_1-\mu_1)^T] = \mathrm{Cov}(x_1) = \Sigma_{11}$. So the marginal distribution of a multivariate Gaussian is again multivariate Gaussian; that is, $x_1 \sim \mathcal{N}(\mu_1, \Sigma_{11})$.

An interesting block of $\mathrm{Cov}(x)$ is $\Sigma_{12}$, which differs from the covariances computed so far. Previously, a covariance matrix described a single random variable (a multidimensional vector), whereas $\Sigma_{12}$ measures the relationship between two random vectors. For example, if $x_1$ = {height, weight} and $x_2$ = {gender, income}, then $\Sigma_{11}$ contains the covariances height–height, height–weight, and weight–weight, while $\Sigma_{12}$ contains height–gender, height–income, weight–gender, and weight–income — quite different from before, a rather strange-looking construction.

That was the marginal distribution; now consider the conditional distribution, that is, $x_1 \mid x_2$. By the properties of the multivariate Gaussian,

$$x_1 \mid x_2 \sim \mathcal{N}\!\left(\mu_{1|2}, \Sigma_{1|2}\right)$$

with

$$\mu_{1|2} = \mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(x_2-\mu_2)$$

$$\Sigma_{1|2} = \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}$$

These are the formulas we will need in the calculations to come; they are stated here without derivation. For the full derivation, see Chuong B. Do's notes "Gaussian processes".
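The two conditional formulas can be exercised numerically; a sketch with NumPy, where the partition sizes ($r = s = 1$) and all parameter values are made up:

```python
import numpy as np

# a joint Gaussian over (x1, x2) with r = 1, s = 1
mu1, mu2 = np.array([1.0]), np.array([-2.0])
S11 = np.array([[2.0]]); S12 = np.array([[0.8]])
S21 = S12.T;             S22 = np.array([[1.0]])

def conditional(x2):
    """Parameters of x1 | x2 for a partitioned Gaussian."""
    mu_c = mu1 + S12 @ np.linalg.inv(S22) @ (x2 - mu2)
    Sig_c = S11 - S12 @ np.linalg.inv(S22) @ S21
    return mu_c, Sig_c

mu_c, Sig_c = conditional(np.array([0.0]))
# mu_c = 1 + 0.8 * (0 - (-2)) = 2.6 ; Sig_c = 2 - 0.8 * 0.8 = 1.36
print(mu_c, Sig_c)
```

Note that the conditional covariance does not depend on the observed value of $x_2$, only on the blocks of $\Sigma$ — a property the E-step below relies on.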
4 A factor analysis example

A simple example introduces the idea behind factor analysis.

The essence of factor analysis is to posit that the $m$ training examples with $n$ features each, $x^{(1)}, \dots, x^{(m)}$, are generated as follows:

1. First, $m$ points $z^{(i)}$ ($k$-dimensional vectors) are drawn from a multivariate Gaussian in a $k$-dimensional space, i.e. $z \sim \mathcal{N}(0, I)$.

2. Then a transformation matrix $\Lambda \in \mathbb{R}^{n\times k}$ maps each $z^{(i)}$ into the $n$-dimensional space, giving $\Lambda z^{(i)}$. Since $z$ has mean 0, the mapped points still have mean 0.

3. Next, a mean $\mu$ ($n$-dimensional) is added, giving $\mu + \Lambda z^{(i)}$. The meaning is to move the transformed points $\Lambda z^{(i)}$ ($n$-dimensional vectors) to the center $\mu$ of the sample $x$.

4. Since the real examples $x^{(i)}$ deviate from what the model above generates, we further add a noise term $\epsilon$ ($n$-dimensional), where $\epsilon$ itself follows a multivariate Gaussian:

$$\epsilon \sim \mathcal{N}(0, \Psi)$$

5. The final result is taken to be the real training example, with generating equation

$$x^{(i)} = \mu + \Lambda z^{(i)} + \epsilon$$
Let us interpret this process visually. Suppose we have $m = 5$ two-dimensional sample points $x^{(i)}$ (two features), as follows:

![](https://oscdn.geek-share.com/Uploads/Images/Content/201909/24/990fed8541a0a9661d1bdc87ab118805.png)

In the factor-analysis view, these sample points were generated as follows:

1. We first posit that in a one-dimensional space (here $k = 1$) there are $m$ points $z^{(i)}$ drawn from a normal distribution with mean 0 and variance 1, as follows:

![](https://oscdn.geek-share.com/Uploads/Images/Content/201909/24/0bd86000d785efe51c101d5e5a00d359.png)

2. Then some $\Lambda$ maps the one-dimensional $z$ into two dimensions, shown graphically as:

![](https://oscdn.geek-share.com/Uploads/Images/Content/201909/24/2a946f0609381fc99953eb9c63921501.png)

3. Next, $\mu$ is added: every point's first coordinate moves by $\mu_1$ and its second coordinate by $\mu_2$, shifting the line to a position where it passes through $\mu$; the origin of the original axis now sits at $\mu$ (the red point).

![](https://oscdn.geek-share.com/Uploads/Images/Content/201909/24/dcc39997a039b0d2a094309ede659110.png)

Real sample points cannot be this regular, however; they deviate somewhat from the model, so we perturb the points generated in the previous step with noise $\epsilon \sim \mathcal{N}(0, \Psi)$.

4. After adding the perturbation, we obtain the black sample points $x^{(i)}$, as follows:

![](https://oscdn.geek-share.com/Uploads/Images/Content/201909/24/b4029553f79f5b60d9e5d30cff9f7fee.png)

5. Since $z$ and $\epsilon$ both have mean 0, $\mu$ is also the mean of the original sample points (the black points).

From this intuitive picture, factor analysis holds that high-dimensional sample points are actually generated from low-dimensional ones by a Gaussian draw, a linear transformation, and a noise perturbation — so the high-dimensional data can be represented in fewer dimensions.
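The five generation steps above can be simulated directly; a sketch with NumPy, where the parameter values ($k = 1$, $n = 2$, and the particular $\Lambda$, $\mu$, $\Psi$) are made up:

```python
import numpy as np

rng = np.random.default_rng(2)
m, k, n = 5, 1, 2

Lam = np.array([[2.0], [1.0]])     # n x k transformation matrix
mu = np.array([5.0, 3.0])          # n-dimensional mean
Psi = np.diag([0.1, 0.1])          # diagonal noise covariance

Z = rng.normal(size=(m, k))                        # step 1: z ~ N(0, I)
X = Z @ Lam.T                                      # step 2: map to n dims
X = X + mu                                         # step 3: shift to mean
X = X + rng.multivariate_normal(np.zeros(n), Psi, size=m)  # step 4: noise
# step 5: X now plays the role of the observed training examples x^(i)
print(X.shape)      # (5, 2)
```

With small noise, the rows of `X` scatter tightly around the line $\mu + \Lambda z$, reproducing the picture above.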
5 The factor analysis model

The process above generates the observed sample points from a latent random variable $z$ via a transformation and a noise perturbation. Here $z$ is called the factor, and it is low-dimensional.

Writing the model out again:

$$z \sim \mathcal{N}(0, I)$$

$$\epsilon \sim \mathcal{N}(0, \Psi)$$

$$x = \mu + \Lambda z + \epsilon$$

where the noise $\epsilon$ and $z$ are independent.

The formulation of factor analysis used below is the matrix notation; the references give some alternative formulations, which may help if the matrix notation is unclear.

The matrix formulation takes $z$ and $x$ to be jointly multivariate Gaussian:

$$\begin{bmatrix} z \\ x \end{bmatrix} \sim \mathcal{N}(\mu_{zx}, \Sigma)$$

To find $\mu_{zx}$ we first need $E[x]$:

$$E[x] = E[\mu + \Lambda z + \epsilon] = \mu + \Lambda E[z] + E[\epsilon] = \mu$$

using $E[z] = 0$ and $E[\epsilon] = 0$. Therefore

$$\mu_{zx} = \begin{bmatrix} \vec{0} \\ \mu \end{bmatrix}$$

The next step is to compute

$$\Sigma = \begin{bmatrix} \Sigma_{zz} & \Sigma_{zx} \\ \Sigma_{xz} & \Sigma_{xx} \end{bmatrix}$$

where $\Sigma_{zz} = \mathrm{Cov}(z) = I$.

Next we find $\Sigma_{zx}$:

$$E\!\left[(z - E[z])(x - E[x])^T\right] = E\!\left[z(\mu + \Lambda z + \epsilon - \mu)^T\right] = E[zz^T]\Lambda^T + E[z\epsilon^T] = \Lambda^T$$

This step uses the independence of $z$ and $\epsilon$ (so $E[z\epsilon^T] = E[z]E[\epsilon]^T = 0$) and treats $\Lambda$ as a known constant.

Next we find $\Sigma_{xx}$:

$$E\!\left[(x-\mu)(x-\mu)^T\right] = E\!\left[(\Lambda z + \epsilon)(\Lambda z + \epsilon)^T\right] = \Lambda E[zz^T]\Lambda^T + E[\epsilon\epsilon^T] = \Lambda\Lambda^T + \Psi$$

So the joint distribution in its final form is

$$\begin{bmatrix} z \\ x \end{bmatrix} \sim \mathcal{N}\!\left(\begin{bmatrix} \vec{0} \\ \mu \end{bmatrix},\ \begin{bmatrix} I & \Lambda^T \\ \Lambda & \Lambda\Lambda^T + \Psi \end{bmatrix}\right)$$

From this we can read off the marginal distribution of $x$:

$$x \sim \mathcal{N}(\mu,\ \Lambda\Lambda^T + \Psi)$$

The maximum-likelihood objective for the sample $\{x^{(i)}\}$ is then

$$\ell(\mu, \Lambda, \Psi) = \log\prod_{i=1}^{m} \frac{1}{(2\pi)^{n/2}\,|\Lambda\Lambda^T+\Psi|^{1/2}} \exp\!\left(-\tfrac{1}{2}\left(x^{(i)}-\mu\right)^T\left(\Lambda\Lambda^T+\Psi\right)^{-1}\left(x^{(i)}-\mu\right)\right)$$

Can't we just take partial derivatives with respect to each parameter and solve? Unfortunately, no closed form exists. That makes sense: if there were one, why would we bother putting $z$ and $x$ together in a joint distribution at all? Following our earlier treatment of parameter estimation, with a latent variable $z$ we can consider using EM.
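The marginal covariance $\Lambda\Lambda^T + \Psi$ can be sanity-checked by sampling from the generative model; a sketch with made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
Lam = np.array([[2.0], [1.0]])
mu = np.array([5.0, 3.0])
Psi = np.diag([0.3, 0.2])

m = 200_000
Z = rng.normal(size=(m, 1))
E = rng.multivariate_normal(np.zeros(2), Psi, size=m)
X = mu + Z @ Lam.T + E

# the empirical covariance should approach Lambda Lambda^T + Psi
print(np.cov(X.T))
print(Lam @ Lam.T + Psi)
```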
6 EM estimation for factor analysis

First let us be clear about the parameters: $z$ is the latent variable, and $\mu$, $\Lambda$, $\Psi$ are the parameters to be estimated.

Recall the two steps of EM:

Repeat until convergence {
(E-step) For each $i$, compute $Q_i(z^{(i)}) = p(z^{(i)} \mid x^{(i)}; \mu, \Lambda, \Psi)$.
(M-step) Maximize $\sum_i \int_{z^{(i)}} Q_i(z^{(i)}) \log \dfrac{p(x^{(i)}, z^{(i)}; \mu, \Lambda, \Psi)}{Q_i(z^{(i)})}\, dz^{(i)}$ over the parameters.
}

(E-step):

$$Q_i(z^{(i)}) = p(z^{(i)} \mid x^{(i)}; \mu, \Lambda, \Psi)$$

By the conditional-distribution discussion of Section 3,

$$z^{(i)} \mid x^{(i)}; \mu, \Lambda, \Psi \sim \mathcal{N}\!\left(\mu_{z^{(i)}|x^{(i)}},\ \Sigma_{z^{(i)}|x^{(i)}}\right)$$

so

$$\mu_{z^{(i)}|x^{(i)}} = \Lambda^T\left(\Lambda\Lambda^T+\Psi\right)^{-1}\left(x^{(i)}-\mu\right)$$

$$\Sigma_{z^{(i)}|x^{(i)}} = I - \Lambda^T\left(\Lambda\Lambda^T+\Psi\right)^{-1}\Lambda$$

and by the multivariate Gaussian density,

$$Q_i(z^{(i)}) = \frac{1}{(2\pi)^{k/2}\,|\Sigma_{z^{(i)}|x^{(i)}}|^{1/2}} \exp\!\left(-\tfrac{1}{2}\left(z^{(i)}-\mu_{z^{(i)}|x^{(i)}}\right)^T \Sigma_{z^{(i)}|x^{(i)}}^{-1}\left(z^{(i)}-\mu_{z^{(i)}|x^{(i)}}\right)\right)$$

(M-step):

Writing out the objective to be maximized directly,

$$\sum_{i=1}^{m}\int_{z^{(i)}} Q_i(z^{(i)}) \log\frac{p\!\left(x^{(i)}, z^{(i)}; \mu, \Lambda, \Psi\right)}{Q_i(z^{(i)})}\, dz^{(i)}$$

where the parameters to estimate are $\mu$, $\Lambda$, $\Psi$. Below we focus on deriving the update formula for $\Lambda$.

First simplify the objective:

$$\sum_{i=1}^{m}\int_{z^{(i)}} Q_i(z^{(i)}) \left[\log p\!\left(x^{(i)}\mid z^{(i)}; \mu,\Lambda,\Psi\right) + \log p(z^{(i)}) - \log Q_i(z^{(i)})\right] dz^{(i)} = \sum_{i=1}^{m} E_{z^{(i)}\sim Q_i}\!\left[\log p\!\left(x^{(i)}\mid z^{(i)}; \mu,\Lambda,\Psi\right) + \log p(z^{(i)}) - \log Q_i(z^{(i)})\right]$$

Here $z^{(i)}\sim Q_i$ indicates that the expectation is taken with $z^{(i)}$ distributed according to $Q_i$. Dropping the terms unrelated to $\Lambda$ (the last two), we get

$$\sum_{i=1}^{m} E\!\left[\log p\!\left(x^{(i)}\mid z^{(i)}; \mu,\Lambda,\Psi\right)\right] = \sum_{i=1}^{m} E\!\left[-\log\!\left((2\pi)^{n/2}|\Psi|^{1/2}\right) - \tfrac{1}{2}\left(x^{(i)}-\mu-\Lambda z^{(i)}\right)^T\Psi^{-1}\left(x^{(i)}-\mu-\Lambda z^{(i)}\right)\right]$$

Dropping the first term, which also does not involve $\Lambda$, and differentiating with respect to $\Lambda$:

$$\nabla_\Lambda \sum_{i=1}^{m} E\!\left[-\tfrac{1}{2}\left(x^{(i)}-\mu-\Lambda z^{(i)}\right)^T\Psi^{-1}\left(x^{(i)}-\mu-\Lambda z^{(i)}\right)\right] = \sum_{i=1}^{m} E\!\left[-\Psi^{-1}\Lambda z^{(i)}z^{(i)T} + \Psi^{-1}\left(x^{(i)}-\mu\right)z^{(i)T}\right]$$

The first to second step uses $\mathrm{tr}\,a = a$ (for real scalars $a$) and $\mathrm{tr}\,AB = \mathrm{tr}\,BA$; the last step uses $\nabla_A \mathrm{tr}\,ABA^TC = CAB + C^TAB^T$. (Here tr is the sum of the diagonal entries of a matrix.)

Setting the derivative to zero and simplifying gives

$$\sum_{i=1}^{m} \Lambda\, E_{z^{(i)}\sim Q_i}\!\left[z^{(i)}z^{(i)T}\right] = \sum_{i=1}^{m} \left(x^{(i)}-\mu\right) E_{z^{(i)}\sim Q_i}\!\left[z^{(i)}\right]^T$$

and solving for $\Lambda$,

$$\Lambda = \left(\sum_{i=1}^{m}\left(x^{(i)}-\mu\right)E_{z^{(i)}\sim Q_i}\!\left[z^{(i)}\right]^T\right)\left(\sum_{i=1}^{m}E_{z^{(i)}\sim Q_i}\!\left[z^{(i)}z^{(i)T}\right]\right)^{-1} \tag{7}$$

This formula looks familiar: it resembles the matrix form of least squares from our earlier discussion of regression,

$$\theta^T = \vec{y}^{\,T}X\left(X^TX\right)^{-1}$$

Here is the reason for the similarity: our $x$ is a linear function of $z$ (plus some noise). After the E-step gives an estimate of $z$, the $\Lambda$ we seek is precisely the linear relationship between $x$ and $z$ — just as least squares finds the linear relationship between features and targets.

We are not done yet: we still need the quantities inside the brackets. From our earlier definition of $z \mid x$, we know

$$E_{z^{(i)}\sim Q_i}\!\left[z^{(i)}\right] = \mu_{z^{(i)}|x^{(i)}}$$

$$E_{z^{(i)}\sim Q_i}\!\left[z^{(i)}z^{(i)T}\right] = \mu_{z^{(i)}|x^{(i)}}\mu_{z^{(i)}|x^{(i)}}^T + \Sigma_{z^{(i)}|x^{(i)}}$$

The first follows from the conditional distribution of $z$; the second from the identity $\mathrm{Cov}(Y) = E[YY^T] - E[Y]E[Y]^T$. Substituting these results into (7) gives

$$\Lambda = \left(\sum_{i=1}^{m}\left(x^{(i)}-\mu\right)\mu_{z^{(i)}|x^{(i)}}^T\right)\left(\sum_{i=1}^{m}\mu_{z^{(i)}|x^{(i)}}\mu_{z^{(i)}|x^{(i)}}^T + \Sigma_{z^{(i)}|x^{(i)}}\right)^{-1}$$

So we have the iteration formula for $\Lambda$. Note that $E[z]E[z]^T$ and $E[zz^T]$ differ: the latter also requires the covariance of $z$.

The other parameters iterate as follows. The mean

$$\mu = \frac{1}{m}\sum_{i=1}^{m} x^{(i)}$$

does not change across iterations. For $\Psi$, first compute

$$\Phi = \frac{1}{m}\sum_{i=1}^{m}\left[\left(x^{(i)}-\mu\right)\left(x^{(i)}-\mu\right)^T - \left(x^{(i)}-\mu\right)\mu_{z^{(i)}|x^{(i)}}^T\Lambda^T - \Lambda\mu_{z^{(i)}|x^{(i)}}\left(x^{(i)}-\mu\right)^T + \Lambda\left(\mu_{z^{(i)}|x^{(i)}}\mu_{z^{(i)}|x^{(i)}}^T + \Sigma_{z^{(i)}|x^{(i)}}\right)\Lambda^T\right]$$

then extract the diagonal entries of $\Phi$ into the corresponding entries of $\Psi$ (that is, set $\Psi_{ii} = \Phi_{ii}$), which yields the diagonal $\Psi$.
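The E- and M-steps above can be collected into a small iterative sketch (NumPy; the dimensions, synthetic data, and fixed iteration count are all made up for illustration):

```python
import numpy as np

def factor_analysis_em(X, k, n_iter=100, seed=0):
    """EM for the model x = mu + Lambda z + eps, z ~ N(0,I), eps ~ N(0,Psi)."""
    m, n = X.shape
    rng = np.random.default_rng(seed)
    mu = X.mean(axis=0)                    # fixed across iterations
    Xc = X - mu
    Lam = rng.normal(size=(n, k))
    Psi = np.diag(np.diag(Xc.T @ Xc / m))  # diagonal noise covariance

    for _ in range(n_iter):
        # E-step: posterior of z given each x (same covariance for all i)
        G = np.linalg.inv(Lam @ Lam.T + Psi)      # (Lam Lam^T + Psi)^-1
        Mz = Xc @ G @ Lam                         # rows are mu_{z|x}^T, m x k
        Sz = np.eye(k) - Lam.T @ G @ Lam          # Sigma_{z|x}, k x k

        # M-step
        Ezz = m * Sz + Mz.T @ Mz                  # sum_i E[z z^T]
        Lam = (Xc.T @ Mz) @ np.linalg.inv(Ezz)    # Lambda update
        Phi = (Xc.T @ Xc - Xc.T @ Mz @ Lam.T
               - Lam @ Mz.T @ Xc + Lam @ Ezz @ Lam.T) / m
        Psi = np.diag(np.diag(Phi))               # keep only the diagonal
    return mu, Lam, Psi

# generate data from a known model and fit it back
rng = np.random.default_rng(4)
true_Lam = np.array([[2.0], [1.0], [-1.0]])
Z = rng.normal(size=(1000, 1))
X = Z @ true_Lam.T + rng.normal(scale=0.3, size=(1000, 3))
mu, Lam, Psi = factor_analysis_em(X, k=1)
# Lam Lam^T + Psi should approximate the sample covariance of X
```

Note that $\Lambda$ is only identified up to an orthogonal rotation of $z$ (with $k = 1$, up to sign), so the fit is checked through $\Lambda\Lambda^T + \Psi$ rather than $\Lambda$ itself.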
7 Summary

Given the EM procedure above, to run factor analysis on a sample X we only need to choose the number of factors to extract (the dimension of $z$). EM then yields the transformation matrix $\Lambda$ and the noise covariance $\Psi$.

Factor analysis is essentially dimensionality reduction: once the parameters are estimated, $z$ can be computed, but the meaning of its individual components is left to one's own interpretation.
A few passages excerpted from a set of slides further explain factor analysis.

Factor analysis is a data-reduction technique. By studying the interdependencies among many variables, it seeks the basic structure underlying the observed data and represents it with a small number of hypothetical variables. These hypothetical variables capture the main information carried by the original variables. The original variables are observable, manifest variables; the hypothetical variables are unobservable, latent variables, called factors.

For example, in studies of corporate or brand image, consumers can rate a department store through an evaluation system of 24 indicators covering 24 aspects of its quality. But consumers mainly care about three things: the store's environment, its service, and its prices. Factor analysis can take the 24 variables and find three latent factors reflecting store environment, service level, and price, giving an overall evaluation of the store. The three common factors can be expressed as

$$x_i = \mu_i + a_{i1}F_1 + a_{i2}F_2 + a_{i3}F_3 + \varepsilon_i,\qquad i = 1, \dots, 24$$

Here $x_i$ is the $i$-th component of the example $x$, $\mu_i$ is the $i$-th component of $\mu$, $a_{ij}$ is the entry in row $i$, column $j$ of $\Lambda$, $F_j$ is the $j$-th component of $z$, and $\varepsilon_i$ is the $i$-th component of $\epsilon$.

$F_1$, $F_2$, $F_3$ are the unobservable latent factors. The 24 variables share these three factors, but each variable also has its own individuality; the part not captured by the factors, $\varepsilon_i$, is called the specific factor.
Notes:

Factor analysis differs from regression analysis: the factors in factor analysis are rather abstract concepts, whereas regression factors have a very concrete practical meaning.

Principal component analysis also differs from factor analysis: PCA is merely a variable transformation, whereas factor analysis requires constructing a factor model.

Principal component analysis: the new composite variables (the principal components) are expressed as linear combinations of the original variables.

Factor analysis: the original variables are expressed as linear combinations of latent hypothetical variables and random influence variables.

Slides: http://www.math.zju.edu.cn/webpagenew/uploadfiles/attachfiles/2008123195228555.ppt

Other references worth consulting:

An Introduction to Probabilistic Graphical Models, by Jordan, Chapter 14

On the difference between PCA and factor analysis: http://cos.name/old/view.php?tid=10&id=82