Topic Model Gibbs Sampling Inference Steps
2014-12-30 21:09
1. Difference between hidden variables and hyperparameters
2. Procedure
step 1: write the complete-data likelihood, given the hyperparameters:
p(w, z, theta, pi | alpha, beta)
step 2: write the observed-data likelihood, given the hidden variables:
p(w | theta, pi)
step 3: determine which hidden variables can be integrated out, i.e. collapsed. Here theta and pi can be integrated out, so the Gibbs sampler targets p(z | w).
step 4: apply Bayes' rule to obtain the full conditional distribution:
p(z_i | z_-i, w) = p(z, w) / sum_{z_i} p(z, w)
step 5: by the equation above, all that remains is to compute the joint distribution p(z, w).
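For step 5, assuming an LDA-style model in which theta are per-document topic proportions with a Dirichlet(alpha) prior and pi are per-topic word distributions with a Dirichlet(beta) prior (a common setup; the post does not spell out the model), conjugacy lets the joint be written in closed form:

p(z, w | alpha, beta) = p(z | alpha) p(w | z, beta)
                      = prod_d [ B(n_{d,.} + alpha) / B(alpha) ] * prod_k [ B(n_{k,.} + beta) / B(beta) ]

where n_{d,k} is the number of tokens in document d assigned to topic k, n_{k,w} is the number of tokens of word w assigned to topic k, and B(.) is the multivariate Beta function. Plugging this into step 4's ratio, most factors cancel and the full conditional reduces to

p(z_i = k | z_-i, w) ∝ (n_{d,k}^{-i} + alpha_k) * (n_{k,w_i}^{-i} + beta_{w_i}) / (n_{k,.}^{-i} + sum_v beta_v)

where the superscript -i means the counts exclude token i's current assignment.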
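The five steps above can be sketched as a collapsed Gibbs sampler. This is a minimal illustrative implementation, assuming the standard LDA counts-based full conditional (symmetric alpha and beta; the function name and interface are my own, not from the post):

```python
import numpy as np

def lda_gibbs(docs, n_topics, n_words, alpha=0.1, beta=0.01, n_iter=200, seed=0):
    """Collapsed Gibbs sampler for p(z | w) in LDA.

    docs: list of documents, each a list of integer word ids.
    theta (doc-topic) and the topic-word distributions are integrated
    out (step 3), so only the topic assignments z are sampled.
    """
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), n_topics))  # doc-topic counts n_{d,k}
    nkw = np.zeros((n_topics, n_words))    # topic-word counts n_{k,w}
    nk = np.zeros(n_topics)                # tokens per topic n_{k,.}

    # random initialization of z
    z = []
    for d, doc in enumerate(docs):
        zd = rng.integers(0, n_topics, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            ndk[d, k] += 1
            nkw[k, w] += 1
            nk[k] += 1

    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # remove token i's assignment: these are the "-i" counts
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # full conditional p(z_i = k | z_-i, w) from step 4
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_words * beta)
                p /= p.sum()
                k = rng.choice(n_topics, p=p)
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return z, ndk, nkw
```

After sampling, point estimates of theta and the topic-word distributions can be recovered from the counts, e.g. theta_hat[d, k] = (ndk[d, k] + alpha) / (len(docs[d]) + n_topics * alpha).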