2.1 Monte Carlo Integration
2016-02-26 18:19
The application of probabilistic models to data often leads to inference problems that require integration over complex, high-dimensional distributions. MCMC (Markov chain Monte Carlo) is a computational approach that replaces analytic integration with sampling, and many otherwise intractable problems become solvable with some form of MCMC. In this chapter, we will discuss two forms of MCMC: Metropolis-Hastings and Gibbs sampling.
Many problems in probabilistic inference require the calculation of complex integrals or summations over very large outcome spaces.
Consider the expectation of a function $g(x)$ under a distribution $p(x)$: $E(g(x))=\int g(x)p(x)dx$ or $E(g(x))=\sum_{x} g(x)p(x)$. When $g(x)=x$, this is the mean of the random variable $x$ under $p(x)$. The general idea of Monte Carlo integration is to use samples to approximate the expectation under a complex distribution. That is:
(1) Draw $N$ independent samples $x^t,~t=1,2,\dots,N$ from $p(x)$.
(2) Approximate the expectation $E(g(x))=\int g(x)p(x)dx \approx \frac{1}{N}\sum_{t=1}^{N}g(x^t)$.
This replaces analytic integration with summation over a large set of samples. Generally, the approximation can be made as accurate as needed by increasing $N$.
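The two-step recipe above can be sketched in Python with NumPy (an illustrative translation of the procedure; the examples below use MATLAB). Here we estimate $E(g(x))$ for $g(x)=x^2$ with $x \sim \mathrm{Uniform}(0,1)$, whose analytic value is $1/3$:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_expectation(g, sampler, N=100_000):
    """Monte Carlo estimate of E[g(x)]."""
    samples = sampler(N)       # step (1): draw N independent samples from p(x)
    return g(samples).mean()   # step (2): average g(x^t) over the samples

# Example: E[x^2] for x ~ Uniform(0,1); the analytic value is 1/3.
est = mc_expectation(lambda x: x ** 2, lambda n: rng.uniform(0.0, 1.0, n))
```

With $N=100{,}000$ the estimate lands within a few thousandths of $1/3$; the standard error shrinks like $1/\sqrt{N}$.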
Homework:
I. Approximate the mean of a random variable under $Beta(\alpha, \beta)$ distribution with $\alpha=3,~\beta=4$ using Monte Carlo.
Note that the analytic solution is $\frac{\alpha}{\alpha+\beta} = \frac{3}{7} \approx 0.4286$. Here $g(x)=x$. The code is:
N = 100000;
sum(betarnd(3,4,1,N))/N  % Monte Carlo estimate of E(x)
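For readers without MATLAB, an equivalent sketch in Python/NumPy (the `rng.beta` call is NumPy's sampler, playing the role of `betarnd`):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
# Draw N samples from Beta(3, 4) and average them to estimate the mean.
samples = rng.beta(3, 4, N)
mean_est = samples.sum() / N   # should be close to 3/7
```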
II. Approximate the variance of a random variable under $Gamma(a,b)$ with $a=1.5,~b=4$ using Monte Carlo.
Note that the analytic solution is $ab^2 = 24$. Here $g(x)=(x-E(x))^2$, so we first estimate $E(x)$ by Monte Carlo, then compute $D(x)=E(g(x))=\int (x-E(x))^2 p(x)dx \approx \frac{1}{N}\sum_{t=1}^{N}(x^t-E(x))^2$. The code is:
N = 100000; a = 1.5; b = 4;
samples = gamrnd(a,b,1,N);
Ex = sum(samples)/N;                 % Monte Carlo estimate of E(x)
Dx = (samples-Ex)*(samples-Ex)'/N    % Monte Carlo estimate of D(x)
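The same computation in Python/NumPy (note that `rng.gamma` takes a shape and a scale parameter, matching the shape/scale convention of `gamrnd` used here):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
a, b = 1.5, 4.0
samples = rng.gamma(a, b, N)            # shape a, scale b
Ex = samples.sum() / N                  # Monte Carlo estimate of E(x) = a*b
Dx = ((samples - Ex) ** 2).sum() / N    # Monte Carlo estimate of D(x) = a*b^2 = 24
```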