Hadoop YARN: running beyond virtual memory limits
2018-03-01 15:01
Problem description:
Recently, while using Sqoop to import data into HDFS, the job kept failing with errors like the following:
Container [pid=25705,containerID=container_1519886213201_0001_01_000047] is running beyond virtual memory limits. Current usage: 163.8 MB of 1 GB physical memory used; 2.6 GB of 2.1 GB virtual memory used. Killing container.
Container [pid=24331,containerID=container_1519886213201_0001_01_000046] is running beyond virtual memory limits. Current usage: 207.9 MB of 1 GB physical memory used; 2.6 GB of 2.1 GB virtual memory used. Killing container.
Solution:
A quick search turned up two ways to resolve this:
1. Set yarn.nodemanager.vmem-check-enabled to false, so YARN no longer enforces the virtual memory check;
2. Raise yarn.scheduler.minimum-allocation-mb (default: 1024 MB), or increase yarn.nodemanager.vmem-pmem-ratio (default: 2.1) to allow more virtual memory per unit of physical memory.
I went with the first approach, restarted the YARN service, resubmitted the job, and it ran successfully.
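The two options above can be sketched as yarn-site.xml properties on the NodeManager hosts (a minimal sketch; the property names are standard YARN settings, and the ratio value of 4 below is just an illustrative choice):

```xml
<!-- Option 1: disable the virtual memory check entirely -->
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>

<!-- Option 2 (alternative): raise the virtual-to-physical memory ratio
     from the default 2.1; with 4, a container with 1 GB of physical
     memory may use up to 4 GB of virtual memory before being killed -->
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4</value>
</property>
```

Either change requires a YARN restart to take effect.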