hadoop native
2015-08-19 11:03
http://blog.csdn.net/benben85/article/details/4161134
http://stackoverflow.com/questions/19943766/hadoop-unable-to-load-native-hadoop-library-for-your-platform-error-on-centos
mvn package -Dmaven.javadoc.skip=true -Pdist,native -DskipTests -Dtar
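This mvn invocation builds Hadoop's native libraries from source. A minimal sketch of the surrounding steps, assuming a CentOS-style host with yum (the package names and output path below are assumptions, not from the original answer):

```shell
# Hedged sketch: build Hadoop's native libraries from a source tarball.
# Package names are assumptions for a CentOS/yum host; adapt to your distro.
sudo yum install -y gcc gcc-c++ make cmake zlib-devel openssl-devel
# Hadoop 2.x additionally needs protobuf 2.5 (protoc) on the PATH before building.

tar -xzf hadoop-2.6.0-src.tar.gz
cd hadoop-2.6.0-src
mvn package -Dmaven.javadoc.skip=true -Pdist,native -DskipTests -Dtar
# The freshly built libraries should land under hadoop-dist/target/.../lib/native;
# copy them over the prebuilt ones (e.g. /opt/hadoop/lib/native) on each node.
```

Because the build links against the glibc of the build host, building on the same OS you deploy to avoids the GLIBC version mismatch described below.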
The answer depends... I just installed Hadoop 2.6 from tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:
/opt/hadoop/lib/native/libhadoop.so.1.0.0
And I know it is 64-bit:
[hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
        linux-vdso.so.1 =>  (0x00007fff43510000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)
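Incidentally, ldd does not report a library's word size directly; the ELF header does. A self-contained sketch that reads it (the library path is the one from this answer; adjust for your install):

```shell
# Read EI_CLASS (byte at offset 4 of an ELF file): 1 = 32-bit, 2 = 64-bit.
elf_bits() {
  case "$(od -An -tu1 -j4 -N1 "$1" | tr -d ' ')" in
    1) echo "32-bit" ;;
    2) echo "64-bit" ;;
    *) echo "not an ELF file?" ;;
  esac
}

# Path from the answer; only checked if it exists on this machine.
[ -e /opt/hadoop/lib/native/libhadoop.so.1.0.0 ] && \
  elf_bits /opt/hadoop/lib/native/libhadoop.so.1.0.0
```

Where available, `file libhadoop.so.1.0.0` prints the same information in one step.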
Unfortunately, I stupidly overlooked the answer staring me right in the face, as I was focused on "Is this library 32 or 64-bit?":
`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. I continued and did everything recommended in the other answers, providing the library path via the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the error gives you the hint (util.NativeCodeLoader):
15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
So, off to here to see what it does:
http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/
Ah, there is some debug-level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:
15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
And the answer is revealed in this snippet of the debug message (the same thing the previous ldd command 'tried' to tell me):
`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
What version of GLIBC do I have? Here's a simple trick to find out:
[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12
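To see exactly which GLIBC version tags a binary demands (and hence the minimum glibc it needs), you can scan its dynamic symbols. A hedged sketch; with a real library you would feed it the output of `objdump -T libhadoop.so.1.0.0` (assuming binutils is installed):

```shell
# Pull out every GLIBC_x.y tag and report the highest one (version-aware sort).
max_glibc() { grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -n1; }

# Sample symbol lines, shaped like objdump -T output:
printf 'libc.so.6 (GLIBC_2.2.5) memcpy\nlibc.so.6 (GLIBC_2.14) memcpy\n' | max_glibc
# -> GLIBC_2.14
```

If the reported tag is higher than your installed glibc (2.12 here), the dynamic linker will refuse to load the library, which is exactly the failure in the debug message above.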
So, I can't update my OS to glibc 2.14. The only solutions are to build the native libraries from source on my OS, or to suppress the warning and just ignore it for now. I opted to suppress the annoying warning for now (but do plan to build from source in the future) by using the same logging option we used to get the debug message, except now set to ERROR level:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.
answered Jun 18 at 23:45 by chromeeagle