Saving data from HDFS into HBase fails with NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
2017-10-12 00:11
Running the MapReduce job that saves HDFS data into HBase fails. The error is:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at com.hadoop3.hbaseapi.day03.Demo01_HdfsToHbase.main(Demo01_HdfsToHbase.java:73)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
The MapReduce program runs without any problem as long as it contains no HBase-related code, but as soon as HBase-related code is added it throws various HBase-related NoClassDefFoundError errors. The cause: the MR program does not pick up the HBase jars installed on the cluster.
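To confirm the jars really are missing, you can inspect what the `hadoop` launcher actually puts on its classpath (a sketch; it assumes the `hadoop` command is on your PATH):

```shell
# List every classpath entry on its own line and look for hbase jars.
# If nothing matches, the HBase classes are invisible to MR jobs.
hadoop classpath | tr ':' '\n' | grep -i hbase || echo "no hbase jars on classpath"
```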
Solution:
Add the HBase jars to the Hadoop classpath. In the Hadoop installation directory, open hadoop-env.sh and add:
export HADOOP_CLASSPATH=/home/hadoop/apps/hbase/lib/*
No restart is needed. Re-running hadoop jar mapreducehbase.jar hbase.TxHBase then succeeds.
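The fix above can be sketched as a shell session. The path is the one from this post; adjust it to your own installation. Where the `hbase` launcher is available, `export HADOOP_CLASSPATH=$(hbase classpath)` is a common alternative that pulls in exactly the jars HBase itself uses; for a purely programmatic fix, HBase also ships `TableMapReduceUtil.addDependencyJars(job)` in the job driver.

```shell
# Append the HBase library jars to Hadoop's extra classpath.
# The quoted glob is left for Hadoop's launcher to expand, not this shell.
export HADOOP_CLASSPATH="/home/hadoop/apps/hbase/lib/*"
echo "$HADOOP_CLASSPATH"

# Re-submit the job (no daemon restart required):
#   hadoop jar mapreducehbase.jar hbase.TxHBase
```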