java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
2016-08-30 09:11
The Spark application was packaged with sbt without bundling all of its dependencies into the jar. Submitting it to the cluster then failed with:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
	at SparkHbase$.main(SparkHbase.scala:34)
	at SparkHbase.main(SparkHbase.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
… 11 more
The exception occurs because the HBase dependencies are missing from the Spark application's classpath. My fix was to add the following line to spark/conf/spark-env.sh on the cluster:
export SPARK_CLASSPATH=/home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar
Be sure to separate the jar paths with colons! Then run:
source spark-env.sh
and restart the Spark services; the error goes away.
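Maintaining that long export line by hand is error-prone. As a sketch (the demo below uses a throwaway directory with dummy jar names so it can run anywhere; on a real cluster the directory would be something like /home/hadoop/SW/hbase/lib), the classpath can be built from a glob instead:

```shell
# Build a colon-separated classpath from every jar in a lib directory,
# instead of hand-maintaining one long export line.
LIBDIR=$(mktemp -d)                                   # stand-in for hbase/lib
touch "$LIBDIR/hbase-client.jar" "$LIBDIR/hbase-common.jar"

CP=""
for jar in "$LIBDIR"/*.jar; do
  CP="${CP:+$CP:}$jar"                                # append with a colon separator
done
export SPARK_CLASSPATH="$CP"
echo "$SPARK_CLASSPATH"
```

Note this picks up every jar in the directory, which may be more than the eight jars listed above; that is usually harmless for HBase's lib directory but is a trade-off to be aware of.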
Alternatively, set the driver's classpath at submit time with the --driver-class-path option:
./spark-submit --driver-class-path /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar --class com.dtxy.data.SqlTest ../lib/bigdata-1.0-SNAPSHOT.jar
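A third option is the spark.driver.extraClassPath property, which is what --driver-class-path sets under the hood (Spark's own error message later in this post recommends it over SPARK_CLASSPATH). A sketch of conf/spark-defaults.conf, reusing the example paths above:

```
spark.driver.extraClassPath   /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar
```

If the executors also touch HBase classes (e.g. when scanning tables in tasks), spark.executor.extraClassPath can be given the same value, assuming the jars exist at those paths on every worker.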
Note: do not configure SPARK_CLASSPATH in spark/conf/spark-env.sh and also pass --driver-class-path when submitting a job; doing both raises:
15/08/14 09:22:23 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
	at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
	at com.dtxy.data.SqlTest.main(SqlTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
	at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
	at com.dtxy.data.SqlTest.main(SqlTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO Utils: Shutdown hook called
With that, the problem is solved!
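The two-sources conflict above is easy to hit when different people maintain spark-env.sh and the submit scripts. A minimal sketch of a submit-time guard (the function name is illustrative, not part of Spark):

```shell
# Refuse to pass --driver-class-path when SPARK_CLASSPATH is already set,
# to avoid Spark's "Found both spark.driver.extraClassPath and
# SPARK_CLASSPATH. Use only the former." error.
check_classpath_conflict() {
  if [ -n "${SPARK_CLASSPATH:-}" ]; then
    echo "conflict: SPARK_CLASSPATH is set; drop --driver-class-path or unset it" >&2
    return 1
  fi
  return 0
}

# Demo: with the variable set, the guard reports a conflict.
SPARK_CLASSPATH=/tmp/example.jar
check_classpath_conflict && RESULT=ok || RESULT=conflict
echo "$RESULT"
```

A submit wrapper would call this before invoking spark-submit and fall back to one mechanism or the other.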
Reference: http://www.abcn.net/2014/07/lighting-spark-with-hbase-full-edition.html