Hive CLI fails to start with the error "create does not exist"
2017-08-10 20:28
The likely cause of this error is that one of the function definitions in the .hiverc file in your home directory (the file you configured to register permanent functions) cannot be created when the CLI starts.
The error output looks like this:
spark@spark03:~/app/apache-flume-1.6.0-bin$ hive
17/08/10 05:10:50 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
17/08/10 05:10:50 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
17/08/10 05:10:50 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
17/08/10 05:10:50 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
17/08/10 05:10:50 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
17/08/10 05:10:50 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
17/08/10 05:10:50 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
Logging initialized using configuration in jar:file:/home/spark/app/hive-0.12.0/lib/hive-common-0.12.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/spark/app/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/spark/app/hive-0.12.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/spark/app/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
create does not exist
Query returned non-zero code: 1, cause: create does not exist.
To fix it, delete the bad entry from the .hiverc file in your user home directory; if the file contains nothing important, just delete .hiverc entirely.
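The fix above can be sketched as follows. This is an illustration only: it uses a scratch directory instead of your real home directory, and the jar path, UDF name, and class name in the sample .hiverc are hypothetical, not taken from the original post. The Hive CLI runs every statement in ~/.hiverc at startup, so a statement it cannot execute aborts the session; moving the file aside (or editing out the offending line) lets hive start again.

```shell
# Scratch directory so this example is safe to run; in practice use $HOME.
HIVE_HOME_DIR="$(mktemp -d)"

# A .hiverc like this (hypothetical jar path and UDF) runs on every CLI start;
# if the jar is missing or a statement is malformed, startup fails.
cat > "$HIVE_HOME_DIR/.hiverc" <<'EOF'
add jar /home/spark/udfs/my-udfs.jar;
create temporary function my_lower as 'com.example.udf.MyLower';
EOF

# The fix: move the file aside (keeping a backup), then re-run hive.
mv "$HIVE_HOME_DIR/.hiverc" "$HIVE_HOME_DIR/.hiverc.bak"
ls -a "$HIVE_HOME_DIR"
```

If you only want to disable one line rather than the whole file, edit .hiverc and remove the offending statement; Hive will then start and you can test the statement interactively to see the real error.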