Hive Installation and Configuration in Detail
2015-10-04 08:54
This article demonstrates installing and configuring Hive in its default mode (embedded Derby metastore) on a single-node Hadoop setup. Outline:
Basic environment
Hive installation and configuration
Startup and demo
[1] Basic Environment
Mac OSX 10.9.1
Java 1.6+
Hadoop 2.2.0 (for the single-node setup, see: http://www.micmiu.com/opensource/hadoop/hadoop2x-single-node-setup/ )
Hive 0.12.0 (the latest release as of 2014-02-09)
[2] Hive Installation and Configuration
1. Download the release package
Download the latest release from the official site; 0.12.0 is used as the example here. In this article the package is unpacked under /usr/local/share/ and HIVE_HOME = "/usr/local/share/hive":
$ tar -zxf hive-0.12.0-bin.tar.gz -C /usr/local/share
$ cd /usr/local/share
$ ln -s hive-0.12.0-bin hive
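The version-numbered directory plus a `hive` symlink keeps upgrades simple: you later point the link at a new release and HIVE_HOME never changes. A quick sketch of the resulting layout, simulated in a throwaway temporary directory rather than the real install path:

```shell
# Simulate the install layout ($base stands in for /usr/local/share in this sketch).
base=$(mktemp -d)
mkdir "$base/hive-0.12.0-bin"
ln -s hive-0.12.0-bin "$base/hive"
readlink "$base/hive"   # prints: hive-0.12.0-bin
```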
2. Set environment variables
Edit ~/.profile (e.g. vi ~/.profile) and add the following:
#Hive @micmiu.com
export HIVE_HOME="/usr/local/share/hive"
export PATH=$HIVE_HOME/bin:$PATH
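Because $HIVE_HOME/bin is prepended, the shell finds Hive's scripts before anything else on the PATH. A minimal sketch of how the two export lines resolve (the directory is taken from the export above and need not exist for this demonstration):

```shell
# Replay the two export lines and inspect the first PATH entry.
export HIVE_HOME="/usr/local/share/hive"
export PATH=$HIVE_HOME/bin:$PATH
echo "$PATH" | cut -d: -f1   # prints: /usr/local/share/hive/bin
```

Remember to run `source ~/.profile` (or open a new shell) so the changes take effect.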
3. Configuration files
The <HIVE_HOME>/conf directory contains four template files:
hive-default.xml.template
hive-env.sh.template
hive-exec-log4j.properties.template
hive-log4j.properties.template
Copy them to create the four configuration files, which you can then customize. Note the target file names: the .template suffix is dropped, and hive-default.xml.template becomes hive-site.xml.
$ cd /usr/local/share/hive/conf
$ cp hive-default.xml.template hive-site.xml
$ cp hive-env.sh.template hive-env.sh
$ cp hive-exec-log4j.properties.template hive-exec-log4j.properties
$ cp hive-log4j.properties.template hive-log4j.properties
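The four cp commands follow one renaming rule: strip the .template suffix, except that hive-default.xml.template maps to hive-site.xml. A sketch of the same renaming written as a loop, run here against empty stand-in files in a temporary directory rather than the real <HIVE_HOME>/conf:

```shell
# Create stand-in template files in a scratch conf dir.
conf_dir=$(mktemp -d)
cd "$conf_dir"
touch hive-default.xml.template hive-env.sh.template \
      hive-exec-log4j.properties.template hive-log4j.properties.template

# Copy each template, dropping the .template suffix;
# hive-default.xml is the one exception and maps to hive-site.xml.
for f in *.template; do
  target=${f%.template}
  if [ "$target" = "hive-default.xml" ]; then
    target=hive-site.xml
  fi
  cp "$f" "$target"
done
ls | grep -v '\.template$'   # lists the four generated config files
```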
Since this article demonstrates the embedded Derby mode, the default settings in hive-site.xml work as-is; no parameters need to be modified.
However, the hive-default.xml.template shipped with the official 0.12.0 release contains a bug at line 2000, a malformed closing tag:
<value>auth</auth>
which must be corrected to:
<value>auth</value>
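One way to apply the fix without opening an editor is a sed substitution. The sketch below runs against a throwaway file containing the broken line; the same substitution (with -i.bak, which keeps a backup and works with both BSD and GNU sed) can be applied to the real hive-site.xml:

```shell
# Reproduce the broken line in a scratch file standing in for hive-site.xml.
f=$(mktemp)
printf '<value>auth</auth>\n' > "$f"

# Replace the mismatched closing tag; -i.bak edits in place, keeping a .bak copy.
sed -i.bak 's|<value>auth</auth>|<value>auth</value>|' "$f"
cat "$f"   # prints: <value>auth</value>
```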
Separately, there is the hive.metastore.schema.verification version check to handle; there are two solutions.
Option 1: edit the configuration file
Before the first run, set hive.metastore.schema.verification to false:
......
<property>
  <name>hive.metastore.schema.verification</name>
  <!-- set to false to skip schema verification -->
  <value>false</value>
</property>
......
Option 2: leave the configuration unchanged and initialize the metastore schema first.
Run the initialization command:
schematool -dbType derby -initSchema
micmiu-mbp:~ micmiu$ schematool -dbType derby -initSchema
Metastore connection URL: jdbc:derby:;databaseName=metastore_db;create=true
Metastore Connection Driver : org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User: APP
Starting metastore schema initialization to 0.12.0
Initialization script hive-schema-0.12.0.derby.sql
Initialization script completed
schemaTool completeted
Check the metastore information after initialization:
schematool -dbType derby -info
micmiu-mbp:~ micmiu$ schematool -dbType derby -info
Metastore connection URL: jdbc:derby:;databaseName=metastore_db;create=true
Metastore Connection Driver : org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User: APP
Hive distribution version: 0.12.0
Metastore schema version: 0.12.0
schemaTool completeted
For details see: https://cwiki.apache.org/confluence/display/Hive/Hive+Schema+Tool
Either approach works; without one of them, the first run fails with an error like the following:
ERROR exec.DDLTask (DDLTask.java:execute(435)) - org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1143)
......
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1212)
......
Caused by: java.lang.reflect.InvocationTargetException
......
Caused by: MetaException(message:Version information not found in metastore. )
at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:5638)
at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:5622)
......
4. Configure HDFS directories and permissions
$ hdfs dfs -mkdir /tmp
$ hdfs dfs -mkdir -p /user/hive/warehouse
$ hdfs dfs -chmod g+w /tmp
$ hdfs dfs -chmod g+w /user/hive/warehouse
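The chmod g+w calls give the group write access so that other Hive users can create tables under the warehouse directory. The HDFS commands themselves need a running cluster, but the permission logic mirrors the local filesystem; a sketch of the same mkdir -p / chmod g+w pair applied to a temporary local directory:

```shell
# Local-filesystem analogue of the HDFS commands above
# ($d stands in for the HDFS root; real setup goes through `hdfs dfs`).
d=$(mktemp -d)
mkdir -p "$d/user/hive/warehouse"
chmod g+w "$d/user/hive/warehouse"
ls -ld "$d/user/hive/warehouse" | cut -c6   # prints: w  (group write bit set)
```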
[3] Running and Testing
Make sure HADOOP_HOME is set in the environment, then run Hive in CLI (command line interface) mode by simply executing the hive command, and try a few test commands:
hive> show databases;
OK
default
Time taken: 4.966 seconds, Fetched: 1 row(s)
hive> show tables;
OK
Time taken: 0.186 seconds
hive> CREATE TABLE micmiu_blog (id INT, siteurl STRING);
OK
Time taken: 0.359 seconds
hive> SHOW TABLES;
OK
micmiu_blog
Time taken: 0.023 seconds, Fetched: 1 row(s)
hive>
At this point the Hive installation and configuration in embedded Derby mode has completed successfully.
References:
https://cwiki.apache.org/confluence/display/Hive/Home
https://cwiki.apache.org/confluence/display/Hive/GettingStarted
https://cwiki.apache.org/confluence/display/Hive/Hive+Schema+Tool
https://cwiki.apache.org/confluence/display/Hive/AdminManual+MetastoreAdmin
—————– EOF @Michael Sun —————–