Integrating HBase with Hive: Reading HBase Data from Hive
2017-10-10 09:18
1. Symlink the required HBase jars into Hive's lib directory.
Hive needs the following jars:
hive-hbase-handler-0.13.1-cdh5.3.6.jar
zookeeper-3.4.5-cdh5.3.6.jar
guava-12.0.1.jar -- delete the older guava jar already under hive/lib/ and cp the one from HBase's lib directory in its place (i.e. watch out for jar version conflicts)
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/hbase-server-0.98.6-cdh5.3.6.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/hbase-server-0.98.6-cdh5.3.6.jar
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/hbase-client-0.98.6-cdh5.3.6.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/hbase-client-0.98.6-cdh5.3.6.jar
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/hbase-protocol-0.98.6-cdh5.3.6.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/hbase-protocol-0.98.6-cdh5.3.6.jar
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/hbase-it-0.98.6-cdh5.3.6.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/hbase-it-0.98.6-cdh5.3.6.jar
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/htrace-core-2.04.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/htrace-core-2.04.jar
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/hbase-hadoop2-compat-0.98.6-cdh5.3.6.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/hbase-hadoop2-compat-0.98.6-cdh5.3.6.jar
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/hbase-hadoop-compat-0.98.6-cdh5.3.6.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/hbase-hadoop-compat-0.98.6-cdh5.3.6.jar
ln -s /opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6/lib/high-scale-lib-1.1.1.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/high-scale-lib-1.1.1.jar
2. Edit hive-site.xml and add the ZooKeeper quorum property:
hbase.zookeeper.quorum
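As a minimal sketch, the property entry in hive-site.xml might look like the following; the hostnames are placeholders and must be replaced with the actual ZooKeeper quorum used by your HBase cluster:

```xml
<!-- hive-site.xml: tell the Hive HBase handler where ZooKeeper is.
     Hostnames below are placeholders, not real values. -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>zk-host1,zk-host2,zk-host3</value>
</property>
```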
3. Configure the environment variables:
export HBASE_HOME=/opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6
export HADOOP_HOME=/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6
export HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase mapredcp`:${HBASE_HOME}/conf
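To make these settings persistent rather than per-session, one option (an assumption, not shown in the original) is to add them to Hive's hive-env.sh so every Hive launch picks them up:

```shell
# Sketch of additions to $HIVE_HOME/conf/hive-env.sh; paths match
# the CDH 5.3.6 layout used throughout this article.
export HBASE_HOME=/opt/cdh-5.3.6/hbase-0.98.6-cdh5.3.6
export HADOOP_HOME=/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6
# `hbase mapredcp` prints the minimal HBase classpath for MapReduce jobs;
# appending HBase's conf dir makes hbase-site.xml visible as well.
export HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase mapredcp`:${HBASE_HOME}/conf
```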
4. Map the existing HBase dept table into Hive (use an external table: when the table already exists in HBase and you want to analyze it with SQL, you should create an EXTERNAL table in Hive):
CREATE EXTERNAL TABLE hbase_dept(
deptno string,
dname string,
loc string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:dname,info:loc")
TBLPROPERTIES ("hbase.table.name" = "dept");
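To verify the mapping end to end, one possible check (the sample rows below are made-up values, assuming the dept table has an info column family as the mapping above expects) is to put a row from the HBase shell and read it back through Hive:

```shell
# In the HBase shell: create the table and insert a sample row.
# The row key maps to deptno; info:dname and info:loc map to the
# remaining Hive columns.
hbase shell
> create 'dept', 'info'
> put 'dept', '10', 'info:dname', 'ACCOUNTING'
> put 'dept', '10', 'info:loc', 'NEW YORK'
> exit

# From Hive: the row should now be queryable with ordinary SQL.
hive -e "SELECT deptno, dname, loc FROM hbase_dept;"
```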