Spark SQL and Hive integration: environment configuration
2017-06-06 10:59
Reposted from: teacher zx
########################################
alter database hive character set latin1;
-- note: MySQL has no wildcard ALTER TABLE ("hive.*" is invalid); alter each metastore table individually:
-- ALTER TABLE <table_name> DEFAULT CHARACTER SET latin1;
########################################
When installing MySQL on Windows, UTF-8 is often chosen as the character encoding, but the Hive metastore only works reliably with latin1. Alternatively, you can choose the encoding when creating the database.
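Since MySQL applies DEFAULT CHARACTER SET one table at a time, converting an existing metastore means issuing one ALTER TABLE per table. A minimal sketch that only generates the statements (the table list here is a small example; in practice feed in the output of SHOW TABLES IN hive):

```shell
# Generate per-table charset-conversion statements for the metastore.
# TBLS/COLUMNS_V2/PARTITIONS are example metastore table names.
TABLES="TBLS COLUMNS_V2 PARTITIONS"
STMTS=""
for t in $TABLES; do
    STMTS="$STMTS ALTER TABLE hive.$t DEFAULT CHARACTER SET latin1;"
done
# Pipe the result into the mysql client to actually apply it.
echo "$STMTS"
```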
1. Install Hive, then create a MySQL user for the metastore:
CREATE USER 'hive'@'%' IDENTIFIED BY '123456';
GRANT all privileges ON hive.* TO 'hive'@'%';
flush privileges;
2. Copy the configured hive-site.xml into the $SPARK_HOME/conf directory
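Step 2 above can be sketched as a small shell script. The SPARK_HOME and HIVE_HOME defaults below are assumptions; adjust them to your own install layout:

```shell
# Copy the configured hive-site.xml into Spark's conf directory.
# Default paths are assumptions; override via the environment.
SPARK_HOME="${SPARK_HOME:-/usr/local/spark}"
HIVE_HOME="${HIVE_HOME:-/usr/local/apache-hive-0.13.1-bin}"
SRC="$HIVE_HOME/conf/hive-site.xml"

if [ -f "$SRC" ]; then
    cp "$SRC" "$SPARK_HOME/conf/"
    RESULT="copied $SRC -> $SPARK_HOME/conf/"
else
    RESULT="not found: $SRC (set HIVE_HOME first)"
fi
echo "$RESULT"
```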
3. Specify the location of the MySQL connector driver when starting spark-shell:
bin/spark-shell \
--master spark://node1.itcast.cn:7077 \
--executor-memory 1g \
--total-executor-cores 2 \
--driver-class-path /usr/local/apache-hive-0.13.1-bin/lib/mysql-connector-java-5.1.35-bin.jar
4. Run HQL through sqlContext.sql:
sqlContext.sql("select * from spark.person limit 2")
Or use org.apache.spark.sql.hive.HiveContext:
import org.apache.spark.sql.hive.HiveContext
val hiveContext = new HiveContext(sc)
hiveContext.sql("select * from spark.person")
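The same HQL can also be run non-interactively through the spark-sql CLI (available when Spark is built with Hive support). The driver-jar path and the spark.person table are taken from the examples above; this is a sketch, not guaranteed for every Spark build:

```shell
# Run the example query through the spark-sql CLI instead of spark-shell.
if command -v spark-sql >/dev/null 2>&1; then
    spark-sql --driver-class-path /usr/local/apache-hive-0.13.1-bin/lib/mysql-connector-java-5.1.35-bin.jar \
              -e "select * from spark.person limit 2"
    STATUS="ran query"
else
    STATUS="spark-sql not on PATH; run it from \$SPARK_HOME/bin"
fi
echo "$STATUS"
```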
hive-site.xml:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://172.16.0.1:3306/hive?createDatabaseIfNotExist=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>123456</value>
<description>password to use against metastore database</description>
</property>
</configuration>
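Before copying hive-site.xml into $SPARK_HOME/conf, a quick well-formedness check can catch XML typos early. A sketch that writes a trimmed copy of the config above and parses it with python3's standard-library parser (python3 availability is an assumption):

```shell
# Sanity-check that a hive-site.xml fragment is well-formed XML.
cat > /tmp/hive-site-check.xml <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://172.16.0.1:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
</configuration>
EOF
if python3 -c "import xml.etree.ElementTree as ET; ET.parse('/tmp/hive-site-check.xml')" 2>/dev/null; then
    CHECK="well-formed"
else
    CHECK="parse error (or python3 missing)"
fi
echo "$CHECK"
```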