
Setting up a Hive test environment for Spark SQL

2016-03-31 15:59
sbt dependencies

name := "Pi"
version := "1.0"
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.2",
  "org.apache.spark" % "spark-hive_2.10" % "1.5.2",
  "joda-time" % "joda-time" % "2.9.2"
)

resolvers += "OS China" at "http://maven.oschina.net/content/groups/public/"


import org.apache.spark._
import org.apache.spark.sql.hive.HiveContext

/**
 * Created by code-pc on 16/3/14.
 */
object Pi {

  def main(args: Array[String]): Unit = {

    val conf = new SparkConf().setMaster("local[5]").setAppName("AndrzejApp")
    val sc = new SparkContext(conf)

    // HiveContext picks up hive-site.xml from the classpath; if none is
    // found, it creates a local metastore in the working directory.
    val hqlc = new HiveContext(sc)

    // Run a HiveQL statement and print the resulting rows
    val st = hqlc.sql("show tables")
    println("hello")
    st.collect().foreach(println)
  }
}
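
Once `show tables` runs, the same `HiveContext` accepts full HiveQL, so the setup can be verified end to end by creating and querying a table. A minimal sketch against the Spark 1.5 API used above; the table name `src` is illustrative, not from the original post, and running it requires the same spark-hive dependency:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveQuickCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[5]").setAppName("HiveQuickCheck")
    val sc = new SparkContext(conf)
    val hqlc = new HiveContext(sc)

    // Create a simple table if it does not exist yet (hypothetical name "src")
    hqlc.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")

    // Run an aggregate query and print the result rows
    hqlc.sql("SELECT COUNT(*) FROM src").collect().foreach(println)

    sc.stop()
  }
}
```

With an empty local metastore the count comes back as 0; the point is only that HiveQL DDL and queries both go through the embedded metastore without a full Hive installation.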

