
Spark Source-Reading Notes: Minimal Code to Run a Cluster

2017-01-23 15:01 · 393 views

Starting the Master

Master.startRpcEnvAndEndpoint creates an RpcEnv on the given host and RPC port and registers the Master endpoint on it; the second port serves the Master web UI. (Master and Worker are package-private in Spark, so these snippets have to live inside Spark's own source tree.)

import org.apache.spark.SparkConf
import org.apache.spark.deploy.master.Master

val conf = new SparkConf()
Master.startRpcEnvAndEndpoint("localhost", 60001, 60002, conf) // host, RPC port, web UI port
Thread.currentThread().join() // block the main thread so the Master keeps running


Starting Worker01

Each Worker registers with the Master via its spark:// URL and gets its own RPC and web UI ports, a core and memory allowance, a work directory, and a worker number.

import org.apache.spark.SparkConf
import org.apache.spark.deploy.worker.Worker

val conf = new SparkConf()
// host, RPC port, web UI port, cores, memory (MB), master URLs, work dir, worker number
Worker.startRpcEnvAndEndpoint("localhost", 61001, 61002, 2, 1024,
  Array("spark://localhost:60001"),
  "D:\\YCloud\\PRGs\\spark\\learning\\cluster\\worker01", Some(1), conf)
Thread.currentThread().join()


Starting Worker02

val conf = new SparkConf()
// same as Worker01, except for the ports, work directory, and worker number
Worker.startRpcEnvAndEndpoint("localhost", 62001, 62002, 2, 1024,
  Array("spark://localhost:60001"),
  "D:\\YCloud\\PRGs\\spark\\learning\\cluster\\worker02", Some(2), conf)
Thread.currentThread().join()


Submitting a Job

With the Master and both Workers up, a SparkContext pointing at spark://localhost:60001 can submit jobs through the low-level SparkContext.submitJob API.

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://localhost:60001")
  .setAppName("myApp")
  .set("spark.local.dir", "D:\\YCloud\\PRGs\\spark\\learning\\cluster\\context")
val context = new SparkContext(conf)
Thread.sleep(10000) // give the Workers time to launch and register executors
println("executors should be up")

val rdd = context.makeRDD(Array(1, 2, 3, 4)) // 2 Workers x 2 cores = 4 partitions by default
val processPartition = { (_: Iterator[Int]) => println("OK."); 0 } // runs once per partition
val partitions = Array(0, 1, 2, 3)           // run on all four partitions
val resultHandler = { (_: Int, _: Int) => }  // (partition index, result) callback; a no-op here
val resultFunc = { () => }                   // overall job result; unused here

// submit ten short jobs back to back
for (_ <- 1 to 10) {
  context.submitJob(rdd, processPartition, partitions, resultHandler, resultFunc)
  Thread.sleep(2000)
  println("end of job")
}
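For reference, SparkContext.submitJob returns a future immediately rather than blocking the way collect or count do. Its signature in the Spark source is roughly the following sketch (parameter comments are my reading, not quoted from the source):

```scala
// Sketch of the SparkContext.submitJob signature:
def submitJob[T, U, R](
    rdd: RDD[T],                        // the data to run over
    processPartition: Iterator[T] => U, // executed on each selected partition
    partitions: Seq[Int],               // which partition indices to run
    resultHandler: (Int, U) => Unit,    // driver-side callback: (partition index, result)
    resultFunc: => R                    // lazily evaluated overall result of the future
  ): SimpleFutureAction[R]
```

In the loop above the handler and result function are deliberately no-ops; the jobs exist only to watch the scheduler distribute tasks across the two Workers.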

// a job that never finishes: each task loops forever, printing a counter
val processPartition1 = { (_: Iterator[Int]) =>
  var i = 0L
  while (true) { println(i); Thread.sleep(1000); i += 1 }
  0
}
context.submitJob(rdd, processPartition1, partitions, resultHandler, resultFunc)

Thread.currentThread().join()


Spark Jobs web UI (screenshot)

Tags: spark cluster job 延云 ydb