Starting Spark
2016-06-11 12:58
357 views
First attempt: note that `start-dfs.sh` is a Hadoop script, not a Spark one, so it does not exist under Spark's `sbin` and the call fails:

```
[root@name01 conf]# pwd
/usr/local/spark/spark-1.4.0/conf
[root@name01 conf]# ../sbin/start-dfs.sh
bash: ../sbin/start-dfs.sh: No such file or directory
[root@name01 conf]# ls
docker.properties.template   log4j.properties.template    slaves           spark-defaults.conf           spark-env.sh
fairscheduler.xml.template   metrics.properties.template  slaves.template  spark-defaults.conf.template  spark-env.sh.template
[root@name01 conf]# ls -ltr
total 44
-rwxr-xr-x 1 hadoop hadoop 3318 Jun  2  2015 spark-env.sh.template
-rw-r--r-- 1 hadoop hadoop  507 Jun  2  2015 spark-defaults.conf.template
-rw-r--r-- 1 hadoop hadoop   80 Jun  2  2015 slaves.template
-rw-r--r-- 1 hadoop hadoop 5565 Jun  2  2015 metrics.properties.template
-rw-r--r-- 1 hadoop hadoop  632 Jun  2  2015 log4j.properties.template
-rw-r--r-- 1 hadoop hadoop  303 Jun  2  2015 fairscheduler.xml.template
-rw-r--r-- 1 hadoop hadoop  202 Jun  2  2015 docker.properties.template
-rw-r--r-- 1 root   root     80 Jun 10 20:17 slaves
-rwxr-xr-x 1 root   root   3927 Jun 10 20:42 spark-env.sh
-rw-r--r-- 1 root   root    507 Jun 10 21:38 spark-defaults.conf
[root@name01 conf]# cd ..
```
```
[root@name01 spark-1.4.0]# ls -ltr
total 668
drwxr-xr-x 2 hadoop hadoop   4096 Jun  2  2015 sbin
-rw-r--r-- 1 hadoop hadoop    134 Jun  2  2015 RELEASE
-rw-r--r-- 1 hadoop hadoop   3624 Jun  2  2015 README.md
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 R
drwxr-xr-x 6 hadoop hadoop   4096 Jun  2  2015 python
-rw-r--r-- 1 hadoop hadoop  22559 Jun  2  2015 NOTICE
-rw-r--r-- 1 hadoop hadoop  50902 Jun  2  2015 LICENSE
drwxr-xr-x 2 hadoop hadoop   4096 Jun  2  2015 lib
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 examples
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 ec2
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 data
-rw-r--r-- 1 hadoop hadoop 561149 Jun  2  2015 CHANGES.txt
drwxr-xr-x 2 hadoop hadoop   4096 Jun  2  2015 bin
drwxr-xr-x 2 hadoop hadoop   4096 Jun 10 21:38 conf
[root@name01 spark-1.4.0]# sbin/start-dfs.sh
bash: sbin/start-dfs.sh: No such file or directory
[root@name01 spark-1.4.0]# jps
6273 MainGenericRunner
3334 RunJar
3015 NodeManager
2600 DataNode
7016 Jps
2922 ResourceManager
2749 SecondaryNameNode
2510 NameNode
[root@name01 spark-1.4.0]# cd ..
[root@name01 spark]# cd spark-1.4.0
[root@name01 spark-1.4.0]# cd ..
```
```
[root@name01 spark]# cd spark-1.4.0
[root@name01 spark-1.4.0]# ls -ltr
total 668
drwxr-xr-x 2 hadoop hadoop   4096 Jun  2  2015 sbin
-rw-r--r-- 1 hadoop hadoop    134 Jun  2  2015 RELEASE
-rw-r--r-- 1 hadoop hadoop   3624 Jun  2  2015 README.md
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 R
drwxr-xr-x 6 hadoop hadoop   4096 Jun  2  2015 python
-rw-r--r-- 1 hadoop hadoop  22559 Jun  2  2015 NOTICE
-rw-r--r-- 1 hadoop hadoop  50902 Jun  2  2015 LICENSE
drwxr-xr-x 2 hadoop hadoop   4096 Jun  2  2015 lib
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 examples
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 ec2
drwxr-xr-x 3 hadoop hadoop   4096 Jun  2  2015 data
-rw-r--r-- 1 hadoop hadoop 561149 Jun  2  2015 CHANGES.txt
drwxr-xr-x 2 hadoop hadoop   4096 Jun  2  2015 bin
drwxr-xr-x 2 hadoop hadoop   4096 Jun 10 21:38 conf
[root@name01 spark-1.4.0]# cd sbin/
[root@name01 sbin]# ls -ltr
total 84
-rwxr-xr-x 1 hadoop hadoop 1012 Jun  2  2015 stop-thriftserver.sh
-rwxr-xr-x 1 hadoop hadoop 1175 Jun  2  2015 stop-slaves.sh
-rwxr-xr-x 1 hadoop hadoop 1478 Jun  2  2015 stop-slave.sh
-rwxr-xr-x 1 hadoop hadoop 1013 Jun  2  2015 stop-shuffle-service.sh
-rwxr-xr-x 1 hadoop hadoop 1041 Jun  2  2015 stop-mesos-dispatcher.sh
-rwxr-xr-x 1 hadoop hadoop 1123 Jun  2  2015 stop-master.sh
-rwxr-xr-x 1 hadoop hadoop 1002 Jun  2  2015 stop-history-server.sh
-rwxr-xr-x 1 hadoop hadoop 1386 Jun  2  2015 stop-all.sh
-rwxr-xr-x 1 hadoop hadoop 1792 Jun  2  2015 start-thriftserver.sh
-rwxr-xr-x 1 hadoop hadoop 1920 Jun  2  2015 start-slaves.sh
-rwxr-xr-x 1 hadoop hadoop 2817 Jun  2  2015 start-slave.sh
-rwxr-xr-x 1 hadoop hadoop 1212 Jun  2  2015 start-shuffle-service.sh
-rwxr-xr-x 1 hadoop hadoop 1555 Jun  2  2015 start-mesos-dispatcher.sh
-rwxr-xr-x 1 hadoop hadoop 1881 Jun  2  2015 start-master.sh
-rwxr-xr-x 1 hadoop hadoop 1480 Jun  2  2015 start-history-server.sh
-rwxr-xr-x 1 hadoop hadoop 1267 Jun  2  2015 start-all.sh
-rwxr-xr-x 1 hadoop hadoop 1176 Jun  2  2015 spark-daemons.sh
-rwxr-xr-x 1 hadoop hadoop 5184 Jun  2  2015 spark-daemon.sh
-rwxr-xr-x 1 hadoop hadoop 1609 Jun  2  2015 spark-config.sh
-rwxr-xr-x 1 hadoop hadoop 2749 Jun  2  2015 slaves.sh
[root@name01 sbin]# jps
6273 MainGenericRunner
3334 RunJar
3015 NodeManager
2600 DataNode
2922 ResourceManager
7036 Jps
2749 SecondaryNameNode
2510 NameNode
[root@name01 sbin]# start-master.sh
bash: start-master.sh: command not found
[root@name01 sbin]# pwd
/usr/local/spark/spark-1.4.0/sbin
[root@name01 sbin]# ls -ltr
total 84
[... same listing as above ...]
[root@name01 sbin]# start-slaves.sh
bash: start-slaves.sh: command not found
[root@name01 sbin]# source ~/.bash_profile
[root@name01 sbin]# start-slaves.sh
bash: start-slaves.sh: command not found
[root@name01 sbin]# start-master.sh
bash: start-master.sh: command not found
[root@name01 sbin]# cd hadoop/
bash: cd: hadoop/: No such file or directory
```
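The repeated "command not found" errors happen because the shell resolves a bare script name only through `$PATH`; being `cd`'d into `sbin/` does not help, since the current directory is not on root's PATH. A minimal sketch of that mechanism, with a temporary directory standing in for `$SPARK_HOME/sbin` (the real fix, shown in the next step, is a PATH entry picked up from `/etc/profile`):

```shell
# A fake start-master.sh in a temp dir standing in for $SPARK_HOME/sbin.
SPARK_SBIN=$(mktemp -d)
printf '#!/bin/sh\necho master started\n' > "$SPARK_SBIN/start-master.sh"
chmod +x "$SPARK_SBIN/start-master.sh"

# Bare name fails: the directory is not on PATH yet.
start-master.sh 2>/dev/null || echo "not found"   # prints "not found"

# What the profile edit effectively does: put the sbin directory on PATH.
export PATH="$PATH:$SPARK_SBIN"
start-master.sh                                   # prints "master started"
```

Running the script by explicit path (`./start-master.sh`) would also have worked without touching PATH.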
```
[root@name01 sbin]# source /etc/profile
[root@name01 sbin]# start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/spark-1.4.0/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-name01.out
[root@name01 sbin]# start-slaves.sh
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.4.0/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-name01.out
[root@name01 sbin]# jps
7408 Jps
6273 MainGenericRunner
3334 RunJar
7158 Master
3015 NodeManager
7335 Worker
2600 DataNode
2922 ResourceManager
2749 SecondaryNameNode
2510 NameNode
[root@name01 sbin]# spark-shell
16/06/10 21:52:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/10 21:52:27 INFO spark.SecurityManager: Changing view acls to: root
16/06/10 21:52:27 INFO spark.SecurityManager: Changing modify acls to: root
16/06/10 21:52:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/06/10 21:52:28 INFO spark.HttpServer: Starting HTTP Server
16/06/10 21:52:28 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/06/10 21:52:28 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:47704
16/06/10 21:52:28 INFO util.Utils: Successfully started service 'HTTP class server' on port 47704.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) Client VM, Java 1.8.0_20)
Type in expressions to have them evaluated.
Type :help for more information.
```
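A quick way to confirm the standalone daemons came up is to filter the `jps` output for the two Spark processes. Below, a few lines from the session above are inlined as a variable so the filter itself is reproducible; on the live machine you would pipe `jps` directly (`jps | grep -cE ' (Master|Worker)$'`):

```shell
# jps output copied from the session above (trimmed to a few lines).
jps_output='7158 Master
7335 Worker
2510 NameNode
2600 DataNode'

# Count the Spark standalone daemons; a healthy single-node setup
# shows exactly one Master and one Worker.
printf '%s\n' "$jps_output" | grep -cE ' (Master|Worker)$'   # prints 2
```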
```
16/06/10 21:53:07 WARN util.Utils: Your hostname, name01 resolves to a loopback address: 127.0.0.1; using 192.168.0.105 instead (on interface eth4)
16/06/10 21:53:07 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/06/10 21:53:07 INFO spark.SparkContext: Running Spark version 1.4.0
16/06/10 21:53:07 INFO spark.SecurityManager: Changing view acls to: root
16/06/10 21:53:07 INFO spark.SecurityManager: Changing modify acls to: root
16/06/10 21:53:07 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/06/10 21:53:09 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/06/10 21:53:09 INFO Remoting: Starting remoting
16/06/10 21:53:11 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.0.105:56051]
16/06/10 21:53:11 INFO util.Utils: Successfully started service 'sparkDriver' on port 56051.
16/06/10 21:53:11 INFO spark.SparkEnv: Registering MapOutputTracker
16/06/10 21:53:11 INFO spark.SparkEnv: Registering BlockManagerMaster
16/06/10 21:53:12 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-f8c38598-453f-44cd-ac29-19d3749ac919/blockmgr-88e4f13f-37fe-49e2-85d0-5beba92e9907
16/06/10 21:53:12 INFO storage.MemoryStore: MemoryStore started with capacity 267.3 MB
16/06/10 21:53:13 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-f8c38598-453f-44cd-ac29-19d3749ac919/httpd-a2e61e27-a091-433e-ae24-850e63e4fc63
16/06/10 21:53:13 INFO spark.HttpServer: Starting HTTP Server
16/06/10 21:53:13 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/06/10 21:53:13 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:44811
16/06/10 21:53:13 INFO util.Utils: Successfully started service 'HTTP file server' on port 44811.
```
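The two WARN lines above come from `name01` resolving to `127.0.0.1` (typically via an `/etc/hosts` entry); Spark falls back to the first routable address it finds and suggests `SPARK_LOCAL_IP`. A hedged config sketch, not part of the original session: either fix the hosts mapping, or pin the bind address in `conf/spark-env.sh` (the IP below is this machine's; substitute your own):

```shell
# conf/spark-env.sh (fragment) - hypothetical fix for the loopback warning:
# pin the address Spark binds to instead of letting it guess past 127.0.0.1.
export SPARK_LOCAL_IP=192.168.0.105
```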
```
16/06/10 21:53:13 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/06/10 21:53:13 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/06/10 21:53:14 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/06/10 21:53:14 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/06/10 21:53:14 INFO ui.SparkUI: Started SparkUI at http://192.168.0.105:4040
16/06/10 21:53:14 INFO executor.Executor: Starting executor ID driver on host localhost
16/06/10 21:53:14 INFO executor.Executor: Using REPL class URI: http://192.168.0.105:47704
16/06/10 21:53:16 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54439.
16/06/10 21:53:16 INFO netty.NettyBlockTransferService: Server created on 54439
16/06/10 21:53:16 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/06/10 21:53:16 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:54439 with 267.3 MB RAM, BlockManagerId(driver, localhost, 54439)
16/06/10 21:53:16 INFO storage.BlockManagerMaster: Registered BlockManager
16/06/10 21:53:17 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
```
```
16/06/10 21:53:23 INFO hive.HiveContext: Initializing execution hive, version 0.13.1
16/06/10 21:53:28 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/06/10 21:53:29 INFO metastore.ObjectStore: ObjectStore, initialize called
16/06/10 21:53:30 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/06/10 21:53:30 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/06/10 21:53:31 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/06/10 21:53:39 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/06/10 21:53:42 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/06/10 21:53:42 INFO metastore.MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5.  Encountered: "@" (64), after : "".
16/06/10 21:53:44 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/06/10 21:53:44 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/06/10 21:53:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/06/10 21:53:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
```
```
16/06/10 21:53:47 INFO metastore.ObjectStore: Initialized ObjectStore
16/06/10 21:53:48 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
16/06/10 21:53:50 INFO metastore.HiveMetaStore: Added admin role in metastore
16/06/10 21:53:50 INFO metastore.HiveMetaStore: Added public role in metastore
16/06/10 21:53:50 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
16/06/10 21:53:51 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
16/06/10 21:53:51 INFO repl.SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

scala> 5+8
res0: Int = 13

scala> 2+9
res1: Int = 11

scala>
```
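With `sc` and `sqlContext` available, the REPL can do more than integer arithmetic. A small hedged follow-up to try in the same session, using only the basic RDD API that ships with 1.4.0 (shown as comments for reference; it needs the running spark-shell above, so it is not executable on its own):

```shell
# Typed at the scala> prompt of the session above:
#   scala> val rdd = sc.parallelize(1 to 100)   // distribute 1..100 as an RDD
#   scala> rdd.sum()                            // Double = 5050.0
#   scala> sc.version                           // String = 1.4.0
# The job also appears in the SparkUI started above at port 4040.
```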