
Hadoop commands and directories

2013-10-18 14:03
1. Command overview:

(1) Usage message printed by the hadoop command:

Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
namenode -format     format the DFS filesystem
secondarynamenode    run the DFS secondary namenode
namenode             run the DFS namenode
datanode             run a DFS datanode
dfsadmin             run a DFS admin client
mradmin              run a Map-Reduce admin client
fsck                 run a DFS filesystem checking utility
fs                   run a generic filesystem user client
balancer             run a cluster balancing utility
fetchdt              fetch a delegation token from the NameNode
jobtracker           run the MapReduce job Tracker node
pipes                run a Pipes job
tasktracker          run a MapReduce task Tracker node
historyserver        run job history servers as a standalone daemon
job                  manipulate MapReduce jobs
queue                get information regarding JobQueues
version              print the version
jar <jar>            run a jar file
distcp <srcurl> <desturl> copy file or directories recursively
archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
classpath            prints the class path needed to get the
                     Hadoop jar and the required libraries
daemonlog            get/set the log level for each daemon
or
CLASSNAME            run the class named CLASSNAME
Most commands print help when invoked w/o parameters.
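
For reference, here is how a few of the subcommands listed above are typically invoked. This is only a sketch; the paths and file names are examples, not part of the original post.

# print the Hadoop build information
hadoop version
# browse and load data with the generic filesystem client
hadoop fs -ls /
hadoop fs -put localfile.txt /user/hadoop/
# report cluster and DataNode status (HDFS admin client)
hadoop dfsadmin -report
# check filesystem health
hadoop fsck / -files -blocks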


2. Compiling and running:
(1) Compiling
*javac -classpath ${HADOOP_HOME}/hadoop-core-${HADOOP_VERSION}.jar wordcount/*.java
*hadoop-core-${HADOOP_VERSION}.jar sits in the Hadoop installation directory; it is the jar built from the Hadoop source code.
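
A fuller compile-and-package sketch, assuming the job sources live under wordcount/ and that HADOOP_HOME and HADOOP_VERSION point at a Hadoop 1.x installation (the wordcount_classes directory and the jar name are illustrative):

# compile the job sources against the Hadoop core jar
mkdir -p wordcount_classes
javac -classpath ${HADOOP_HOME}/hadoop-core-${HADOOP_VERSION}.jar -d wordcount_classes wordcount/*.java
# package the compiled classes into a runnable jar
jar -cvf wordcount.jar -C wordcount_classes/ .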

(2) Running
*hadoop jar <jar> [mainClass] args...
*Example: hadoop jar wordcount.jar <package>.WordCount input output (the second argument is the fully qualified main class; input and output are HDFS paths).
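
Putting it together, a typical run looks roughly like this. The HDFS paths, input files, and the main class name are placeholders, and the output directory must not exist before the job starts:

# stage input text files in HDFS
hadoop fs -mkdir /user/hadoop/input
hadoop fs -put file01.txt file02.txt /user/hadoop/input
# submit the job
hadoop jar wordcount.jar <package>.WordCount /user/hadoop/input /user/hadoop/output
# inspect the reducer output
hadoop fs -cat /user/hadoop/output/part-00000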

3. Hadoop directory layout (commonly used):
(1) src: the Hadoop source code.
*core: the common base code.
*hdfs: the HDFS implementation.
*mapred: the MapReduce implementation.
*tools: Hadoop tools, such as log analysis utilities.
*c++: C++ code, such as Pipes.
(2) conf: the configuration files.
(3) lib: third-party libraries required at runtime.
(4) bin: the Hadoop management scripts.
(5) hadoop-core-xxx.jar: the jar built from the Hadoop source code.