"hadoop fs -conf" does not work
2013-03-13 22:38
In <<Hadoop: The Definitive Guide>>, page 150, the following is written:
----------------------------------------------------------------------------------------------------------------------
With this setup, it is easy to use any configuration with the -conf command-line switch.
For example, the following command shows a directory listing on the HDFS server
running in pseudodistributed mode on localhost:
% hadoop fs -conf conf/hadoop-localhost.xml -ls .
Found 2 items
drwxr-xr-x   - tom supergroup          0 2009-04-08 10:32 /user/tom/input
drwxr-xr-x   - tom supergroup          0 2009-04-08 13:09 /user/tom/output
-----------------------------------------------------------------------------------------------------------------------
But when I tested the command "hadoop fs -conf conf/hadoop-localhost.xml -ls ." myself, it did not work. I got the following result:
---------------------------------------------
13/03/13 22:28:41 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
----------------------------------------------
What's wrong?
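The "Retrying connect to server: localhost/127.0.0.1:8020" message means the client resolved the filesystem URI to localhost:8020 but nothing answered there, so the most likely causes are that the HDFS daemons are not running, or that the NameNode is listening on a different port than the one the configuration file implies. A minimal conf/hadoop-localhost.xml for a Hadoop 1.x pseudo-distributed setup would look roughly like the sketch below; the exact values are my assumption (modeled on the book's appendix), not taken from this post:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Assumed pseudo-distributed values; adjust to match your running cluster. -->
  <property>
    <name>fs.default.name</name>
    <!-- No explicit port: HDFS then defaults to 8020, which is what
         the retry message above is trying to reach. -->
    <value>hdfs://localhost/</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>
```

If your NameNode was started with a different port (for example, 9000 appears in many core-site.xml templates), the value here must match it, or you will see exactly the connection retries shown above. It is also worth confirming the daemons are actually up (jps should list NameNode and DataNode) before blaming the -conf switch.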
I executed "hadoop fs" and got the usage output:
---------------------------------------------
desktop:~$ hadoop fs
Usage: java FsShell
[-ls <path>]
[-lsr <path>]
[-du <path>]
[-dus <path>]
[-count[-q] <path>]
[-mv <src> <dst>]
[-cp <src> <dst>]
[-rm [-skipTrash] <path>]
[-rmr [-skipTrash] <path>]
[-expunge]
[-put <localsrc> ... <dst>]
[-copyFromLocal <localsrc> ... <dst>]
[-moveFromLocal <localsrc> ... <dst>]
[-get [-ignoreCrc] [-crc] <src> <localdst>]
[-getmerge <src> <localdst> [addnl]]
[-cat <src>]
[-text <src>]
[-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
[-moveToLocal [-crc] <src> <localdst>]
[-mkdir <path>]
[-setrep [-R] [-w] <rep> <path/file>]
[-touchz <path>]
[-test -[ezd] <path>]
[-stat [format] <path>]
[-tail [-f] <file>]
[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
[-chown [-R] [OWNER][:[GROUP]] PATH...]
[-chgrp [-R] GROUP PATH...]
[-help [cmd]]
Generic options supported are
-conf <configuration file> specify an application configuration file
-D <property=value> use value for given property
-fs <local|namenode:port> specify a namenode
-jt <local|jobtracker:port> specify a job tracker
-files <comma separated list of files> specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars> specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives> specify comma separated archives to be unarchived on the compute machines.
The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]
----------------------------------------------
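Given the generic options in that usage output, a way to narrow the problem down is to bypass the configuration file and point at the namenode directly with -fs or -D (both are listed above). The port below is an assumption; use whichever one your NameNode actually listens on:

```
% hadoop fs -fs hdfs://localhost:8020 -ls .
% hadoop fs -D fs.default.name=hdfs://localhost:8020 -ls .
```

If these fail with the same retry message, the problem is the cluster (daemons not running, or a different port), not the -conf switch or the XML file.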