Hadoop Source Code Analysis (4) - the GenericOptionsParser class in the org.apache.hadoop.util package [original]
2013-02-21 17:03
1. Preparation
Hadoop version: 1.0.3. Package containing GenericOptionsParser: org.apache.hadoop.util. Approach: look at where GenericOptionsParser is used, start from its constructors, and work outward to understand how the whole class is used.
Date: 2013-02-21
2. What GenericOptionsParser does
GenericOptionsParser is the basic class in the Hadoop framework for parsing command-line arguments. It recognizes a set of standard generic options, which lets an application easily specify the namenode, the jobtracker, and additional configuration resources. The core of the class is the processGeneralOptions method:
/**
 * Modify configuration according to user-specified generic options
 * @param conf Configuration to be modified
 * @param line User-specified generic options
 */
private void processGeneralOptions(Configuration conf, CommandLine line)
    throws IOException {
  if (line.hasOption("fs")) {
    // Point the default filesystem at the given namenode
    FileSystem.setDefaultUri(conf, line.getOptionValue("fs"));
  }
  if (line.hasOption("jt")) {
    conf.set("mapred.job.tracker", line.getOptionValue("jt"));
  }
  if (line.hasOption("conf")) {
    String[] values = line.getOptionValues("conf");
    for (String value : values) {
      // Add an extra configuration resource; its values override the
      // earlier ones unless an earlier property is marked final
      conf.addResource(new Path(value));
    }
  }
  if (line.hasOption("libjars")) {
    conf.set("tmpjars", validateFiles(line.getOptionValue("libjars"), conf));
    // setting libjars in client classpath
    URL[] libjars = getLibJars(conf);
    if (libjars != null && libjars.length > 0) {
      conf.setClassLoader(new URLClassLoader(libjars, conf.getClassLoader()));
      Thread.currentThread().setContextClassLoader(
          new URLClassLoader(libjars,
              Thread.currentThread().getContextClassLoader()));
    }
  }
  if (line.hasOption("files")) {
    conf.set("tmpfiles", validateFiles(line.getOptionValue("files"), conf));
  }
  if (line.hasOption("archives")) {
    conf.set("tmparchives", validateFiles(line.getOptionValue("archives"), conf));
  }
  if (line.hasOption('D')) {
    String[] property = line.getOptionValues('D');
    for (String prop : property) {
      String[] keyval = prop.split("=", 2);
      if (keyval.length == 2) {
        conf.set(keyval[0], keyval[1]);
      }
    }
  }
  conf.setBoolean("mapred.used.genericoptionsparser", true);

  // tokensFile
  if (line.hasOption("tokenCacheFile")) {
    String fileName = line.getOptionValue("tokenCacheFile");
    // check if the local file exists
    try {
      FileSystem localFs = FileSystem.getLocal(conf);
      Path p = new Path(fileName);
      if (!localFs.exists(p)) {
        throw new FileNotFoundException("File " + fileName + " does not exist.");
      }
      LOG.debug("setting conf tokensFile: " + fileName);
      conf.set("mapreduce.job.credentials.json",
          localFs.makeQualified(p).toString());
    } catch (IOException e) {
      throw new RuntimeException(e);
    }
  }
}
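One detail worth calling out in the -D branch above: each property is split on the first '=' only (the limit argument 2 to String.split), so values that themselves contain '=' are preserved intact. A standalone sketch of just that logic (the class and method names here are mine, not Hadoop's):

```java
import java.util.HashMap;
import java.util.Map;

public class DOptionSketch {
    // Mirror the -D handling in processGeneralOptions: split on the FIRST '='
    // only, so a value that contains '=' survives unmodified.
    public static Map<String, String> parseDOptions(String[] props) {
        Map<String, String> conf = new HashMap<>();
        for (String prop : props) {
            String[] keyval = prop.split("=", 2); // limit 2: split once at most
            if (keyval.length == 2) {
                conf.put(keyval[0], keyval[1]);
            }
        }
        return conf;
    }

    public static void main(String[] args) {
        Map<String, String> conf = parseDOptions(new String[]{
            "mapred.reduce.tasks=4",
            "my.prop=a=b" // embedded '=' stays in the value
        });
        System.out.println(conf.get("my.prop")); // prints "a=b"
    }
}
```

A token with no '=' at all splits into a single element and is silently ignored, which matches the length check in the real method.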
5. Classes and interfaces related to GenericOptionsParser
The classes this class works with are Options, Option, and CommandLine (all from Apache Commons CLI), plus FileSystem.
6. Conclusion
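To tie the pieces together: a driver normally builds the parser from the raw command-line args, lets it consume the generic options covered above, and then reads back the application-specific arguments via getRemainingArgs(). Below is a simplified, self-contained sketch of that filtering step (my own toy class, not Hadoop's implementation; it assumes every generic option takes exactly one value):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RemainingArgsSketch {
    // The generic options GenericOptionsParser recognizes, each of which
    // consumes the token that follows it (e.g. "-fs hdfs://nn:9000").
    private static final List<String> VALUE_OPTS = Arrays.asList(
        "-fs", "-jt", "-conf", "-D", "-libjars", "-files",
        "-archives", "-tokenCacheFile");

    // Return the arguments left over once the generic options and their
    // values are consumed, mimicking what getRemainingArgs hands back.
    public static String[] remainingArgs(String[] args) {
        List<String> rest = new ArrayList<>();
        for (int i = 0; i < args.length; i++) {
            if (VALUE_OPTS.contains(args[i])) {
                i++; // skip the option's value as well
            } else {
                rest.add(args[i]);
            }
        }
        return rest.toArray(new String[0]);
    }

    public static void main(String[] args) {
        String[] in = {"-fs", "hdfs://nn:9000", "-jt", "jt:9001",
                       "input", "output"};
        System.out.println(String.join(" ", remainingArgs(in))); // prints "input output"
    }
}
```

In a real job this is why a driver can be invoked as `hadoop jar job.jar MyJob -D some.prop=x input output` and still see only `input` and `output` as its own arguments.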
Original source: http://www.cnblogs.com/caoyuanzhanlang
The analysis above represents my personal views; corrections and discussion are welcome.
Please respect the author's work and credit the source when reposting. Many thanks.