Hadoop cannot find namenode pid file when shutdown
2014-01-06 16:24
By default, Hadoop keeps the pid files that track live NameNode and JobTracker processes under /tmp/. But /tmp is cleaned up periodically (typically once a week), so after a while the stop scripts can no longer find the pid files and the shutdown fails. When that happens you have to shut the cluster down manually.
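The failure mode is easy to reproduce. A minimal sketch of the check the stop script performs (using a throwaway path in place of the real /tmp/hadoop-&lt;user&gt;-namenode.pid, so this is illustrative, not a real cluster operation):

```shell
# Demo path standing in for the default /tmp/hadoop-<user>-namenode.pid;
# mktemp -d guarantees the pid file does not exist, mimicking a purged /tmp.
pid_file="$(mktemp -d)/hadoop-$(whoami)-namenode.pid"
if [ -f "$pid_file" ]; then
  kill "$(cat "$pid_file")"      # normal case: kill the recorded pid
  msg="namenode stopped"
else
  msg="no namenode to stop"      # the failure mode once /tmp has been cleaned
fi
echo "$msg"
```

Once the pid file is gone, the script has no idea which process to kill, even though the daemon itself is still running.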
The fix: shut the cluster down manually this once, then stop relying on the default location for the pid files.
1. Shut down the Hadoop cluster manually.
Log in to every node and run:
jps | grep 'TaskTracker\|DataNode' | sed 's/ .*//g' | sed 's/^/kill -9 /g' | sh
This one-liner is ugly but works. You can try a more graceful way, like:
jps | grep 'TaskTracker\|DataNode' | awk '{print $1}' | xargs kill -9 # I haven't tried this.
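To see what the pipeline actually extracts without touching a live cluster, here is a dry run against hypothetical jps output (the pids and the process list are made up for illustration). Note that the awk action has to be '{print $1}' to print the first column:

```shell
# Hypothetical jps output: "<pid> <class name>" per line.
sample='2901 DataNode
3120 TaskTracker
3344 Jps'
# Same filter as the one-liner above, run against the sample instead of jps.
pids=$(printf '%s\n' "$sample" | grep 'TaskTracker\|DataNode' | awk '{print $1}')
echo "$pids"
```

Only the DataNode and TaskTracker pids (2901 and 3120) survive the grep; those are what xargs hands to kill.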
2. Change the Hadoop default setting.
Edit hadoop-env.sh in ${HADOOP_HOME}/conf and add something like the line below.
export HADOOP_PID_DIR=${HADOOP_HOME}/pid
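The directory should exist and be writable by the user that starts the daemons; depending on your Hadoop version, the start scripts may not create it for you (an assumption worth checking on your install). A quick sketch, using a temp dir as a stand-in for ${HADOOP_HOME} when it is not set:

```shell
# Stand-in for ${HADOOP_HOME}/pid; falls back to a temp dir for this demo.
PID_DIR="${HADOOP_HOME:-$(mktemp -d)}/pid"
mkdir -p "$PID_DIR"        # ensure the directory exists before starting daemons
chmod 755 "$PID_DIR"       # daemons run as your user; others only need read access
ls -ld "$PID_DIR"
```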
3. Restart your cluster; you should now find the pid files in that folder.
ls -l
-rw-r--r-- 1 ana grid 6 Nov 13 01:54 hadoop-ana-jobtracker.pid
-rw-r--r-- 1 ana grid 6 Nov 13 01:54 hadoop-ana-namenode.pid
-rw-r--r-- 1 ana grid 6 Nov 13 01:54 hadoop-ana-secondarynamenode.pid
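As a sanity check, the number inside each .pid file should belong to a live process; kill -0 probes a pid without actually sending a signal. An illustrative sketch (it writes the current shell's own pid into a throwaway file rather than reading a real daemon's):

```shell
# Demo: pretend a daemon wrote its pid, then verify that pid is alive.
demo_dir=$(mktemp -d)                              # stands in for ${HADOOP_PID_DIR}
pid_file="$demo_dir/hadoop-$(whoami)-namenode.pid"
echo "$$" > "$pid_file"                            # this shell plays the daemon
if kill -0 "$(cat "$pid_file")" 2>/dev/null; then
  status="alive"                                   # pid file matches a running process
else
  status="stale"                                   # leftover file from a dead process
fi
echo "$status"
```

On a real cluster, a "stale" result means the daemon died without its pid file being removed.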