Hadoop error: java.io.IOException: Incompatible clusterIDs after reformatting the namenode
2018-02-21 23:23
Error:

java.io.IOException: Incompatible clusterIDs in /data/dfs/data: namenode clusterID = CID-d1448b9e-da0f-499e-b1d4-78cb18ecdebb; datanode clusterID = CID-ff0faa40-2940-4838-b321-98272eb0dee3

Cause: Every namenode format generates a new clusterID. Formatting clears the namenode's metadata directory but does not touch the datanode's data directory, which still holds the clusterID from the previous format. On startup the datanode compares the two IDs, finds a mismatch, and fails. The rule is: before every format, clear out the datanode data directories.

Fix:

Method 1: Stop the cluster, delete everything under the problem node's data directory (the dfs.data.dir directory configured in hdfs-site.xml), then reformat the namenode.

Method 2: Stop the cluster, then edit the clusterID in the datanode's /dfs/data/current/VERSION file so that it matches the namenode's clusterID.

Source: https://yq.aliyun.com/articles/61310
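The two fixes above can be sketched as shell commands. This is a hedged sketch, not a definitive procedure: the paths /data/dfs/name and /dfs/data are assumptions taken from the error message and the post; substitute the dfs.name.dir / dfs.data.dir values from your own hdfs-site.xml, and note that Method 1 destroys the data stored on that datanode.

```shell
# Assumed paths -- replace with the dfs.name.dir / dfs.data.dir
# values from your own hdfs-site.xml.

# Method 1: wipe the datanode data dir and reformat (DESTROYS HDFS data)
stop-dfs.sh
rm -rf /dfs/data/*              # dfs.data.dir on the problem datanode
hdfs namenode -format           # generates a fresh clusterID
start-dfs.sh

# Method 2: copy the namenode's clusterID into the datanode's VERSION file
stop-dfs.sh
NN_CID=$(grep '^clusterID=' /data/dfs/name/current/VERSION | cut -d= -f2)
sed -i "s/^clusterID=.*/clusterID=${NN_CID}/" /dfs/data/current/VERSION
start-dfs.sh
```

Method 2 is the less destructive option when the datanode still holds blocks you want to keep, since it only rewrites one line of metadata instead of deleting block data.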