
Common Hadoop error logs and their solutions

2019-09-16 23:54:21

Problem 1

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Solution

To suppress the warning, edit hadoop/etc/hadoop/log4j.properties (e.g. with vim) and add:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

This raises the NativeCodeLoader logger to ERROR so the WARN line is no longer printed; the native library is still not loaded, and Hadoop simply falls back to the built-in Java classes, as the message says.

Problem 2

Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[192.168.1.125:50010,DS-c41f5f60-fb7e-4afd-814e-d4ee05623630,DISK], DatanodeInfoWithStorage[192.168.1.1

Solution

Edit hdfs-site.xml. On small clusters (three or fewer datanodes) there is often no spare node to substitute for a failed one, so the client is told to keep writing to the remaining datanodes instead of trying to replace the bad one:

<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
  <value>NEVER</value>
</property>
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
  <value>true</value>
</property>

Problem 3

Download error 1: calling the Hadoop API from Java throws java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.

Solution

Unpack the Hadoop .gz archive on Windows, set the HADOOP_HOME environment variable to the unpacked directory (here E:\hadoop-3.2.0\hadoop-3.2.0), and restart Eclipse so it picks up the new variable.
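If you would rather not change system environment variables, a common programmatic workaround is to set the hadoop.home.dir JVM property before any Hadoop class is touched. A minimal sketch, assuming the unpack path used above (the class and method names here are illustrative, not part of Hadoop):

```java
// Sketch: point hadoop.home.dir at the unpacked distribution before creating
// any Hadoop client objects, so HADOOP_HOME does not need to be set globally.
public class HadoopHomeFix {
    // Assumption: this is where you unpacked the distribution; adjust as needed.
    static final String HADOOP_DIR = "E:\\hadoop-3.2.0\\hadoop-3.2.0";

    // Set hadoop.home.dir only when neither the JVM property nor the
    // HADOOP_HOME environment variable already provides a value.
    public static String ensureHadoopHome() {
        if (System.getProperty("hadoop.home.dir") == null
                && System.getenv("HADOOP_HOME") == null) {
            System.setProperty("hadoop.home.dir", HADOOP_DIR);
        }
        String effective = System.getProperty("hadoop.home.dir");
        return effective != null ? effective : System.getenv("HADOOP_HOME");
    }

    public static void main(String[] args) {
        System.out.println("hadoop home resolved to: " + ensureHadoopHome());
        // ... construct FileSystem / run your HDFS client code after this point
    }
}
```

The call must happen before the first Hadoop class is loaded, since the check runs in static initialization.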

Problem 4

Download error 2: java.io.FileNotFoundException: Could not locate Hadoop executable: E:\hadoop-3.2.0\hadoop-3.2.0\bin\winutils.exe

Solution

Unpack a Windows build of Hadoop and copy winutils.exe into the bin directory of your HADOOP_HOME, then run again.
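This error means the client resolved HADOOP_HOME but the executable itself is missing. A minimal sanity-check sketch that builds the exact path the client looks for, assuming the example install location from this post (the class and method names are illustrative):

```java
import java.io.File;

// Sketch: the Hadoop client on Windows looks for %HADOOP_HOME%\bin\winutils.exe.
// This check builds that path and reports whether the file is really there.
public class WinutilsCheck {
    public static File winutilsPath() {
        String home = System.getProperty("hadoop.home.dir");
        if (home == null) home = System.getenv("HADOOP_HOME");
        // Assumption: fall back to the example unpack location from this post.
        if (home == null) home = "E:\\hadoop-3.2.0\\hadoop-3.2.0";
        return new File(new File(home, "bin"), "winutils.exe");
    }

    public static void main(String[] args) {
        File w = winutilsPath();
        System.out.println(w + (w.isFile()
                ? " -> found" : " -> MISSING, copy winutils.exe here"));
    }
}
```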

Error when running a MapReduce job:

Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

Solution

Run hadoop classpath on the command line to get the full classpath, then add it to yarn-site.xml:

<property>
  <name>yarn.application.classpath</name>
  <value>/usr/local/webserver/hadoop-3.2.0/etc/hadoop,/usr/local/webserver/hadoop-3.2.0/share/hadoop/common/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/common/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/mapreduce/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/mapreduce/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn/*</value>
</property>
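Hard-coding the expanded classpath works but breaks if the install path ever changes. An alternative sometimes used on Hadoop 3.x is to point the MapReduce processes at HADOOP_MAPRED_HOME in mapred-site.xml instead; a sketch, assuming the same install path as above:

```xml
<!-- mapred-site.xml: tell the AM and tasks where the MapReduce jars live -->
<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/webserver/hadoop-3.2.0</value>
</property>
<property>
  <name>mapreduce.map.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/webserver/hadoop-3.2.0</value>
</property>
<property>
  <name>mapreduce.reduce.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/webserver/hadoop-3.2.0</value>
</property>
```

Restart YARN after changing either file so the new settings take effect.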

Problem 5

org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete. The NameNode is in safe mode (read-only), so files cannot be deleted.

Solution

hadoop dfsadmin -safemode leave

(On Hadoop 3.x, hadoop dfsadmin is deprecated; the equivalent command is hdfs dfsadmin -safemode leave.)


Follow the TinyMeng blog for more great content. Stay tuned!