After installing Hadoop with Homebrew, I ran into the following problem. I was following the guide here:
http://glebche.appspot.com/static/hadoop-ecosystem/hadoop-hive-tutorial.html
I set the following environment variables in my .bashrc:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_55.jdk/Contents/Home
export HADOOP_INSTALL=/usr/local/Cellar/hadoop/2.3.0
export HADOOP_HOME=$HADOOP_INSTALL
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
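One detail worth noting: in the Homebrew hadoop package, the actual Hadoop distribution (its bin, etc, and share directories) lives under the libexec subdirectory of the Cellar path, so pointing HADOOP_INSTALL at the Cellar root may leave the HDFS jars off the classpath. A sketch of the same .bashrc block under that assumption (the JDK path is the one from my machine, yours may differ):

```shell
# Sketch only, assuming Homebrew's layout where the distribution sits under libexec.
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_55.jdk/Contents/Home
export HADOOP_INSTALL=/usr/local/Cellar/hadoop/2.3.0/libexec
export HADOOP_HOME=$HADOOP_INSTALL
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
```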
After running hadoop namenode -format, I tried to run start-dfs.sh and got the following:
14/05/05 21:19:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: set hadoop variables
localhost: starting namenode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/mynotebook.local.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
localhost: set hadoop variables
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/mynotebook.local.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.datanode.DataNode
Starting secondary namenodes [0.0.0.0]
0.0.0.0: set hadoop variables
0.0.0.0: secondarynamenode running as process 12747. Stop it first.
14/05/05 21:19:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
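The "Could not find or load main class" errors suggest the HDFS jars are not on the classpath that the start scripts compute, and the "Stop it first" line means a secondary namenode from an earlier attempt is still running. These are commands I could run to narrow it down (a sketch, assuming the install above):

```shell
# Stop any daemons left over from earlier attempts (e.g. the secondarynamenode at PID 12747).
stop-dfs.sh
# Inspect the computed classpath; the hadoop-hdfs jars should appear in this list.
# If grep prints nothing, HADOOP_HOME is pointing at the wrong directory.
hadoop classpath | tr ':' '\n' | grep hdfs
```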
How can I resolve this problem?