I have Hadoop 1.2.1, and I installed Hive 0.14.0 on a single node.
$ hive
Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-0.14.0.jar!/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:672)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:529)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:478)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:430)
        ... 7 more
The error says the root scratch dir /tmp/hive on HDFS should be writable, and that the current permissions are rwxrwxr-x.
I ran hadoop fs -chmod g+w /tmp/hive, but it didn't work.
Update the permissions of the /tmp/hive HDFS directory with:
hadoop fs -chmod 777 /tmp/hive
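Note why g+w alone had no effect: the reported mode rwxrwxr-x (775) already includes group write, so adding it again changes nothing; the bit that is missing is write for "other" users, which 777 grants. A quick local sketch (using an assumed throwaway directory /tmp/perm_demo, not the HDFS path itself) shows this:

```shell
# Local demonstration of the permission bits; /tmp/perm_demo is a
# hypothetical scratch directory, not the actual HDFS /tmp/hive.
mkdir -p /tmp/perm_demo
chmod 775 /tmp/perm_demo        # same mode as the error: rwxrwxr-x
chmod g+w /tmp/perm_demo        # no-op: group write is already set
stat -c '%a' /tmp/perm_demo     # still 775
chmod 777 /tmp/perm_demo        # also grant write to "other" users
stat -c '%a' /tmp/perm_demo     # now 777
rm -rf /tmp/perm_demo
```

The same logic applies on HDFS, which is why `hadoop fs -chmod 777 /tmp/hive` fixes the check while `g+w` does not.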
Alternatively, you can delete /tmp/hive both locally and on HDFS:
hadoop fs -rm -r /tmp/hive; rm -rf /tmp/hive
Only temporary files live in this location, so it is safe to delete; it will be recreated with the proper permissions whenever it is needed.