I have a fresh Hadoop YARN installation, and I have already run the wordcount example using the provided jar file hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples...
But when I compile the wordcount source myself and try to run it, it gives me java.io.IOException: No FileSystem for scheme: hdfs.
The exception above is raised at this line of code:
FileInputFormat.addInputPath(job, new Path(args[0]));
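For context, a minimal driver in which this call typically sits might look like the sketch below; the class name and job wiring are illustrative assumptions, not the asker's actual source. The comment marks where the lookup of the hdfs scheme happens.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    // ... mapper, reducer and output key/value setup elided ...

    // This call asks the input Path for its FileSystem. If no HDFS
    // implementation is on the classpath, the lookup fails here with
    // java.io.IOException: No FileSystem for scheme: hdfs
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}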
Edit: the command and its output are as follows:
hduser@master-virtual-machine:~$ hadoop jar Desktop/NativeWordcount.jar /tin /tout
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [rsrc:org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:rsrc:slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
13/12/03 07:14:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2421)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2428)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:466)
	at WordCount.main(WordCount.java:55)
	... 10 more
pgroce · 7
I ran into this problem today as well. You need to make sure the hadoop-hdfs jar is on your classpath.

My first stab was simply to add a dependency on the hadoop-hdfs package in Maven, but that wasn't enough. In the end I followed Cloudera's advice and added a dependency on hadoop-client. The relevant clause for your pom.xml file is:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>VERSION</version>
</dependency>
Since I was doing this in Clojure with Leiningen, I added this to my project.clj file instead:

(defproject
  ; ...
  :dependencies [[org.apache.hadoop/hadoop-client "VERSION"]
                 ; ...
                 ])
(Of course, the version you use will depend on what is installed on your system. At the moment, the only released version in the 2.x series is 2.2.0.)
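Once the dependency is in place and the jar is rebuilt, one quick sanity check is to resolve an hdfs:// URI directly. This is a minimal sketch, not part of the original answer, and the NameNode address below is an assumption; substitute your cluster's fs.defaultFS value.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsSchemeCheck {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // hdfs://localhost:9000 is a placeholder; use your NameNode address.
    // With hadoop-client on the classpath this prints
    // org.apache.hadoop.hdfs.DistributedFileSystem; without it, the call
    // throws java.io.IOException: No FileSystem for scheme: hdfs
    FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000/"), conf);
    System.out.println(fs.getClass().getName());
  }
}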