I'm trying to run http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala from source.
This line:
val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)
throws the error:
value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)] val wordCounts = logData.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)
logData.flatMap(line => line.split(" ")).map(word => (word, 1))
returns a MappedRDD, but I can't find this type in http://spark.apache.org/docs/0.9.1/api/core/index.html#org.apache.spark.rdd.RDD
I'm running this code from the Spark source, so it could be a classpath problem? But the required dependencies are on my classpath.
You should import the implicit conversions from SparkContext:
import org.apache.spark.SparkContext._
They use the 'pimp my library' pattern to add methods (such as reduceByKey, via PairRDDFunctions) to RDDs of specific types. If curious, see SparkContext:1296.
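For context, a minimal standalone word-count sketch with the import in place might look like this (the app name and the "input.txt" path are placeholders, not from the question; this targets the pre-1.0 API the question uses, where the explicit import is required):

```scala
import org.apache.spark.{SparkConf, SparkContext}
// Brings the implicit RDD[(K, V)] => PairRDDFunctions[K, V] conversion
// into scope; reduceByKey is defined on PairRDDFunctions, not on RDD itself.
import org.apache.spark.SparkContext._

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local")
    val sc = new SparkContext(conf)
    // "input.txt" is a hypothetical local file path.
    val textFile = sc.textFile("input.txt")
    val wordCounts = textFile
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey((a, b) => a + b) // now resolves via the implicit conversion
    wordCounts.collect().foreach(println)
    sc.stop()
  }
}
```

In later Spark versions (1.3+), these implicits live in the RDD companion object and are in scope automatically, so the import is no longer needed.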