blue-sky - 1 year ago
Scala Question

reduceByKey method not being found in Scala Spark

I'm attempting to run this code built from the Spark source.

This line

val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)
is throwing the error:

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
val wordCounts = logData.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)

logData.flatMap(line => line.split(" ")).map(word => (word, 1))
returns a MappedRDD, but I cannot find that type in the API.

I'm running this code from the Spark source, so it could be a classpath problem, but the required dependencies are on my classpath.

Answer Source

You should import the implicit conversions from SparkContext:

import org.apache.spark.SparkContext._

They use the 'pimp my library' pattern to add methods to RDDs of specific types. If curious, see SparkContext:1296. (Note: in Spark 1.3 and later these implicit conversions were moved into the RDD companion object and are resolved automatically, so the explicit import is only needed on older versions.)
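As a minimal sketch of how this pattern works in plain Scala (the names `PairOps` and the local `reduceByKey` here are illustrative, not Spark's actual implementation): an implicit conversion enriches only collections of 2-tuples with an extra method, just as Spark's implicits give `reduceByKey` only to `RDD[(K, V)]` and not to arbitrary RDDs. If the implicit is not in scope, the compiler reports the same kind of "value reduceByKey is not a member of ..." error.

```scala
object EnrichExample {
  // Implicit class: only Seq[(K, V)] gains reduceByKey, mirroring how
  // Spark's implicit conversions add it only to RDDs of key/value pairs.
  implicit class PairOps[K, V](val pairs: Seq[(K, V)]) extends AnyVal {
    def reduceByKey(f: (V, V) => V): Map[K, V] =
      pairs.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
  }

  // Local word count analogous to the Spark snippet in the question.
  def wordCounts(text: String): Map[String, Int] =
    text.split(" ").toSeq.map(word => (word, 1)).reduceByKey(_ + _)
}
```

Without `PairOps` in scope, `Seq[(String, Int)]` has no `reduceByKey`; importing the implicit makes the call compile, which is exactly what `import org.apache.spark.SparkContext._` does for RDDs.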
