Java Question

Task not Serializable - Spark Java

I'm getting the Task not serializable error in Spark. I've searched and tried to use a static function as suggested in some posts, but it still gives the same error.

Code is as below:

public class Rating implements Serializable {
    private SparkSession spark;
    private SparkConf sparkConf;
    private JavaSparkContext jsc;
    private static Function<String, Rating> mapFunc;

    public Rating() {
        mapFunc = new Function<String, Rating>() {
            public Rating call(String str) {
                return Rating.parseRating(str);
            }
        };
    }

    public void runProcedure() {
        sparkConf = new SparkConf().setAppName("Filter Example").setMaster("local");
        jsc = new JavaSparkContext(sparkConf);
        SparkSession spark = SparkSession.builder().master("local").appName("Word Count")
                .config("spark.some.config.option", "some-value").getOrCreate();

        JavaRDD<Rating> ratingsRDD = spark.read().textFile("sample_movielens_ratings.txt")
                .javaRDD()
                .map(mapFunc);
    }

    public static void main(String[] args) {
        Rating newRating = new Rating();
        newRating.runProcedure();
    }
}


The error (posted as a screenshot in the original question, not reproduced here) is the Task not serializable exception.

How do I solve this error?
Thanks in advance.

Answer

Clearly, Rating cannot actually be serialized, even though it is declared Serializable, because it holds references to Spark structures (the SparkSession, SparkConf and JavaSparkContext fields).
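
You can see the same failure in plain Java, with no Spark involved (a stand-alone sketch; the Thread field is just a stand-in for a non-serializable member such as JavaSparkContext):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Declaring "implements Serializable" is not enough: serialization still fails
// if one of the instance's fields cannot be serialized. The Thread field below
// plays the role of the JavaSparkContext field in Rating.
public class NotReallySerializable implements Serializable {
    private final Thread worker = new Thread();   // not serializable

    public static void main(String[] args) throws IOException {
        ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream());
        out.writeObject(new NotReallySerializable());
        // throws java.io.NotSerializableException: java.lang.Thread
    }
}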

The problem here is in the statement

JavaRDD<Rating> ratingsRDD = spark.read().textFile("sample_movielens_ratings.txt")
            .javaRDD()
            .map(mapFunc);

If you look at the definition of mapFunc, you're returning a Rating object.

mapFunc = new Function<String, Rating>() {
    public Rating call(String str) {
        return Rating.parseRating(str);
    }
};

This function is used inside a map (a transformation, in Spark terminology). Because transformations are executed on the worker nodes rather than on the driver, their code must be serialized and shipped to the workers. This forces Spark to try to serialize the Rating class.

But it is not possible.

Try to extract the fields you need from Rating and place them in a separate class that does not own any Spark structures, then use that new class as the return type of mapFunc.
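
For example, something along these lines (a minimal sketch rather than the exact code: the RatingData name, its fields, and the "::"-separated MovieLens line format are assumptions made for illustration):

import java.io.Serializable;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.SparkSession;

// Plain data holder: serializable, and it owns no Spark structures.
class RatingData implements Serializable {
    private final int userId;
    private final int movieId;
    private final double rating;

    RatingData(int userId, int movieId, double rating) {
        this.userId = userId;
        this.movieId = movieId;
        this.rating = rating;
    }

    // Assumed line format: "userId::movieId::rating::timestamp"
    static RatingData parse(String line) {
        String[] parts = line.split("::");
        return new RatingData(Integer.parseInt(parts[0]),
                              Integer.parseInt(parts[1]),
                              Double.parseDouble(parts[2]));
    }
}

// The driver class keeps the Spark structures; it is never shipped to the
// workers, because the method reference below only needs the static parse method.
public class RatingJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .master("local")
                .appName("Word Count")
                .getOrCreate();

        JavaRDD<RatingData> ratingsRDD = spark.read()
                .textFile("sample_movielens_ratings.txt")
                .javaRDD()
                .map(RatingData::parse);   // or: str -> RatingData.parse(str)

        System.out.println(ratingsRDD.count());
        spark.stop();
    }
}

The method reference captures nothing from the driver class, and RatingData carries only plain fields, so the map closure serializes without problems.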

Hope it helps.
