
Spark & Scala - Cannot Filter null Values from RDD

I tried to filter null values from an RDD but it failed. Here's my code:

val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
  classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
  classOf[org.apache.hadoop.hbase.client.Result])

val raw_hbaseRDD = hBaseRDD.map { kv => kv._2 }

val Ratings = raw_hbaseRDD.map { result =>
  val x = Bytes.toString(result.getValue(Bytes.toBytes("data"), Bytes.toBytes("user")))
  val y = Bytes.toString(result.getValue(Bytes.toBytes("data"), Bytes.toBytes("item")))
  val z = Bytes.toString(result.getValue(Bytes.toBytes("data"), Bytes.toBytes("rating")))

  (x, y, z)
}
Ratings.filter ( x => x._1 != null )

Ratings.foreach(println)


When debugging, null values still appeared after the filter:

(3359,1494,4)
(null,null,null)
(28574,1542,5)
(null,null,null)
(12062,1219,5)
(14068,1459,3)


Any better ideas?

Answer
Ratings.filter ( x => x._1 != null ) 

filter is a transformation: it returns a new RDD and leaves Ratings itself unchanged, and here the returned RDD is never used, so the subsequent foreach still prints the unfiltered data. You can try

Ratings.filter(_._1 != null).foreach(println)
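
If you need the filtered data more than once, assign the result of filter to a new val and work with that from then on. A minimal sketch, assuming the Ratings RDD from the question (the name filteredRatings is just illustrative, and the check is widened to all three fields since the null rows in your output are null in every column):

val filteredRatings = Ratings.filter { case (user, item, rating) =>
  // drop rows where any of the HBase columns was missing
  user != null && item != null && rating != null
}

filteredRatings.foreach(println)

Note that foreach(println) prints on the executors when running on a cluster; for local debugging you may prefer filteredRatings.collect().foreach(println).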