venuktan - 4 months ago
Scala Question

Read files recursively from subdirectories with Spark, from S3 or a local filesystem

I am trying to read files from a directory that contains many subdirectories. The data is in S3, and I am trying to do this:

val rdd = sc.newAPIHadoopFile(data_loc,
  classOf[org.apache.hadoop.mapreduce.lib.input.TextInputFormat],
  classOf[org.apache.hadoop.io.LongWritable],
  classOf[org.apache.hadoop.io.Text])

This does not seem to work.

I appreciate the help.


Yes, it works, though it took a while to figure out how to get at the individual blocks/splits. Basically, use a glob pattern that matches a specific directory in every subdirectory: s3n://bucket/root_dir/*/data/*/*/*
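For reference, the glob approach above can be sketched as below. The bucket and directory names are placeholders standing in for the real layout; and as an alternative when the nesting depth varies, Hadoop's `mapreduce.input.fileinputformat.input.dir.recursive` flag makes FileInputFormat descend into subdirectories on its own.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RecursiveRead {
  // Glob matching the data directories at a fixed depth;
  // bucket and path names are placeholders, not from the question.
  val glob = "s3n://bucket/root_dir/*/data/*/*/*"

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("recursive-read"))

    // Option 1: let the glob expand to every matching leaf directory.
    val rdd = sc.textFile(glob)

    // Option 2: when the depth is not fixed, tell FileInputFormat to
    // recurse into subdirectories and point it at the root instead.
    sc.hadoopConfiguration.set(
      "mapreduce.input.fileinputformat.input.dir.recursive", "true")
    val rddAll = sc.textFile("s3n://bucket/root_dir")

    println(s"glob: ${rdd.count()}, recursive: ${rddAll.count()}")
    sc.stop()
  }
}
```

The same glob works with `newAPIHadoopFile`, since both go through Hadoop's path resolution; `textFile` is just the shorter spelling for line-oriented text input.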