venuktan - 1 year ago
Scala Question

Read files recursively from subdirectories with Spark, from S3 or the local filesystem

I am trying to read files from a directory that contains many subdirectories. The data is in S3, and I am trying to do this:

val rdd = sc.newAPIHadoopFile(data_loc,

This does not seem to work.
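For context, the `newAPIHadoopFile` call in the question is truncated: it also needs the input-format class and the key/value classes. A minimal sketch of a complete call, assuming plain-text input (the bucket path here is illustrative, not from the question):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

object ReadNested {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("read-nested").setMaster("local[*]"))

    // Illustrative path: a glob with one * per directory level lets the
    // input format reach files inside nested subdirectories.
    val data_loc = "s3n://bucket/root_dir/*/data/*"

    val rdd = sc.newAPIHadoopFile(
      data_loc,
      classOf[TextInputFormat],  // input format: one record per line
      classOf[LongWritable],     // key: byte offset of the line in its file
      classOf[Text])             // value: the line contents

    rdd.map { case (_, line) => line.toString }.take(5).foreach(println)
    sc.stop()
  }
}
```

Running this requires Spark (and, for `s3n://` paths, the Hadoop S3 connector) on the classpath.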

Appreciate the help

Answer Source

Yes, it works; it took a while to get the individual blocks/splits right, though. Basically, you point at a specific directory inside every subdirectory by using a wildcard for each level: s3n://bucket/root_dir/*/data/*/*/*
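The same glob works with the simpler `sc.textFile` as well, since both go through Hadoop's path globbing. A sketch assuming the bucket layout from the answer:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object GlobRead {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("glob-read").setMaster("local[*]"))

    // Each * matches exactly one directory level, so this reads files three
    // levels below every <sub_dir>/data directory. The bucket name and
    // layout are illustrative.
    val rdd = sc.textFile("s3n://bucket/root_dir/*/data/*/*/*")

    println(s"line count: ${rdd.count()}")
    sc.stop()
  }
}
```

Note that a bare `*` glob matches one level only; it does not recurse arbitrarily deep, which is why the pattern spells out each level explicitly.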
