J. Won. - 10 months ago
R Question

Recursively ftp download, then extract gz files

I have a multiple-step file download process I would like to do within R. I have got the middle step, but not the first and third...

# STEP 1 Recursively find all the files at an ftp site
# ftp://prism.oregonstate.edu//pub/prism/pacisl/grids
all_paths <- #### a recursive listing of the ftp path contents??? ####

# STEP 2 Choose all the ones whose filename starts with "hi"
all_files <- sapply(sapply(strsplit(all_paths, "/"), rev), "[", 1)
hawaii_log <- substr(all_files, 1, 2) == "hi"
hi_paths <- all_paths[hawaii_log]
hi_files <- all_files[hawaii_log]

# STEP 3 Download & extract from gz format into a single directory
mapply(download.file, url = hi_paths, destfile = hi_files)
## and now how to extract from gz format?
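For the extraction step asked about here, base R can decompress a .gz file through a `gzfile()` connection, with no extra packages. A minimal sketch (`gunzip_to` is a hypothetical helper name, and the output-name rule is an assumption):

```r
## decompress one .gz file to disk via a gzfile() connection
gunzip_to <- function(gzfile_in, file_out) {
  con_in  <- gzfile(gzfile_in, "rb")   # reads decompressed bytes
  con_out <- file(file_out, "wb")
  repeat {
    chunk <- readBin(con_in, what = raw(), n = 1e6)
    if (length(chunk) == 0) break
    writeBin(chunk, con_out)
  }
  close(con_in)
  close(con_out)
}

## e.g. strip the ".gz" extension for the output names:
## mapply(gunzip_to, hi_files, sub("\\.gz$", "", hi_files))
```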

Answer Source

I can read the contents of the ftp page if I start R with the internet2 option. I.e.

C:\Program Files\R\R-2.12\bin\x64\Rgui.exe --internet2

(The Windows shortcut that starts R can be modified to add the internet2 argument - right-click /Properties /Target - or you can just run the command above at the command line. The equivalent is obvious on GNU/Linux.)

The text on that page can be read like this:

 download.file("ftp://prism.oregonstate.edu//pub/prism/pacisl/grids", "f.txt")
 txt <- readLines("f.txt")

It's a little more work to parse out the Directory listings, then read them recursively for the underlying files.

## (something like)
dirlines <- txt[grep("Directory <A HREF=", txt)]

## split and extract text after "grids/"
split1 <- sapply(strsplit(dirlines, "grids/"), function(x) rev(x)[1])

## split and extract remaining text after "/"
sapply(strsplit(split1, "/"), function(x) x[1])
[1] "dem"    "ppt"    "tdmean" "tmax"   "tmin"  
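Reading those directories recursively could then look something like the sketch below. This is untested against the live server - the file-link regex and the exact URL layout are guesses that depend on the HTML the FTP listing actually returns:

```r
## assumes "split1" from above; directory names as extracted there
base_url <- "ftp://prism.oregonstate.edu//pub/prism/pacisl/grids"
dirs <- sapply(strsplit(split1, "/"), function(x) x[1])

all_paths <- unlist(lapply(dirs, function(d) {
  f <- tempfile()
  download.file(paste(base_url, d, sep = "/"), f)
  sub_txt <- readLines(f)
  ## pull out anything that looks like a .gz file name (regex is a guess)
  hrefs <- regmatches(sub_txt, regexpr("[^\"/]+\\.gz", sub_txt))
  paste(base_url, d, hrefs, sep = "/")
}))
```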

It's about here that this stops seeming very attractive and gets a bit laborious, so I would actually recommend a different option. There is no doubt a better solution, perhaps with RCurl, and I would recommend learning to use an FTP client, both for you and your users. Command-line ftp, anonymous logins, and mget all work pretty easily.
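With RCurl the listing step gets much cleaner, because `getURL()` with the `dirlistonly` option asks the server for bare file names instead of an HTML page. A sketch, untested against the live server (the directory layout is assumed from the question):

```r
library(RCurl)

base_url <- "ftp://prism.oregonstate.edu//pub/prism/pacisl/grids/"

## one name per line; ftp.use.epsv = FALSE avoids EPSV problems on some servers
dirs <- strsplit(getURL(base_url, ftp.use.epsv = FALSE,
                        dirlistonly = TRUE), "\r?\n")[[1]]

for (d in dirs[dirs != ""]) {
  files <- strsplit(getURL(paste0(base_url, d, "/"),
                           ftp.use.epsv = FALSE,
                           dirlistonly = TRUE), "\r?\n")[[1]]
  hi <- files[substr(files, 1, 2) == "hi"]
  for (f in hi) download.file(paste0(base_url, d, "/", f), f)
}
```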

The internet2 option was explained for a similar ftp site here: