user7183174 - 1 year ago
Perl Question

reverse the way readdir works (last file first)

I have a directory which contains n files, all matching the pattern /^\d\d\d\d$/ (see the code below), through which I go with a

while (my $file = readdir ($DIR)) { ... }

to avoid reading the whole directory into RAM at once.
In this loop I count the files (excluding directories and files not named accordingly), pick out five files starting at a specified point, and read the first two lines of each file into an array.

My two questions are:

  1. Is it possible to start the loop at the last file in the directory, i.e. at the file with the largest filename, and go "back"?
    (I guess every OS would return these files in order 0000 -> 0001 -> 0002 ...)

  2. Since this would be way easier: how bad would it be to just read the whole directory into an array, compared to going through the list one entry at a time?
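To make question 1 concrete, here is a minimal sketch (not the question's actual code) of the "read everything, then sort" approach: collect the matching names, sort them descending, and loop over the result. `$dir` stands in for the question's directory variable.

```perl
use strict;
use warnings;

# Sketch under the question's assumptions: $dir holds the directory path
# and the wanted files are named with exactly four digits.
my $dir = '.';

opendir(my $DIR, $dir) or die "Cannot open $dir: $!\n";
my @files = sort { $b cmp $a }                       # largest filename first
            grep { /^\d\d\d\d$/ && -f "$dir/$_" }    # same filters as the loop below
            readdir($DIR);
closedir($DIR);

print "$_\n" for @files;    # e.g. 0002, 0001, 0000
```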

Full code block:

while ( my $f = readdir ($DIR) ) {

    print "checking $f\n";
    next unless -f "$dir/$f";
    next unless $f =~ /^\d\d\d\d$/;

    if ( $c >= $index && $c <= $index + 4 ) {
        open (my $ITM, "<", "$dir/$f") or die "Opening file $f in $dir failed: $!\n";
        my $headline = <$ITM>;
        my $first_p  = <$ITM>;
        close $ITM;

        chomp ($headline, $first_p);
        push (@content, $headline, $first_p);

        print "$f was checked successfully!\n";
    }

    $c++;    # count only files that passed the filters above
}

Thanks in advance

Answer
  1. File systems generally return directory entries in an unpredictable order. They won't be sorted alphabetically. (Except possibly on Windows? Not sure.)

  2. Even if you have a million files in that directory, that's just a "mega". So if each name takes 10 bytes to store, that's only 10 MB of RAM. OK, there's some overhead in Perl's internal data structures, but you're almost certainly going to have fewer names than a million. So I'd say it's not bad at all.
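Putting both points together, a hedged sketch of the whole task: slurp the matching names (cheap, per point 2), sort descending (since readdir's order is unpredictable, per point 1), take the five names starting at `$index`, and read the first two lines of each. `$dir`, `$index`, and `@content` stand in for the question's variables.

```perl
use strict;
use warnings;

# Sketch combining both answer points; '.' and 0 are placeholder values
# for the question's $dir and $index.
my ($dir, $index) = ('.', 0);
my @content;

opendir(my $DIR, $dir) or die "Cannot open $dir: $!\n";
my @files = sort { $b cmp $a }                       # largest filename first
            grep { /^\d\d\d\d$/ && -f "$dir/$_" }
            readdir($DIR);
closedir($DIR);

# Clamp the slice so a short list doesn't produce undef entries.
my $last = $index + 4 < $#files ? $index + 4 : $#files;
for my $f (@files[$index .. $last]) {
    open(my $ITM, '<', "$dir/$f") or die "Opening file $f in $dir failed: $!\n";
    my $headline = <$ITM> // '';    # guard against files shorter than two lines
    my $first_p  = <$ITM> // '';
    close $ITM;

    chomp($headline, $first_p);
    push @content, $headline, $first_p;
}
```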
