Matteo Secco - 2 months ago
Python Question

Opening and closing a large number of files on python

I'm writing a program which organizes my school marks, and for every subject I created a file.pck where all the marks of that subject are saved. Since I have to open and pickle.load 10+ files, I decided to make two functions, files_open():

subj1 = open(subj1_file)
subj1_marks = pickle.load(subj1)
subj2 = open(subj2_file)
subj2_marks = pickle.load(subj2)


and file_close():

subj1.close()
subj2.close()


The problem is that I had to make every variable in files_open() global, and the function is now too long. I tried to avoid that problem by accessing the variables as function attributes:

file_open.subj1


but it doesn't work and I can't understand why.

Answer

Since you just want to open the file, load it, and close it afterwards, I would suggest a simple helper function:

import pickle

def load_marks(filename):
    with open(filename, "rb") as f:  # don't forget to open in binary mode
        marks = pickle.load(f)
    return marks

Use like this:

subj1_marks = load_marks(subj1_file)

The file is closed when execution leaves the with block, and your data remains accessible even after the file is closed, which may have been the (unjustified) concern behind your question.
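For completeness, a matching save helper can be sketched the same way (this mirror function and the example file name are my additions, not part of the asker's code):

```python
import pickle

def save_marks(filename, marks):
    # open in binary write mode; pickle writes bytes
    with open(filename, "wb") as f:
        pickle.dump(marks, f)

def load_marks(filename):
    # open in binary read mode; the file is closed when the with block exits
    with open(filename, "rb") as f:
        return pickle.load(f)

# round trip: the data stays usable after the file is closed
save_marks("subj1.pck", [7, 8, 6])
print(load_marks("subj1.pck"))  # → [7, 8, 6]
```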

Note: someone suggested that what you may really want is to save all your data in one big pickle file. In that case, you could create a dictionary containing your data:

d = dict()
d["mark1"] = subj1_marks
d["mark2"] = subj2_marks
...

and perform a single pickle.dump() and pickle.load() on the dictionary (if the data is picklable, then a dictionary of that data is also picklable). Handling one big file is simpler than handling many of them, given that you need all of them anyway.
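The one-file approach can be sketched like this (the file name and subject keys are illustrative placeholders):

```python
import pickle

all_marks = {
    "maths": [7, 8, 6],
    "physics": [9, 5],
}

# one dump for everything
with open("marks.pck", "wb") as f:
    pickle.dump(all_marks, f)

# one load for everything
with open("marks.pck", "rb") as f:
    loaded = pickle.load(f)

print(loaded["maths"])  # → [7, 8, 6]
```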