RetroCode - 2 months ago
Python Question

Python Lazy Loading

The following code lazily prints the contents of a text file line by line, with each print stopping at '\n'.

with open('eggs.txt', 'rb') as file:
    for line in file:
        print(line)

Is there any way to lazily print the contents of a text file, with each print stopping at ', '?

(or any other character/string)

I am asking this because I am trying to read a file that contains a single 2.9 GB line of comma-separated values.

PS: My question is different from this one: Read large text files in Python, line by line without loading it in to memory
I am asking how to stop at characters other than newlines ('\n').


I don't think there is a built-in way to achieve this. You will have to read the file block by block, split each block at commas, and manually rejoin strings that span block boundaries.

Note that you might still run out of memory if you don't encounter a comma for a long time. (The same problem applies to reading a file line by line when it contains a very long line.)

Here's an example implementation:

def split_file(file, sep=",", block_size=16384):
    last_fragment = ""
    while True:
        block = file.read(block_size)
        if not block:  # end of file reached
            break
        block_fragments = iter(block.split(sep))
        # the first fragment continues whatever was left over from the previous block
        last_fragment += next(block_fragments)
        for fragment in block_fragments:
            yield last_fragment
            last_fragment = fragment
    yield last_fragment  # whatever remains after the final separator
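
Something like this should work as a usage example (a minimal sketch, assuming the ', ' separator and the 'eggs.txt' file name from the question):

with open('eggs.txt') as file:
    # each comma-separated value is produced lazily, one fragment at a time
    for value in split_file(file, sep=", "):
        print(value)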