Java Question

Read large files in Java

I need advice from someone who knows Java well, particularly around memory issues.
I have a large file (around 1.5 GB) and I need to split it into many smaller files (100 files, for example).

I know generally how to do it (using a BufferedReader), but I would like to know whether you have any advice regarding memory, or tips on how to do it faster.

My file contains text, not binary data, and it has about 20 characters per line.

Answer

First, if your file contains binary data, then using BufferedReader would be a big mistake, because you would be converting the data to String, which is unnecessary and could easily corrupt it; you should use a BufferedInputStream instead. If it's text data and you need to split it along line breaks, then BufferedReader is fine (assuming the file contains lines of a sensible length).
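For the text case, a minimal sketch of a line-by-line split could look like the following. The input path, the chunk file naming, and the lines-per-chunk figure are placeholders for illustration, not anything from the question:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class TextFileSplitter {
    public static void main(String[] args) throws IOException {
        String inputPath = "big-file.txt";   // placeholder input file
        int linesPerChunk = 1_000_000;       // tune so you end up with roughly 100 files

        try (BufferedReader reader = new BufferedReader(new FileReader(inputPath))) {
            String line;
            int lineCount = 0;
            int chunkIndex = 0;
            BufferedWriter writer = newChunkWriter(chunkIndex);
            while ((line = reader.readLine()) != null) {
                writer.write(line);
                writer.newLine();
                if (++lineCount % linesPerChunk == 0) {
                    writer.close();                        // finish the current chunk
                    writer = newChunkWriter(++chunkIndex); // start the next one
                }
            }
            writer.close();
        }
    }

    private static BufferedWriter newChunkWriter(int index) throws IOException {
        // chunk-0.txt, chunk-1.txt, ... (naming is purely for illustration)
        return new BufferedWriter(new FileWriter("chunk-" + index + ".txt"));
    }
}

Only one line of the input is held in memory at a time, so the 1.5 GB file size itself is not a problem.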

Regarding memory, there shouldn't be any problem as long as you use a decently sized buffer (I'd use at least 1 MB to make sure the disk is doing mostly sequential reads and writes).
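If you end up using the byte-stream classes instead (for binary data), the buffer size can be passed straight to the constructors. A rough sketch of a buffered copy with 1 MB buffers, with placeholder file paths:

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class BufferedCopy {
    private static final int BUFFER_SIZE = 1 << 20; // 1 MB, as suggested above

    public static void copy(String from, String to) throws IOException {
        try (BufferedInputStream in =
                     new BufferedInputStream(new FileInputStream(from), BUFFER_SIZE);
             BufferedOutputStream out =
                     new BufferedOutputStream(new FileOutputStream(to), BUFFER_SIZE)) {
            byte[] chunk = new byte[64 * 1024];  // read in 64 KB slices from the buffer
            int n;
            while ((n = in.read(chunk)) != -1) {
                out.write(chunk, 0, n);
            }
        }
    }
}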

If speed turns out to be a problem, you could have a look at the java.nio packages - those are supposedly faster than java.io.
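As a sketch of the NIO route, FileChannel.transferTo can split the file at byte offsets rather than line boundaries, so it suits cases where you don't need to keep lines intact. The file names and chunk count below are placeholders:

import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class NioSplitter {
    public static void main(String[] args) throws IOException {
        Path input = Paths.get("big-file.txt");   // placeholder input file
        int chunks = 100;

        try (FileChannel in = FileChannel.open(input, StandardOpenOption.READ)) {
            long chunkSize = (in.size() + chunks - 1) / chunks;  // ceiling division
            for (int i = 0; i < chunks; i++) {
                long position = i * chunkSize;
                long count = Math.min(chunkSize, in.size() - position);
                if (count <= 0) break;  // nothing left to copy
                Path output = Paths.get("chunk-" + i + ".bin");
                try (FileChannel out = FileChannel.open(output,
                        StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
                    // transferTo may copy fewer bytes than requested; a production
                    // version would loop until the full count has been transferred
                    in.transferTo(position, count, out);
                }
            }
        }
    }
}

transferTo can let the operating system move data between files without pushing it through a Java byte array, which is typically where the speed advantage comes from.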
