akcorp2003 · 4 years ago · 237
Node.js Question

NodeJS + Electron - Optimizing Displaying Large Files

I'm trying to read large files. I'm following the Node.js documentation on reading large files, but when I read a somewhat large file (~1.1 MB, ~20k lines), my Electron app freezes for about 6 minutes before it finishes loading all the lines.

Here's my current code:

const fs = require('fs');
const readline = require('readline');

const fileContents = document.getElementById("fileContents");
// first clear out the existing text
fileContents.innerHTML = "";
if (fs.existsSync(pathToFile)) {
  const fileLine = readline.createInterface({
    input: fs.createReadStream(pathToFile)
  });

  fileLine.on('line', (line) => {
    fileContents.innerHTML += line + "\n";
  });
} else {
  fileContents.innerHTML += fileNotFound + "\n";
  console.log('Could not find file!');
}


And the tag I'm targeting is an <xmp> tag.

What are some ways that people have displayed large files?

Answer Source

Streams are often useful for performance because they let you process one line at a time instead of loading the whole file into memory.

In this case, however, you load each line and concatenate it onto your existing string (fileContents.innerHTML) with +=. Each innerHTML assignment forces the rendering engine to re-parse and re-render everything accumulated so far, so the total work grows quadratically with the number of lines. With 20k lines, you are asking the rendering engine to re-render the element 20,000 times!

Instead, try reading in the file as one string, and outputting the HTML just once.

// pass 'utf8' so data arrives as a string rather than a Buffer
fs.readFile(pathToFile, 'utf8', (err, data) => {
  if (err) throw err;
  fileContents.innerHTML = data;
});