
I have a function which, in a perfect world, should create a huge 1M-line file. Here it is:

    const fs = require('fs');
    const path = require('path');

    const fileWriteStream = fs.createWriteStream(path.resolve(filePath));
    let ableToWrite = true;
    for (let i = 0; i < 1e6; i++) {
        if (ableToWrite) {
            ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
        } else {
            fileWriteStream.once('drain', () => {
                ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
            })
        }
    }

Unfortunately for me, I'm getting the following warning pretty fast:

    MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit

I don't really want to increase the listener count for this function. What is the correct way to write such a big file using streams?

Many thanks!

  • You're going to need another means of flow control: a recursive function, generator, etc. Commented May 21, 2018 at 20:36
  • 1
    Do you understand why your current code adds 99999 listeners? Commented May 21, 2018 at 21:19
  • @Bergi To be honest, not really. I know that I have a limit on subscriptions, but it looks like this code only adds new watchers, and the watchers that have already been added don't work at all Commented May 22, 2018 at 7:29
  • 1
  • @YafimDziuko The problem is that it doesn't wait after adding a listener. So in the following iterations (ableToWrite still being false) it keeps adding more listeners, all before even the first event fires. And when it fires, it fires all the listeners waiting for it. Commented May 22, 2018 at 9:13

1 Answer


The easiest way to asynchronously continue when the stream drains is to await a promise inside the loop (note this requires the surrounding function to be async):

const fileWriteStream = fs.createWriteStream(path.resolve(filePath));
for (let i = 0; i < 1e6; i++) {
    const ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
    if (!ableToWrite) {
        await new Promise(resolve => {
            fileWriteStream.once('drain', resolve);
        });
    }
}

The alternative is using a recursive approach instead of a loop:

function go(i) {
    if (i >= 1e6) return;
    const ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
    if (ableToWrite)
        go(i+1);
    else
        fileWriteStream.once('drain', () => {
            go(i+1);
        });
}
const fileWriteStream = fs.createWriteStream(path.resolve(filePath));
go(0);

4 Comments

would await new Promise(fileWriteStream.once.bind(fileWriteStream, 'drain')) work? Not sure if that's a good idea since that might implicitly do something with reject, but I've never tested that.
@PatrickRoberts I guess it would work as once ignores a third argument, but I still don't think it's a good idea because the arrow function is simply much more readable.
@Bergi The first solution works really well, thanks! But can you please explain one thing: until the new Promise is resolved, the for loop will be paused? It looks like it should be, because the line order in the file seems correct
@Bergi Tested it, await new Promise really does pause the for loop until it is resolved. Wow! Big thanks for your help
