
I'm trying to upload very large files in chunks to a server. The server side works great. The function I've come up with tries to upload all the files at once. I'm new to async/await and unsure how to proceed. The goal is to upload the files one at a time, each in chunks.

Note that I've stripped this out of a class, updated it for clarity, and removed all the unnecessary bits.

I'm using node-fetch to provide fetch().

Thanks for any help or advice :)

const fs = require('fs')
const fetch = require('node-fetch')

let tree = []

const start = () => {
    scan('/foo/bar', err => { 
        tree.forEach(file => {
            upload(file)
        })
    })
}

const scan = (dir, callback) => { ... } // assigns an array of file paths to tree, then calls callback

const upload = file => {
    const stream = fs.createReadStream(file) 
    stream.on('data', async chunk => {
        stream.pause()
        try {
            let response = await fetch('https://some/endpoint', {
                method: 'post',
                body: JSON.stringify({
                    file: file,
                    chunk: chunk.toString('base64')
                }) 
            })
            stream.resume()
        } catch(e) {
            // handle errors
        }
    })
    stream.on('end', () => {
       console.log('end', file)
    })
    stream.on('error', err => {
        console.log('error', err)
    })
}

2 Comments

  • Just a note: this question is not really about async/await; it is more about streams, which is a very specialized topic. Maybe you might consider using a library for this? There are many, for example npmjs.com/package/formidable
  • Well, I won't learn anything by letting a library do the heavy lifting ... but thanks for the reply :)

1 Answer

Array.prototype.forEach runs synchronously until all elements of the array have been processed, discarding any value returned by its callback. When the callback is async, that discarded value is a promise, so forEach never waits for it: every upload starts immediately.
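You can see this with a minimal sketch, where delay is a hypothetical stand-in for one chunked upload:

const delay = ms => new Promise(resolve => setTimeout(resolve, ms))

const files = ['a.txt', 'b.txt', 'c.txt']
files.forEach(async file => {
    await delay(100) // stand-in for uploading one file
    console.log('finished', file)
})
console.log('forEach returned') // logs first; all three "uploads" run concurrently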

To upload one file at a time, use await (inside an async function) to wait for each file to finish uploading, in combination with returning a promise from the upload process that is fulfilled when the upload completes. Conceptually (with no graceful error handling) it might look like this:

const uploadTree = async tree => {
    for (const file of tree) {
        await uploadFile(file) // process files one at a time
    }
}

// wrap the existing upload code in a promise executor:

const uploadFile = file => new Promise((resolve, reject) => {
    const stream = fs.createReadStream(file)
    stream.on('data', async chunk => {
        stream.pause() // stop reading until this chunk has been sent
        try {
            await fetch('https://some/endpoint', {
                method: 'post',
                body: JSON.stringify({
                    file: file,
                    chunk: chunk.toString('base64')
                })
            })
            stream.resume()
        } catch (e) {
            console.log('Error uploading chunk of %s', file)
            reject(e)
        }
    })
    stream.on('end', resolve)
    stream.on('error', e => {
        console.log('Error reading %s', file)
        reject(e)
    })
})

// upload the complete tree array of files:

uploadTree(tree)
    .then(() => console.log('tree uploaded successfully'))
    .catch(err => console.log('An upload failed:', err))
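A side note: on Node 10 and later, readable streams are also async iterables, so the same one-chunk-at-a-time behaviour can be had without the pause()/resume() dance. A sketch of that variant, assuming the same endpoint as above:

const uploadFile = async file => {
    // awaiting each fetch inside the loop applies backpressure automatically
    for await (const chunk of fs.createReadStream(file)) {
        await fetch('https://some/endpoint', {
            method: 'post',
            body: JSON.stringify({ file, chunk: chunk.toString('base64') })
        })
    }
    // a read error or failed fetch rejects the returned promise,
    // so the await in uploadTree surfaces it
}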

1 Comment

Thank you! Not only is this an elegant solution, I learned something. Kudos++
