
I made a small server with Flask to upload files (and then do stuff with them). The upload is through an HTML form that sends a file:

<form action="/upload" method="POST" enctype="multipart/form-data">
    <input type="file" name="file">
    <input type="submit">
</form>

On the server side I do the following:

import os

from flask import request
from werkzeug.utils import secure_filename

@app.route('/upload', methods=['POST'])
def upload():
    if 'file' in request.files:
        f = request.files['file']
        # secure_filename() sanitizes the client-supplied name before it touches the disk
        file_path = os.path.join(app.config['UPLOAD_FOLDER'], secure_filename(f.filename))
        f.save(file_path)

    return 'File is being uploaded'

It works fine for small files, but it fails on large ones. The strange part is that if I run the script directly with python main.py instead of through gunicorn, I can upload files that I couldn't before. I thought I needed to increase the maximum request size in gunicorn, but I couldn't find how to do it.
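For reference, the only size-related setting I found on the Flask side is MAX_CONTENT_LENGTH, which as far as I understand limits the request body in Flask itself rather than in gunicorn (the 500 MB value below is just an arbitrary example, not my real config):

# Flask's own cap on the request body size; larger requests get a 413 response.
# 500 MB is an arbitrary example value.
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024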

I also thought about using a stream and writing the file in chunks, but again, I couldn't find how to access the stream with Flask.
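Something along these lines is what I had in mind (just an untested sketch; the /upload-raw route name is made up, and since request.stream exposes the raw request body, it would only work if the client sends the file as the plain body rather than multipart/form-data):

@app.route('/upload-raw', methods=['POST'])
def upload_raw():
    # Hypothetical route: read the raw request body in chunks instead of
    # letting Flask buffer the whole upload before saving it.
    chunk_size = 64 * 1024
    file_path = os.path.join(app.config['UPLOAD_FOLDER'], 'upload.bin')
    with open(file_path, 'wb') as out:
        while True:
            chunk = request.stream.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
    return 'File is being uploaded'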

Thanks for your help

  • "It works fine for small files but on large files, it fails." In what way? Commented Jun 6, 2017 at 10:43

1 Answer


It might be a timeout problem. By default, gunicorn kills and restarts a worker that stays silent for more than 30 seconds, and uploading a large file can easily take longer than that. You can specify a custom timeout value when you start gunicorn. For example, to set the timeout to 300 seconds: gunicorn [app]:[app] --timeout 300

Source: gunicorn timeout setting
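If you start gunicorn from a config file rather than with command-line flags, the same setting can go there. A minimal sketch, assuming a gunicorn.conf.py passed with -c and that your app object is main:app (both are assumptions based on the python main.py hint in the question):

# gunicorn.conf.py  (started with: gunicorn -c gunicorn.conf.py main:app)
bind = '0.0.0.0:8000'   # example bind address, adjust to your setup
timeout = 300           # seconds a worker may stay silent before being killed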
