I have a Django app on Cloud Run and I'd like to create an endpoint that will be called by another Python script. This endpoint should save files to Google Cloud Storage. The file size is 800 MB max.

When I try to do this I receive: 413 Request Entity Too Large.

From digging around the internet I understood that I should upload the file in chunks, but there is something I do not understand.

From this: https://github.com/django/daphne/issues/126 I understand that Daphne is now able to receive large request bodies. So I thought that, even when receiving a big file, Django would manage to chunk it and process it piece by piece.

I am curious: is there any way to do what I want other than chunking manually?

For now I added this to my settings:

GS_BLOB_CHUNK_SIZE = 524288
DATA_UPLOAD_MAX_MEMORY_SIZE = 26214400
FILE_UPLOAD_MAX_MEMORY_SIZE = 26214400

and I simply use generics.ListCreateAPIView with the default file upload handlers.
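For reference, here is a minimal sketch of the kind of setup I mean (the Upload model, serializer, and view names are placeholders for illustration; I am assuming django-storages with the GCS backend, which is what the GS_BLOB_CHUNK_SIZE setting above comes from):

from django.db import models
from rest_framework import generics, serializers

class Upload(models.Model):
    # With django-storages configured for GCS, this FileField writes to the bucket
    file = models.FileField(upload_to="uploads/")

class UploadSerializer(serializers.ModelSerializer):
    class Meta:
        model = Upload
        fields = ["id", "file"]

class UploadListCreateView(generics.ListCreateAPIView):
    # Default parsers and upload handlers; the whole request body still has to
    # pass through Cloud Run, which is where the 413 comes from
    queryset = Upload.objects.all()
    serializer_class = UploadSerializer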

  • Does this answer your question? Uploading large files with Python/Django Commented Oct 20, 2021 at 19:55
  • I already found this post. It does not really help me. First of all, I am not using my browser but just a Python script calling my API. I guess I could use Django Q and put my upload on a queue, but that does not help me understand why Django does not handle it the way I thought the documentation was saying. Commented Oct 21, 2021 at 7:32

1 Answer


Generally, a 413 error means a size limit in a request has been exceeded. For Cloud Run, the request size quota is 32 MB. According to the documentation, the recommended way of uploading large files is to provide a signed URL to the Cloud Storage bucket, since signed URLs can be used for resumable uploads:

Resumable uploads are the recommended method for uploading large files, because you do not have to restart them from the beginning if there is a network failure while the upload is underway.

You can generate a signed URL from your server backend and use it from your client-side script to upload the file without these restrictions. There appear to be other related questions in which Django servers on Cloud Run hit upload limits, and the use of signed URLs is recommended to deal with those cases.
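As a rough sketch (the bucket name, object name, and endpoint are assumptions; it requires the google-cloud-storage library and credentials able to sign URLs), the Django backend can hand out a V4 signed URL like this:

import datetime
from google.cloud import storage

def make_upload_url(object_name):
    # Returns a V4 signed URL that lets the client PUT the object directly to GCS
    client = storage.Client()
    bucket = client.bucket("my-bucket")  # assumption: your bucket name
    blob = bucket.blob(object_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=datetime.timedelta(minutes=30),
        method="PUT",
        content_type="application/octet-stream",
    )

This is a simple single-request signed PUT; for a true resumable upload session, Blob.create_resumable_upload_session is an alternative. The calling Python script then uploads the 800 MB file straight to the bucket, bypassing the 32 MB Cloud Run limit, for example:

import requests

upload_url = ...  # fetched from your Django endpoint, e.g. a small JSON response
with open("big_file.bin", "rb") as f:
    response = requests.put(
        upload_url,
        data=f,
        headers={"Content-Type": "application/octet-stream"},
    )
response.raise_for_status()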


2 Comments

Hey, this seems like a great idea, but my file needs to be saved on my model in a FileField. I think that if I do this operation and then get the file and save it to my model, it will try to upload the file to my bucket again and the operation will be executed twice, once by my view and once by Django (which will cause the 413 error).
According to this related question, it is possible to manually set the name of the FileField without uploading the file; this helps when the file has already been uploaded using a signed URL. Otherwise, if you can alter the model, there is also a URLField which can hold the URL of the uploaded file's location.
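A short sketch of that FileField workaround, assuming a model like the Upload sketch in the question and that the object already sits in the bucket under a known path (both assumptions):

# The file is already in the bucket via the signed URL, so only record its path;
# assigning the name and saving stores the string without touching the storage backend
upload = Upload.objects.create()
upload.file.name = "uploads/big_file.bin"  # object path inside the bucket
upload.save(update_fields=["file"])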
