I have a Django app on Cloud Run, and I'd like to create an endpoint that will be called by another Python script. This endpoint should save files to Google Cloud Storage. The files are 800 MB at most.
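For context, the calling script does roughly this (a simplified sketch; the URL and field name are placeholders for my real ones):

import requests

UPLOAD_URL = "https://my-service-xyz.a.run.app/api/files/"  # placeholder Cloud Run URL

# requests reads the file and sends it as a single multipart/form-data request
with open("big_file.bin", "rb") as f:
    response = requests.post(UPLOAD_URL, files={"file": f})
response.raise_for_status()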
When I try to do this, I receive: 413 Request Entity Too Large. From digging around the internet, I understood that I should upload the file in chunks, but there is something I do not understand.
From this issue: https://github.com/django/daphne/issues/126 I understand that Daphne is now able to receive large request bodies. So I thought that, even when receiving a big file, Django would manage to chunk it and pass it along piece by piece.
I am curious: is there any way to do what I want other than chunking the file manually (see the sketch below for what I mean by that)?
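To be clear, by "manual chunking" I mean something like this on the client side (a rough sketch; the chunk endpoint and the server-side reassembly are hypothetical, and from what I've read Cloud Run caps HTTP/1 request bodies at 32 MB, so each piece has to stay below that):

import requests

UPLOAD_URL = "https://my-service-xyz.a.run.app/api/files/chunk/"  # hypothetical chunk endpoint
CHUNK_SIZE = 25 * 1024 * 1024  # 25 MB, below Cloud Run's 32 MB request cap

with open("big_file.bin", "rb") as f:
    index = 0
    while True:
        chunk = f.read(CHUNK_SIZE)
        if not chunk:
            break
        # the server would have to buffer these pieces and reassemble them in GCS
        requests.post(UPLOAD_URL, data=chunk, headers={"X-Chunk-Index": str(index)})
        index += 1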
For now I added this to my settings:
GS_BLOB_CHUNK_SIZE = 524288  # 512 KB
DATA_UPLOAD_MAX_MEMORY_SIZE = 26214400  # 25 MB
FILE_UPLOAD_MAX_MEMORY_SIZE = 26214400  # 25 MB
and I simply use generics.ListCreateAPIView with the default file upload handlers, as sketched below.
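Concretely, the view and serializer look roughly like this (simplified; the model and field names are placeholders, and the real FileField is backed by django-storages with the GCS backend):

from rest_framework import generics, serializers

from .models import Upload  # placeholder model with a FileField

class UploadSerializer(serializers.ModelSerializer):
    class Meta:
        model = Upload
        fields = ["id", "file"]

class UploadListCreateView(generics.ListCreateAPIView):
    queryset = Upload.objects.all()
    serializer_class = UploadSerializer
    # no custom parser_classes or upload handlers, so Django's defaults
    # (MemoryFileUploadHandler + TemporaryFileUploadHandler) apply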