I need to upload large (2-20 GB) videos to a streaming service without loading them into RAM. The code I wrote works, but I don't have the resources to handle large files. Is there a way to do this in Python?

import requests

def upload(file):
    # url is the API endpoint, defined elsewhere; requests builds the
    # entire multipart body in memory here, which is what fails for
    # multi-gigabyte files
    with open(file, 'rb') as f:
        r = requests.post(url, files={'file1': f})
    data = r.json()

    print(data["msg"])
    return data["result"]["id"]

Their API says I have to POST the file and that it must be multipart/form-data encoded. Example with curl:

curl -F file1=@/path/to/file.txt https://www.example.com/uls/jAZUhVzeU78

  • Sorry, my initial comment was wrong; you are using the files parameter. The documentation covers this: "In the event you are posting a very large file as a multipart/form-data request, you may want to stream the request. By default, requests does not support this, but there is a separate package which does - requests-toolbelt. You should read the toolbelt's documentation for more details about how to use it." See the streaming sketch after these comments. Commented Jul 31, 2019 at 11:23
  • The new duplicate target addresses this too. Commented Jul 31, 2019 at 11:24
  • Note that when sending things that large, you should look for an API that allows you to resume transfers. The chance of network failure goes up as you transfer more data, as does the cost of a retry. Once you can resume transfers, you might be able to upload in, e.g., 10 MB chunks, which would obviate the original problem (see the chunked sketch below). Commented Jul 31, 2019 at 11:37
  • I tried the methods described in toolbelt's documentation, but those use data= rather than files=, so the file never uploads, even though the transfer starts and the final response is 200 "OK". I'm using verystream.com's API, by the way. Commented Jul 31, 2019 at 12:20
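A minimal sketch of the streaming approach from the first comment, using requests-toolbelt's MultipartEncoder. The field name file1 comes from the question; the endpoint url and the application/octet-stream content type are assumptions to verify against the provider's docs. The encoder is passed via data= (not files=), and its generated boundary travels in the Content-Type header:

import os
import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

def upload_streaming(path, url):
    # MultipartEncoder reads the file object lazily, so the 2-20 GB
    # video is streamed from disk instead of being loaded into RAM
    with open(path, 'rb') as f:
        encoder = MultipartEncoder(
            fields={'file1': (os.path.basename(path), f,
                              'application/octet-stream')}
        )
        # data=, not files=; the header carries the multipart boundary
        r = requests.post(url, data=encoder,
                          headers={'Content-Type': encoder.content_type})
    data = r.json()
    print(data["msg"])
    return data["result"]["id"]

If this returns 200 but no file is stored, as the last comment reports, the server may have expected a different field name or content type; comparing the request against the working curl command (run with curl -v) is a sensible next step.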
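And a sketch of the chunked, resumable idea from the third comment. Everything service-side here is hypothetical: the question's API documents only a single multipart POST, so session_url and the Content-Range handshake below stand in for whatever resumable endpoint a provider might actually offer:

import os
import requests

CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB chunks, as the comment suggests

def upload_in_chunks(path, session_url):
    # session_url is a hypothetical resumable-upload endpoint; check
    # whether your provider offers one before relying on this pattern
    total = os.path.getsize(path)
    with open(path, 'rb') as f:
        offset = 0
        while offset < total:
            chunk = f.read(CHUNK_SIZE)
            end = offset + len(chunk) - 1
            r = requests.put(
                session_url,
                data=chunk,
                headers={'Content-Range': f'bytes {offset}-{end}/{total}'},
            )
            r.raise_for_status()  # on a network error, retry from offset
            offset += len(chunk)

The payoff is that a failure costs at most one 10 MB retry instead of a restarted 20 GB transfer.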
