I am trying to download a file from an API to upload (stream) directly into S3.
My code for local downloads (which works perfectly):
import requests
import datetime
import os
headers = {'Authorization': 'apikey THISISHIDDEN'}
baseURL = 'https://api.test.au/busschedule/'
target_path = datetime.datetime.now().strftime('%Y-%m-%d schedule') + '.zip'
response = requests.get(baseURL, stream=True, headers=headers)
handle = open(target_path, "wb")
for chunk in response.iter_content(chunk_size=512):
    if chunk:  # filter out keep-alive new chunks
        handle.write(chunk)
handle.close()
My attempt to download and stream to S3 (which didn't work):
# import requests
import datetime
import os
import boto3
import botocore.vendored.requests.packages.urllib3 as urllib3
# Get environment variables from serverless.yml
bucket = "bucket"
s3folder = "schedules"
# Set standard script parameters
headers = {'Authorization': 'apikey THISISHIDDEN'}
baseURL = 'https://api.test.au/busschedule/'
def run(event, context):
    s3 = boto3.client('s3')
    datetimestamp = datetime.datetime.today().strftime('%Y%m%dT%H%M%S')
    filename = datetimestamp + " bus schedule.zip"
    key = s3folder + '/' + filename  # your desired s3 path or filename
    http = urllib3.PoolManager()
    s3.upload_fileobj(http.request('GET', baseURL,
                                   headers=headers, preload_content=False),
                      bucket, key)

def main():
    run({}, {})

if __name__ == "__main__":
    exit(main())
The error returned by CloudWatch is:
InsecureRequestWarning: Unverified HTTPS request is being made. Timeout after 300.10s.
EDIT: The Lambda function has a timeout of 300 seconds, which should be more than long enough to download the file (6 MB); the local download finishes in about 10 seconds. Does anyone have a better approach to this?