I have automated a process that pulls XML files from multiple devices onto my machine, into c:\scripts\googlebucket\uploads.

My task is to upload all the files in this directory to a Google Cloud Storage bucket. A credentials JSON file was also provided to access the bucket.

I have been able to upload single files successfully, but I'm struggling to figure out how to upload every file in a specific directory (all files within c:\scripts\googlebucket\uploads) to the bucket.

Here is the script I've been running to push a single file:

from google.cloud import storage

# Authenticate with the service account credentials and open the target bucket
client = storage.Client.from_service_account_json(json_credentials_path='Service_Account.json')
bucket = client.get_bucket('us-client-uploads')

# Upload one file, using the same name for the object in the bucket
blob = bucket.blob("Closed_24.02.2021.08.24.xml")
blob.upload_from_filename("Closed_24.02.2021.08.24.xml")

Would anyone happen to have any suggestions? Much appreciated.

1 Answer

As far as I know, the Google Cloud Storage Python library doesn't provide a helper for this, so you'll need to iterate over the files in the directory yourself and call the upload_from_filename method you're already using, something like this:

from os import listdir
from os.path import isfile, join

folder = '<your folder>'
files = [f for f in listdir(folder) if isfile(join(folder, f))]
for file in files:
    local_file = join(folder, file)  # build the full local path
    blob = bucket.blob(file)         # object name = the bare filename
    blob.upload_from_filename(local_file)
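
For completeness, here's a minimal self-contained sketch combining this loop with the client setup from the question, using pathlib instead of os.path (the bucket name, credentials file, and folder path are taken from the question; adjust them to your environment):

from pathlib import Path
from google.cloud import storage

client = storage.Client.from_service_account_json('Service_Account.json')
bucket = client.get_bucket('us-client-uploads')

folder = Path(r'c:\scripts\googlebucket\uploads')  # raw string so the backslashes aren't escapes
for path in folder.iterdir():
    if path.is_file():
        blob = bucket.blob(path.name)          # object name = the bare filename
        blob.upload_from_filename(str(path))

Note this flat loop doesn't recurse into subdirectories; if you need that, Path.rglob('*') would walk the tree, and you'd probably want to use each file's path relative to the folder as the object name so the directory structure is preserved in the bucket.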