In my experience, it sounds like some of your blobs are larger than the memory of your local machine, because the function get_blob_to_bytes you used reads the entire blob content into memory before it is written out.
So please use the function get_blob_to_stream instead of get_blob_to_bytes; it writes the blob content directly to a stream.
Here is my sample code; my virtual environment is based on Python 3.7 with the Azure Storage SDK installed via pip install azure-storage-blob==1.5.0.
from azure.storage.blob.baseblobservice import BaseBlobService

account_name = '<your account name>'
account_key = '<your account key>'

blob_service = BaseBlobService(account_name, account_key)

blob_container_name = '<your container name>'
prefix = '<your blob prefix>'
blob_files_names = blob_service.list_blob_names(container_name=blob_container_name, prefix=prefix)

# from io import BytesIO
from io import FileIO

trg_path = '<your target file path>'

# with open(trg_path, 'wb') as file:
with FileIO(trg_path, 'wb') as file:
    for blob_file_name in blob_files_names:
        # blob = blob_service.get_blob_to_bytes(container_name=blob_container_name, blob_name=blob_file_name)
        print(blob_file_name)
        # stream = BytesIO()
        blob_service.get_blob_to_stream(container_name=blob_container_name, blob_name=blob_file_name, stream=file)
        # file.write(stream.getbuffer())
Note: in the commented-out BytesIO variant above, getbuffer() returns a memoryview over the BytesIO buffer rather than a copy of it, so the write itself does not consume additional memory (although the BytesIO object still holds the whole blob, which is why streaming directly into the file is preferable).
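As a quick stand-alone illustration of that behavior (pure standard library, no Azure needed, with made-up sample bytes), the memoryview returned by getbuffer() shares the underlying buffer instead of copying it:

```python
from io import BytesIO

stream = BytesIO()
stream.write(b'blob content downloaded earlier')

buf = stream.getbuffer()      # memoryview over the internal buffer, no copy
print(type(buf).__name__)     # -> memoryview
print(bytes(buf[:4]))         # -> b'blob'

# Because the view shares memory with the BytesIO object, mutating the
# view is reflected in the stream's contents:
buf[:4] = b'BLOB'
del buf                       # release the view before using the stream again
print(stream.getvalue()[:4])  # -> b'BLOB'
```

This is why file.write(stream.getbuffer()) does not double the memory usage; the cost is that the BytesIO already holds the full blob, which get_blob_to_stream avoids entirely.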