
Is it possible to copy all files under a root directory/bucket into Amazon Redshift?

Example folder structure:

/2016/01/file.json
/2016/02/file.json
/2016/03/file.json
...

I've tried the following command:

copy mytable
FROM 's3://mybucket/2016/*'
CREDENTIALS 'aws_access_key_id=<>;aws_secret_access_key=<>'
json 's3://mybucket/jsonpaths.json'

2 Answers


Specify a prefix for the load, and all Amazon S3 objects with that prefix will be loaded (in parallel) into Amazon Redshift.

Examples:

copy mytable
FROM 's3://mybucket/2016/'

will load all objects stored in: mybucket/2016/*

copy mytable
FROM 's3://mybucket/2016/02'

will load all objects stored in: mybucket/2016/02/*

copy mytable
FROM 's3://mybucket/2016/1'

will load all objects stored in: mybucket/2016/1* (e.g. 10, 11, 12)

Basically, it just checks that the object's key starts with the given string (including the full path).

This also means that if you have objects named mybucket/wallet and mybucket/walletinventory, a COPY from 's3://mybucket/wallet' will load both, so be careful with object names when using the COPY command from S3.
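
If the files are grouped under a folder-style key, one way to avoid that accidental match is to include the trailing slash in the prefix, since 'wallet/' does not prefix-match 'walletinventory'. A minimal sketch, reusing the placeholder credentials and jsonpaths file from the question (the wallet/ folder is hypothetical):

-- matches only keys under mybucket/wallet/, not mybucket/walletinventory
copy mytable
FROM 's3://mybucket/wallet/'
CREDENTIALS 'aws_access_key_id=<>;aws_secret_access_key=<>'
json 's3://mybucket/jsonpaths.json'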


Apparently this is as simple as changing the source URL to s3://mybucket/2016/; no wildcards are required.
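
Applied to the command from the question, with the wildcard dropped and the credentials left as the same placeholders:

-- the bare prefix loads every object under mybucket/2016/
copy mytable
FROM 's3://mybucket/2016/'
CREDENTIALS 'aws_access_key_id=<>;aws_secret_access_key=<>'
json 's3://mybucket/jsonpaths.json'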
