
I set up AWS CodeBuild to write out a large number of artifact files to S3. The CodeBuild buildspec YAML file defines which files to upload, and the CodeBuild project settings define the S3 bucket. All of this works correctly.
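For context, the relevant part of my buildspec looks roughly like this (a minimal sketch; the build command and output paths are placeholders, not my actual values):

    version: 0.2
    phases:
      build:
        commands:
          - ./build.sh          # placeholder build step
    artifacts:
      files:
        - 'output/**/*'         # the artifact files to upload

The S3 bucket itself is configured in the CodeBuild project's artifact settings, not in the buildspec.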

It appears that when you trigger AWS CodeBuild from AWS CodePipeline, CodePipeline ignores the artifact settings of AWS CodeBuild and instead forces the artifacts into a zip file in a CodePipeline S3 bucket.

Is there a way to use CodePipeline but have it respect AWS CodeBuild's artifact settings?

  • I'm looking for an answer too. Let me know if you figured out a way. Commented May 14, 2017 at 22:54
  • Though this was 4 months ago and services change quickly, the short answer at the time was no, you couldn't do it. We had to add aws-cli calls into the build script to push the files we wanted to S3 directly, outside of the CodeBuild artifact config. Commented May 16, 2017 at 1:30
  • OK. Is aws-cli available as part of the CodeBuild environment? I was planning to use a REST API in the build step to push to S3; aws-cli would be a much better option. Commented May 16, 2017 at 13:12
  • We went the Docker route, so it was an extra install. I don't know about the containers that AWS provides, though. Commented May 17, 2017 at 15:45

1 Answer


CodeBuild also gives you access to aws-cli.

You can edit the buildspec.yml file and upload these artifacts to S3 yourself. You can also create a .sh file, give it execute permissions, and have the build run that shell script to upload the artifacts to the S3 bucket, as sketched below.
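A minimal sketch of a post_build step that does this (the bucket name and output directory are placeholders you would replace with your own):

    version: 0.2
    phases:
      build:
        commands:
          - ./build.sh          # placeholder build step
      post_build:
        commands:
          # Push the build output straight to the target bucket,
          # bypassing CodePipeline's zipped artifact handling.
          - aws s3 sync ./output s3://my-artifact-bucket/builds/

Because this upload happens inside the build itself, CodePipeline's artifact zipping never touches these files.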

You will also need to grant the service role for CodeBuild the right permissions on the S3 bucket.
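For example, a policy statement along these lines attached to the service role should be enough for the sync above (the bucket name is a placeholder):

    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-artifact-bucket",
        "arn:aws:s3:::my-artifact-bucket/*"
      ]
    }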
