Blurred lines between deployment (Terraform) and build processes (Bazel) leading to an awkward build and release process

sdgfsdh

I am building a system that consists of multiple programs on many machines, some cloud services (such as RDS) and so on.

In an ideal world, I would like to supply some configuration (e.g. deployment keys, AWS credentials) and run a single "deploy all" command that will build and deploy everything.

I would also like it to be smart enough to not rebuild artefacts that have already been built or redeploy infrastructure that already exists.

Currently, I am using Bazel to build my artefacts (.so, .jar, Docker images, etc.) and Terraform to provision my architecture (EC2 instances, RDS, etc.).

Each of these tools is very good at what it does, and together, they cover builds and deployments. However, neither does everything (the desired "deploy all" command) and there are cases where they must interact in awkward ways.

For example, suppose I have a microservice written in JavaScript. This is compiled / bundled by Bazel. The bundle is then included in a Docker image along with some secrets generated by Terraform. The Docker image is built by Bazel. Finally, the Docker image is deployed using Terraform!

  1. Bazel builds the application code
  2. Terraform generates / fetches secrets
  3. Bazel builds a Docker image
  4. Terraform deploys the Docker image
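The four steps above could be collapsed into a single wrapper script. This is only a sketch: the Terraform `-target` module name is a hypothetical placeholder (the Bazel targets are the ones from my build), and by default it prints the commands rather than running them.

```shell
#!/usr/bin/env bash
# Sketch of a single "deploy all" entry point over Bazel and Terraform.
# The module.secrets target is a hypothetical placeholder.
set -euo pipefail

# The pipeline is kept as data so it can be inspected or dry-run.
STEPS=(
  "bazel build //:microservice"
  "terraform apply -target=module.secrets -auto-approve"
  "bazel build //:microservice-docker"
  "terraform apply -auto-approve"
)

: "${DRY_RUN:=1}"   # set DRY_RUN=0 to actually execute the steps
for step in "${STEPS[@]}"; do
  echo "+ $step"
  [ "$DRY_RUN" = 1 ] || $step   # word-splitting is fine for these simple commands
done
```

Both tools are already incremental on their own (Bazel's cache, Terraform's plan/state), so re-running such a wrapper should be cheap; but it only fixes the ordering problem, not the awkward hand-offs between the two tools.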

I am jumping between the two tools and it doesn't feel like the right way to approach this.

  • Should I wrap Terraform in Bazel and only interact with Bazel?
  • Should I wrap Bazel in Terraform and only interact with Terraform?
  • Should I use some third tool to manage them?
  • How can I resolve this?
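One arrangement I have considered, rather than wrapping either tool in the other, is to keep Bazel as the producer and Terraform as the consumer, passing the built artefact's identity across the boundary explicitly. The sketch below assumes a `//:microservice-docker-push` target and an `image` Terraform variable that do not exist in my setup; it prints the commands instead of running them.

```shell
#!/usr/bin/env bash
# Hand-off sketch: Bazel builds/pushes the image, Terraform deploys it.
# The push target, registry, and the "image" variable are hypothetical.
set -euo pipefail

image_ref() {              # compose "repo:tag"; kept pure so it is easy to test
  printf '%s:%s' "$1" "$2"
}

REPO="registry.example.com/microservice"
TAG="dev"                  # in CI this would typically be a git SHA
IMAGE="$(image_ref "$REPO" "$TAG")"

echo "+ bazel run //:microservice-docker-push   # builds and pushes $IMAGE"
echo "+ terraform apply -auto-approve -var image=$IMAGE"
```

Because the image reference becomes an ordinary Terraform variable, `terraform plan` can show whether the deployment would actually change, which preserves the "don't redeploy what already exists" property I want.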
