In my company, I'm currently working on a project that involves several (REST) web services. Development is based on the OpenAPI Specification and the Swagger tooling; the target platform is an internal cloud platform based on OpenShift.
As we use a CI/CD pipeline in our software development process (more or less the whole Atlassian toolchain, Artifactory, etc.), the question arises whether it is possible to neatly integrate a code generation step into the pipeline.
The API development process right now is basically like this:
- Write / Modify API specification
- Commit changes into SCM
- Validate spec and generate API documentation
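To illustrate the last step, a minimal sketch of what the validation / documentation part of the pipeline could look like, assuming the OpenAPI Generator CLI (swagger-codegen offers comparable commands); the spec path and output directory are placeholders:

```sh
# Fail the build early if the spec is invalid
openapi-generator-cli validate -i api/openapi.yaml

# Render static HTML documentation from the spec
openapi-generator-cli generate -i api/openapi.yaml -g html2 -o build/api-docs
```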
The process for developing against the API spec (e.g. on the server side) is like this:
- Local generation / update of server stubs (see the sketch after this list)
- Commit into SCM
- Server code is then implemented against the generated, checked-in stubs
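The local generation step looks roughly like the following sketch (Spring generator and `interfaceOnly` option shown only as an example; swagger-codegen works similarly):

```sh
# Generate server stubs (interfaces and models only) into the server project;
# the generated sources are then committed together with the hand-written code
openapi-generator-cli generate \
  -i api/openapi.yaml \
  -g spring \
  -o server \
  --additional-properties=interfaceOnly=true
```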
The question is whether this is the go-to way of working with a code generator in general. I have the subtle feeling that local code generation is not the best solution here, as one has to update the SCM regularly with newly generated stubs to keep up with changes in the API specification.
One idea that came to mind is to use an artifact repository manager (like Artifactory). The build process could then create a server stub library from the API spec and push it into the artifact manager. Developers could then use their local build system to update the dependency. However, this process might be overkill.
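Roughly, such a pipeline step could look like the sketch below (again using the OpenAPI Generator CLI and Maven as an example; the coordinates, paths, and deploy configuration are placeholders):

```sh
# Generate a standalone stub module with its own pom.xml and Maven coordinates
openapi-generator-cli generate \
  -i api/openapi.yaml \
  -g spring \
  -o build/server-stubs \
  --additional-properties=interfaceOnly=true,groupId=com.example.api,artifactId=my-api-stubs,artifactVersion=1.4.0

# Build the stub library and push it to Artifactory
# (assumes the deploy repository and credentials are configured for Maven)
mvn -f build/server-stubs/pom.xml deploy
```

Developers would then consume com.example.api:my-api-stubs as a normal dependency; the artifact version could, for example, be derived from the spec's info.version field.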
As I could not find good resources on integrating code generators into a CI/CD-based software development process, I'm very interested in best practices here (or maybe there is a good, specific solution for the example described above?).