I have, for example, 10 repositories on GitLab. With a webhook I can trigger a Jenkins job. Right now I have one Jenkins pipeline job, which uses the Jenkinsfile from one configured repository. When another repository triggers this job, the configured repository is always the one that gets built. How can I configure Jenkins so that I have only one or maybe two Jenkins jobs that can handle multiple repositories, but build only the repository that actually triggered the job?
1 Answer
You are asking about a CI/CD anti-pattern. There are multiple problems to be aware of when you think about sharing the same Jenkins job between multiple repositories, e.g.:
- you can run only a single build at a time (imagine that a single job takes 5 minutes on average and 10 repositories push a change inside the same 20-minute window; the 10th repository's push has to wait until all previous builds are completed, and in the meantime new commits keep being pushed and your build queue grows and grows)
- you have a single history of previous builds (let's say you have 200 builds in the history and the 5th repository was used in only a few of them - how would you find those?)
- you get a false sense of incremental development (imagine that 8 or 9 of your repositories always pass all tests and leave a green pipeline, and only 1 or 2 fail from time to time, leaving your pipeline red - when you look at the history of previous builds, how do you know which repository is broken? You don't see that 8 or 9 repositories are constantly in good shape)
- you can't customize the behavior per repository (an obvious limitation in this case)
Instead, it's better to use Jenkins' Pipeline as Code approach, where each repository has its own Jenkinsfile and a dedicated job pointing to that Jenkinsfile. You can build all repositories in parallel, you get a clean history for each repository, and, if needed, you can define custom steps for each pipeline.
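To make this concrete, here is a minimal sketch of a dedicated per-repository Jenkinsfile - the stage names and the `build.sh`/`test.sh` scripts are placeholders, to be adapted for each repository:

```groovy
// Jenkinsfile living in one repository; each repository keeps its own copy.
// A dedicated pipeline (or multibranch pipeline) job points at this file,
// so only the repository that triggered the job gets built.
pipeline {
    agent any   // declarative pipelines check out the triggering repository automatically
    stages {
        stage('build') {
            steps {
                sh './build.sh'   // hypothetical build script for this repository
            }
        }
        stage('test') {
            steps {
                sh './test.sh'    // hypothetical test script for this repository
            }
        }
    }
}
```

Because every repository carries its own copy, each team can add, remove, or reorder stages without touching anyone else's pipeline.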
Taking advantage of Jenkins Libraries
If you want to define a common base for all of your repositories and keep it in a single place, consider extending your pipelines with shared libraries. What does this mean in practice? You define a library script as described in the documentation, e.g.
vars/standardBuild.groovy
    // See https://github.com/jenkinsci/workflow-cps-global-lib-plugin
    // The call(body) method in any file in workflowLibs.git/vars is exposed as a
    // method with the same name as the file.
    def call(body) {
        def config = [:]
        body.resolveStrategy = Closure.DELEGATE_FIRST
        body.delegate = config
        body()

        node {
            stage('checkout') {
                checkout scm
            }
            stage('main') {
                docker.image(config.environment).inside {
                    sh config.mainScript
                }
            }
            stage('post') {
                sh config.postScript
            }
        }
    }
Then your Jenkinsfile uses the standardBuild function (the step name matches the script's file name):
    #!groovy
    // Loads the standardBuild function/step from workflowLibs.git/vars/standardBuild.groovy
    // and invokes it.
    standardBuild {
        environment = 'golang:1.5.0'
        mainScript = '''
            go version
            go build -v hello-world.go
        '''
        postScript = '''
            ls -l
            ./hello-world
        '''
    }
Source: https://github.com/jenkinsci/pipeline-examples/tree/master/global-library-examples/global-function
This approach lets you share common behavior between all jobs while still leaving you a free hand when repository-specific steps have to be implemented in a single pipeline.
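For example, a repository that needs one extra step beyond the shared standardBuild can simply add it in its own Jenkinsfile - the image name and the scripts below are made up for illustration:

```groovy
#!groovy
// Reuse the shared standardBuild step with repository-specific configuration...
standardBuild {
    environment = 'node:20'            // hypothetical Docker image for this repository
    mainScript  = 'npm ci && npm test' // repository-specific commands
    postScript  = 'npm run lint'
}

// ...and add repository-specific work outside the shared step.
node {
    stage('deploy') {
        checkout scm       // this node block gets a fresh workspace, so check out again
        sh './deploy.sh'   // hypothetical deploy script only this repository needs
    }
}
```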
Conclusion - don't follow anti-patterns. It may sound like a good solution (a single job for multiple repositories requires less effort, in theory), but it's not. It causes multiple problems, stops you from scaling, and makes your CI/CD pipeline useless when it comes to getting fast feedback from your CI server.