(Originally posted on loglevel-blog.com)
In this blog post I will try to explain how to set up and develop a shared pipeline library for Jenkins that is easy to work on and can be unit tested with JUnit and Mockito.
NOTE: This blog post is kinda long and touches many topics without explaining them in full detail. If you don't feel like following along with a lengthy tutorial, you can have a look at the complete example library on GitHub. Also, if you have questions, or really any kind of feedback on what I could improve, please leave a comment and I will get back to you asap ;) Additionally, if you are completely unfamiliar with Jenkins Shared Libraries, you should probably read about them first in the official docs.
Let's get going!
Basic Development Setup
First, let's create a new IntelliJ IDEA project. I suggest using IntelliJ IDEA for Jenkins shared pipeline development, because it is the only IDE I know of that properly supports both Java and Groovy and comes with Gradle support. So, if you don't have it installed yet, you can download it here for Windows, Linux or macOS. Also make sure to have the Java Development Kit installed, which is available here.
When everything is ready, start IntelliJ, create a new project, select Gradle and make sure to tick the checkbox for Groovy.
Next up, enter a GroupId and an ArtifactId.
Ignore the next window (the defaults are fine), click "Next", enter a project name and click "Finish".
IntelliJ should boot up with your new project. The folder structure in your project should be something like the following.
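(The exact layout depends a bit on your IntelliJ and Gradle versions, but roughly:)
(root)
+- src
|   +- main
|   |   +- groovy
|   |   +- java
|   |   +- resources
|   +- test
|       +- groovy
|       +- java
|       +- resources
+- build.gradle
+- settings.gradle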
This is cool for usual Java/Groovy projects, but for our purpose we have to change things up a bit since Jenkins demands a project structure like this:
(root)
+- src                       # Groovy source files
|   +- org
|       +- somecompany
|           +- Bar.groovy    # for org.somecompany.Bar class
+- vars
|   +- foo.groovy            # for global 'foo' variable
|   +- foo.txt               # help for 'foo' variable
+- resources                 # resource files (external libraries only)
|   +- org
|       +- somecompany
|           +- bar.json      # static helper data for org.somecompany.Bar
So:
- add a `vars` folder to your project root folder
- add a `resources` folder to your project root folder
- delete all files/folders inside `src` and add a new package like `org.somecompany`
- edit the `build.gradle` file:
group 'somecompany'
version '1.0-SNAPSHOT'
apply plugin: 'groovy'
apply plugin: 'java'
sourceCompatibility = 1.8
repositories {
mavenCentral()
}
dependencies {
compile 'org.codehaus.groovy:groovy-all:2.3.11'
testCompile group: 'junit', name: 'junit', version: '4.12'
testCompile "org.mockito:mockito-core:2.+"
}
sourceSets {
main {
java {
srcDirs = []
}
groovy {
// all code files will be in either of the folders
srcDirs = ['src', 'vars']
}
}
test {
java {
srcDir 'test'
}
}
}
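A quick side note in case you are using a newer Gradle version: the compile and testCompile configurations used above have been deprecated for a while and were removed in Gradle 7, so there you would declare the dependencies with implementation and testImplementation instead, for example:

dependencies {
    implementation 'org.codehaus.groovy:groovy-all:2.3.11'
    testImplementation group: 'junit', name: 'junit', version: '4.12'
    testImplementation "org.mockito:mockito-core:2.+"
}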
After saving, import your changes into the Gradle project.
At this point our project has the right structure to be used as a shared library by Jenkins. But, as you might have seen in the code snippet above, we also added a source directory for unit tests called `test`. Now is the time to create this folder at the root level of the project and add a package `org.somecompany`, like we did with `src`. The final structure should look like the following.
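Roughly like this (plus the Gradle files such as `build.gradle` at the root):
(root)
+- src
|   +- org
|       +- somecompany
+- test
|   +- org
|       +- somecompany
+- vars
+- resources
+- build.gradle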
Cool, it's time to implement our shared library!
The General Approach
First a quick run-down on how we build our library and why we do it that way:
- We will keep the "custom" steps inside `vars` as simple as possible and without any real logic. Instead, we create classes (inside `src`) that do all the work.
- We create an interface which declares methods for all required Jenkins steps (`sh`, `bat`, `error`, etc.). The classes call steps only through this interface.
- We write unit tests for our classes like we normally would, using JUnit and Mockito.
This way we are able to:
- Compile and execute our library/unit tests without Jenkins
- Test that our classes work as intended
- Test that Jenkins steps are called with the right parameters
- Test the behaviour of our code when a Jenkins step fails
- Build, test, run metrics on and deploy our Jenkins pipeline library through Jenkins itself
Now let's really get going.
The Interface For Step Access
First, we will create the interface inside `org.somecompany` that will be used by all classes to access the regular Jenkins steps like `sh` or `error`.
package org.somecompany
interface IStepExecutor {
int sh(String command)
void error(String message)
// add more methods for respective steps if needed
}
This interface is neat, because it can be mocked inside our unit tests. That way our classes become independent of Jenkins itself. For now, let's add an implementation that will be used in our `vars` Groovy scripts:
package org.somecompany
class StepExecutor implements IStepExecutor {
// this will be provided by the vars script and
// lets us access Jenkins steps
private _steps
StepExecutor(steps) {
this._steps = steps
}
@Override
int sh(String command) {
this._steps.sh returnStatus: true, script: "${command}"
}
@Override
void error(String message) {
this._steps.error(message)
}
}
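If you need more steps later on (for example `bat` for Windows-based build agents), the pattern stays the same. This is just a sketch of what such an addition might look like, not part of the example library itself:

// added to IStepExecutor:
int bat(String command)

// added to StepExecutor:
@Override
int bat(String command) {
    // the Jenkins bat step also supports returnStatus, just like sh
    this._steps.bat returnStatus: true, script: "${command}"
}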
Adding Basic Dependency Injection
Because we don't want to use the above implementation in our unit tests, we will set up some basic dependency injection in order to swap the above implementation with a mock during unit tests. If you are not familiar with dependency injection, you should probably read up on it, since explaining it here would be out of scope, but you might be fine with just copy-pasting the code in this chapter and following along.
So, first we create the `org.somecompany.ioc` package and add an `IContext` interface:
package org.somecompany.ioc
import org.somecompany.IStepExecutor
interface IContext {
IStepExecutor getStepExecutor()
}
Again, this interface will be mocked for our unit tests. But for regular execution of our library we still need a default implementation:
package org.somecompany.ioc
import org.somecompany.IStepExecutor
import org.somecompany.StepExecutor
class DefaultContext implements IContext, Serializable {
// the same as in the StepExecutor class
private _steps
DefaultContext(steps) {
this._steps = steps
}
@Override
IStepExecutor getStepExecutor() {
return new StepExecutor(this._steps)
}
}
To finish up our basic dependency injection setup, let's add a "context registry" that is used to store the current context (`DefaultContext` during normal execution and a Mockito mock of `IContext` during unit tests):
package org.somecompany.ioc
class ContextRegistry implements Serializable {
private static IContext _context
static void registerContext(IContext context) {
_context = context
}
static void registerDefaultContext(Object steps) {
_context = new DefaultContext(steps)
}
static IContext getContext() {
return _context
}
}
That's it! Now we are free to code testable Jenkins steps inside `vars`.
Coding A Custom Jenkins Step
Let's imagine for our example here that we want to add a step to our library that calls the .NET build tool "MSBuild" in order to build .NET projects. To do this we first add a Groovy script `ex_msbuild.groovy` to the `vars` folder, named after the custom step we want to implement. Since our script is called `ex_msbuild.groovy`, our step will later be callable as `ex_msbuild` in our Jenkinsfile. Add the following content to the script for now:
def call(String solutionPath) {
// TODO
}
According to our general idea we want to keep our `ex_msbuild` script as simple as possible and do all the work inside a unit-testable class. So let's create a new class `MsBuild` in a new package `org.somecompany.build`:
package org.somecompany.build
import org.somecompany.IStepExecutor
import org.somecompany.ioc.ContextRegistry
class MsBuild implements Serializable {
private String _solutionPath
MsBuild(String solutionPath) {
_solutionPath = solutionPath
}
void build() {
IStepExecutor steps = ContextRegistry.getContext().getStepExecutor()
int returnStatus = steps.sh("echo \"building ${this._solutionPath}...\"")
if (returnStatus != 0) {
steps.error("Some error")
}
}
}
As you can see, we use both the `sh` and `error` steps in our class, but instead of using them directly, we use the `ContextRegistry` to get an instance of `IStepExecutor` and call the Jenkins steps through that. This way, we can swap out the context when we want to unit test the `build()` method later.
Now we can finish our `ex_msbuild` script:
import org.somecompany.build.MsBuild
import org.somecompany.ioc.ContextRegistry
def call(String solutionPath) {
ContextRegistry.registerDefaultContext(this)
def msbuild = new MsBuild(solutionPath)
msbuild.build()
}
First, we set the context with the context registry. Since we are not in a unit test, we use the default context. The `this` we pass into `registerDefaultContext()` will be saved by the `DefaultContext` inside its private `_steps` variable and is used to access Jenkins steps. After registering the context, we are free to instantiate our `MsBuild` class and call the `build()` method, which does all the work.
Nice, our `vars` script is finished. Now we only have to write some unit tests for our `MsBuild` class.
Adding Unit Tests
At this point writing unit tests should be business as usual. We create a new test class `MsBuildTest` inside the test folder with the package `org.somecompany.build`. Before every test, we use Mockito to mock the `IContext` and `IStepExecutor` interfaces and register the mocked context. Then we can simply create a new `MsBuild` instance in our test and verify the behaviour of our `build()` method. The full test class with two example tests:
package org.somecompany.build;
import org.somecompany.IStepExecutor;
import org.somecompany.ioc.ContextRegistry;
import org.somecompany.ioc.IContext;
import org.junit.Before;
import org.junit.Test;
import static org.mockito.Mockito.*;
/**
* Example test class
*/
public class MsBuildTest {
private IContext _context;
private IStepExecutor _steps;
@Before
public void setup() {
_context = mock(IContext.class);
_steps = mock(IStepExecutor.class);
when(_context.getStepExecutor()).thenReturn(_steps);
ContextRegistry.registerContext(_context);
}
@Test
public void build_callsShStep() {
// prepare
String solutionPath = "some/path/to.sln";
MsBuild build = new MsBuild(solutionPath);
// execute
build.build();
// verify
verify(_steps).sh(anyString());
}
@Test
public void build_shStepReturnsStatusNotEqualsZero_callsErrorStep() {
// prepare
String solutionPath = "some/path/to.sln";
MsBuild build = new MsBuild(solutionPath);
when(_steps.sh(anyString())).thenReturn(-1);
// execute
build.build();
// verify
verify(_steps).error(anyString());
}
}
You can use the green play buttons on the left of the IntelliJ code editor to run the tests, which hopefully turn green.
Wrapping Things Up
That's basically it. Now it's time to set up your library with Jenkins, create a new job and run a Jenkinsfile to test your new custom `ex_msbuild` step. A simple test Jenkinsfile could look like this:
// add the following line and replace necessary values if you are not loading the library implicitly
// @Library('my-library@master') _
pipeline {
agent any
stages {
stage('build') {
steps {
ex_msbuild 'some/path/to.sln'
}
}
}
}
Obviously there is still a lot more I could have talked about (things like the finer points of unit tests, dependency injection, Gradle, Jenkins configuration, building and testing the library with Jenkins itself, etc.), but I wanted to keep this already very long blog post somewhat concise. I do hope, however, that the general idea and approach became clear and help you in creating a unit-testable shared library that is more robust and easier to work on than it normally would be.
One last piece of advice: the unit tests and Gradle setup are pretty nice and help a ton in easing the development of robust shared pipelines, but unfortunately there is still quite a bit that can go wrong inside your pipelines even though the library tests are green. Things like the following, which mostly happen because of Jenkins' Groovy and sandbox weirdness:
- a class that does not implement `Serializable`, which is necessary because "pipelines must survive Jenkins restarts"
- using classes like `java.io.File` inside your library, which is prohibited
- syntax and spelling errors in your Jenkinsfile
Therefore it might be a good idea to have a Jenkins instance solely for integration testing, where new and modified `vars` scripts can be tested before going "live".
Again, feel free to write any kind of questions or feedback in the comments and take a look at the completed, working example library on GitHub.
Top comments (32)
This is a great article. My only criticism is that I wish I would have discovered it a couple of months ago before I coded absolutely everything in vars. I'm now in the middle of a massive refactor and it's all your fault. :-)
Uh oh, I've done exactly the same thing because I really can't wrap my head around the Java package structure (I'm not a Java guy); it makes a lot of sense to me to just have everything in `/vars` as single Groovy scripts and call them as steps. How did the refactor go? Was it worth it?

It was absolutely worth it (for me) because I'm a big proponent of unit testing. It just bugged me to have a huge repository of "untested" code that my whole enterprise is depending on. Our shared library has something like 80% unit test coverage right now (post refactor) and a deployment pipeline that makes sure that it doesn't regress.
Of course, your mileage will vary. I'm certainly not going to judge you if you're comfortable having all of your code in vars.
I agree completely in theory; I was just saying to the bosses today that
Is there a general pattern that develops when doing a refactor like this?
Whenever I start feeling like I'm some sort of Jenkins guru, I come across an article like this that puts me in my place.
😅 Thanks for your feedback 👍
Thanks for the article Adrian!
Nice implementation. One piece of feedback: in the test case it is accepting any string as valid input. I was wondering if it would be better if the test case asserted the specific string:
verify(_steps).sh("echo \"building some/path/to.sln...\"")

Good catch, it probably would. Looks like I was a little bit lazy at the time of writing and just decided to go with `anyString()` ;) Thanks for the feedback

Just what I'm looking for!
I have a question: do you configure the library as global or non-global? I ran into some access control issue with Script Security while setting this up as a non-global so I'm assuming it should be global?
If that's the case then probably should be mentioned in the article :)
Interesting. I indeed only tested this with a global library. I will look into this as soon as I have some spare time. If you would like to help you can open an issue on Github.
Thanks for the feedback :)
Edit: Finally got around to doing some testing, but could not reproduce your problems. The example Jenkinsfile worked for me with both a global and a folder-specific library (which per default is untrusted and runs inside the Groovy sandbox). Maybe you added some non-whitelisted classes, which caused your access control issues?
Hi, thanks for getting back :)
The error I get was on calling the static method of `ContextRegistry` though. I'm not entirely sure to be honest, could be some weird versioning thing or something else. I'll try to open an issue when I get around to testing it further.

When I create a class I pass in the workflow script context in order to use steps. This allows me to use the pipeline steps in my class. Did you avoid this because it's hard to test?
I personally feel like it might be a lot of technical debt to add every pipeline step I use to that interface you are using. I probably use 40-100 different steps in our company.
Jep, the main reason is testability. Outside of the Jenkins pipeline (e.g. during regular unit tests), the steps can't be called that easily. The interface therefore allows for mocking of the step calls. Adding all of them (or at least the used ones) to the interface is busy-work for sure, but not technical debt in my opinion. In my experience the most used steps by far are either `bat` or `sh`, which means that the `IStepExecutor` interface should not be that big.

But obviously it's different in your case, and having to mock 40-100 steps would mean a lot of boilerplate that no one really wants to write. You must decide for yourself if the advantage of unit tests is worth the effort. Alternatively you could have a look at this blog post, which details a different approach to testing Jenkins Shared Libraries using jenkins-spock (no `IStepExecutor` interface needed 😉).

Thanks for the article Adrian, I have a question -
Since we are keeping the vars scripts as clean as possible, do you know of any examples of how to perform more complex operations within the Groovy "testable" sources?
Your example MsBuild class is a good start, but I'm struggling to find examples of an approach for performing git checkouts, builds etc. I'm new to JSL so I quite possibly have overestimated what can be wrapped in these classes.
We have tons of repeated code in Jenkinsfiles and I have been tasked with implementing shared libraries. If I can do this and create testable code, that would be very cool...
Many thanks.
Hi, thanks for your comment :D Let me try to answer some questions: most of these "complex operations" (like the git checkouts and builds you mention) ultimately boil down to command-line calls, e.g. `git clone` or `cargo build`. The latter could look somewhat like this inside the library:
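(A rough sketch - `CargoBuild` is just an illustrative name here, following the same pattern as the `MsBuild` class from the article:)

// vars/build.groovy
import org.somecompany.build.CargoBuild
import org.somecompany.ioc.ContextRegistry

def call() {
    ContextRegistry.registerDefaultContext(this)
    new CargoBuild().build()
}

// src/org/somecompany/build/CargoBuild.groovy
package org.somecompany.build

import org.somecompany.IStepExecutor
import org.somecompany.ioc.ContextRegistry

class CargoBuild implements Serializable {
    void build() {
        IStepExecutor steps = ContextRegistry.getContext().getStepExecutor()
        // wrap the command-line call and fail the build if it returns a non-zero status
        int returnStatus = steps.sh("cargo build")
        if (returnStatus != 0) {
            steps.error("cargo build failed")
        }
    }
}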
Inside your Jenkinsfiles, instead of writing `cargo build` each time, you could then just write `build` (if your var script is called `build.groovy`, of course). This is a very basic example (and only saves a single word), but in a real-world example the library can streamline the build process inside your organization and abstract away some of the more complicated command-line calls (e.g. by setting specific build parameters by default, which in turn can be omitted inside the Jenkinsfile). You could, for instance, end up with var scripts like `ex_build.groovy`, `ex_test.groovy`, `ex_metrics.groovy` and `ex_deploy_artifacts.groovy`. Any Jenkinsfile, which before might have been very long and filled with strange parameters, paths and commands, could (more or less) be reduced to a handful of these custom steps (of course, this is still simplified).
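(A sketch of what such a Jenkinsfile might look like - the stage names and custom steps are of course just placeholders:)

pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                ex_build()
            }
        }
        stage('test') {
            steps {
                ex_test()
            }
        }
        stage('metrics') {
            steps {
                ex_metrics()
            }
        }
        stage('deploy') {
            steps {
                ex_deploy_artifacts()
            }
        }
    }
}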
Imo, the whole challenge of writing Jenkins shared libraries is collecting and wrapping all command-line calls necessary for your builds and making fitting abstractions that simplify your Jenkinsfiles and streamline your organization's build process. In the beginning this can be very difficult. (To be honest, before I was tasked with setting up Jenkins pipelines, I had almost no clue which parts were involved in building our software. For me "a build" was just pressing the green play button in Visual Studio 🙈).
So, in your case, I would probably take a good, long look at your current Jenkinsfiles, write down which tools (e.g. MSBuild, cargo, npm, dotnet, Maven, Gradle) are used and how (parameters passed to the tools, which commands repeat between Jenkinsfiles, etc.). As soon as you have a good understanding of the current build process, you can start to extract the command-line calls of these tools into your library by creating fitting var scripts (e.g. `ex_build.groovy`) which in turn call a unit-tested "build" class. This class simply wraps the command-line call to the build tool (with `steps.sh("<the command>")` or, if you are using Windows-based build agents, `steps.bat("<the command>")`). That's basically it.

I hope this wall of text clears things up a bit 😅 If not, don't be shy to ask any new questions 👍
Wow.. What a fantastic reply. You do make perfect sense. I will be following your advice tomorrow. Starting by trawling through our jenkinsfiles and many many bash scripts to see what we can DRY (so to speak)
Thank you for expanding with more examples and so much detail. It's also nice to hear you started from a similar place.
Many thanks again
Regards
Patrick
I don't hear that very often 😄 Glad I could help
Great article! But one thing I could not wrap my head around: how do you mock `steps.node`? I can't figure out how to test my `sh` command, which is inside of a `node`...

You are using a Scripted Pipeline, right? Not sure how you could mock the `node` step (or any step that uses closures, for that matter) as I'm only familiar with Declarative Pipelines. Sorry 🙈 Keep in mind that the goal of this article is to test your Jenkins Pipeline Library, not the pipeline itself (aka the Jenkinsfile).
Thank you for the reply.
Yes I'm using the Scripted Pipeline, but I use my SL to do some things on nodes. Anyway, I figured it out just now.
The following is my current code to mock `node("<name>") {}`:
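(Simplified to the essential part - this sketch assumes the `IStepExecutor` interface gets a `void node(String label, Closure body)` method; the Mockito stub then just executes the closure it receives:)

import org.mockito.stubbing.Answer

import static org.mockito.ArgumentMatchers.any
import static org.mockito.ArgumentMatchers.anyString
import static org.mockito.Mockito.doAnswer

// _steps is the mocked IStepExecutor from the test setup
doAnswer({ invocation ->
    // run the closure that the code under test passes to node()
    (invocation.getArgument(1) as Closure).call()
    return null
} as Answer).when(_steps).node(anyString(), any(Closure))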
My problem was to figure out what signature the `node` method had. This was caused by the fact that I did not know that `node("<name>") {}` is the same as `node("<name>", {})`.

So for anyone wondering how to mock the closure methods, this is the way.
@kuperadrian maybe you could include this in your article to help anyone who is in need of mocking a closure :)
Hi Adrian
Thank you very much for the article.
I'm maintaining a shared library that I'd like to test more. I like your solution very much.
Some of my vars use the openshift client plugin. github.com/openshift/jenkins-clien...
I was looking for a way to mock the calls to the openshift variable, but didn't find a way how to do that.
Have you ever tried something like that? / Do you have an idea how to do that?
Hi Adrian,
I hope you're still reading here. I still have a problem understanding how the steps are mocked and used in your example. I've managed to rework the testing to move to Spock and get rid of all that Java stuff.
Now the following: I created a ProjectVersionGetter class to encapsulate the fetching of the current version (which in turn is done by calling a Gradle task in the project). For simplification and testing I return a hard-coded 3.0.0.0 using the following class:
The vars file looks like this:
So, I provide a configuration Map where the version should be kept. In my declarative pipeline, I've got a stage like this:
My simple Spock test shall verify that the `projectVersion` key is set in the configuration map and - for the matter of simplicity - that it is valued to "3.0.0.0". The key is set, but to `null`. So, the main problem I have is with the line

So, I added a function to IStepExecutor and the implementation to cover returning the stdout as a String, so I can retrieve the version from the "gradle call" (in the future):
The implementation is
Can you, or someone point me to the mistake I made?
Thanks a lot,
ferdy
Great article Adrian!
My team has a huge shared lib but we have 0 unit tests. I've been trying to find a way to test our library, I'm going to try and follow your examples.
Thanks a lot.
Cool, glad I could help :D
Thanks for the great article, Adrian! It will go in my bookmarks!
I have a huge shared library with zero tests and every run of the deployment pipeline is killing a small piece of my soul. Hopefully, I'll try to cover it with unit tests.
Thanks for mentioning the Serializable thing, it is a pain for beginners.
Jenkins pipelines can be exasperating indeed 🙈 Thanks for your comment 👍