
Set up a Continuous Integration/Delivery system in just 4 steps with Jenkins Pipelines and Blue Ocean

Juan Alonso ・ 4 min read

Now that the first stable version of the Blue Ocean plugin has been released, it is easier and cooler than ever to set up Continuous Integration/Delivery with Jenkins. Jenkins itself has also evolved quite a lot in recent years, making it possible to set up your automation server and builds in just a couple of steps.

First of all, let’s clarify some of the concepts that we will be using in this tutorial:

Multibranch Pipelines: Pipelines, and especially multibranch pipelines, are a game changer in Jenkins. Thanks to this plugin, you can simply set up your repository URL and Jenkins will identify all your branches. It will also automatically start new builds as soon as new commits are pushed, so no more webhooks or cumbersome configuration are needed.

Jenkinsfile: Together with Pipelines, the new concept of the Jenkinsfile was introduced. This is a file that you create in your repo and that contains your pipeline configuration. Jenkins then looks for this file in your branches and executes the build according to the stages defined there. That makes it possible to keep your pipeline configuration together with your project and under version control.

Blue Ocean: This plugin is a must for everyone using Jenkins Pipelines. Blue Ocean is an open-source plugin that rethinks the user experience of Jenkins. Its most amazing feature is the beautiful user interface for Pipelines, allowing fast and intuitive comprehension of each build's status. If the awful user experience of Jenkins was holding you back from using it, there is no reason for that anymore.

Ok, so now that these concepts are clear, let's see how we can get our Continuous Delivery environment up and running in just 4 steps:

1. Jenkins installation:

You can follow the installation steps for your OS or Docker in the official docs:

For this post, let's assume that you are using Ubuntu. Log in to your system as a sudo user and execute the following commands:

wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update
sudo apt-get install jenkins

This automatically creates a jenkins user and a daemon that listens on port 8080. That means your Jenkins system is now reachable at http://<your_ip>:8080. If you are testing locally, that would be http://127.0.0.1:8080

If you do not know your server's public IP, you can execute ifconfig to get this information.

Open the URL and follow the installation wizard. We recommend selecting the default option to install the suggested plugins.

Installation Wizard

2. Install Plugins

  • Go to Manage Jenkins > Manage Plugins > Available and filter by Blue Ocean.
  • Select it and install it.
  • If you use Bitbucket for your repository, you must also install the Bitbucket Branch Source plugin.

Plugins

3. Jenkinsfile

  • In your project repository root, create a new file called Jenkinsfile. This file is written in Groovy and defines your pipeline stages. For this tutorial, just copy the following dummy example and paste it into the Jenkinsfile you just created.
  • Save it, commit and push.
node {
    // Clean workspace before doing anything
    deleteDir()

    try {
        stage ('Clone') {
            checkout scm
        }
        stage ('Build') {
            sh "echo 'shell scripts to build project...'"
        }
        stage ('Tests') {
            parallel 'static': {
                sh "echo 'shell scripts to run static tests...'"
            },
            'unit': {
                sh "echo 'shell scripts to run unit tests...'"
            },
            'integration': {
                sh "echo 'shell scripts to run integration tests...'"
            }
        }
        stage ('Deploy') {
            sh "echo 'shell scripts to deploy to server...'"
        }
    } catch (err) {
        currentBuild.result = 'FAILED'
        throw err
    }
}
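The example above uses the scripted pipeline syntax. Jenkins also supports a declarative syntax, which is what Blue Ocean's visual editor works with. Purely as an illustrative sketch (not part of the original tutorial), the same stages could look like this in declarative form:

```groovy
// Sketch of a declarative equivalent of the scripted example above.
pipeline {
    agent any
    stages {
        stage('Clone') {
            steps {
                checkout scm
            }
        }
        stage('Build') {
            steps {
                sh "echo 'shell scripts to build project...'"
            }
        }
        stage('Tests') {
            parallel {
                stage('static') {
                    steps { sh "echo 'shell scripts to run static tests...'" }
                }
                stage('unit') {
                    steps { sh "echo 'shell scripts to run unit tests...'" }
                }
            }
        }
        stage('Deploy') {
            steps {
                sh "echo 'shell scripts to deploy to server...'"
            }
        }
    }
}
```

Either style works with multibranch pipelines; declarative trades some flexibility (e.g. the try/catch above) for stricter structure and better Blue Ocean editor support.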

4. Setup Multibranch Pipeline

  • Go to the Jenkins home page and click on New Item. Give your job a name and select Multibranch Pipeline.

    • If you use Bitbucket, you must select the Bitbucket Team/Project option instead.
  • After that, we need to configure the repository URL and credentials. You can see examples in the following screenshots, depending on your source control system (Bitbucket, GitHub, Git):

    • Very important: select Jenkinsfile as the Build Configuration mode.

    Repository Configuration

  • Click Save. You will notice that Jenkins starts scanning your repo in search of all your branches. What Jenkins is actually doing is looking for Jenkinsfiles inside your branches.

  • Click on the Blue Ocean button in the header.

  • Et voilà! You have just set up a Continuous Delivery system that will automatically scan, build and deploy all the branches in your repository.

  • Now click around the Blue Ocean pipeline branches to see how awesome it looks.

As you might have noticed, our Jenkinsfile just defines the stages and prints some echoes. The idea is that you replace these echoes with your actual scripts. You can see a real example of what we are doing for our Magento 2 builds at the following link:
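To make the replacement concrete, here is a hypothetical sketch of the Build and Tests stages for a PHP project. The composer/phpcs/phpunit commands are placeholders, not part of the original tutorial; substitute whatever your project actually uses:

```groovy
stage ('Build') {
    // Replace the echo with your real build commands, e.g.:
    sh "composer install --no-interaction"
}
stage ('Tests') {
    parallel 'static': {
        // Hypothetical static analysis command
        sh "vendor/bin/phpcs src/"
    },
    'unit': {
        // Hypothetical unit test runner
        sh "vendor/bin/phpunit"
    }
}
```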

You can also find more info about Jenkinsfile options and syntax here:

In a follow-up post, I will explain how you can go one step further and keep only one pipeline configuration for all your projects in a shared repository. Stay tuned!
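As a preview of that idea, Jenkins supports Shared Libraries for exactly this purpose. A minimal sketch, assuming a library named `my-shared-lib` has been registered under Manage Jenkins (both the library name and the `buildMyProject` step are hypothetical):

```groovy
// Jenkinsfile in each project repo, delegating to a shared library.
// 'my-shared-lib' must be configured in Manage Jenkins first;
// buildMyProject is a custom step defined in the library's vars/ folder.
@Library('my-shared-lib') _

buildMyProject()
```

Each project's Jenkinsfile then shrinks to a couple of lines, while the actual stage definitions live in one versioned repository.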

Juan Alonso (@jalogut) — Magento 2 dev, trainer and consultant. 4x Magento Certified.

Discussion


Would you have more info on how those parallel steps run regarding workspace? Do they (those three sh) run on the same copy (folder) or is there any kind of copying?


After reading your post I was able to do some digging and found out some interesting things. Jenkinsfile has a stash/unstash feature and also something called external workspaces; I think this is the way to go regarding "sharing" generated artifacts between stages/steps.


Hi, as far as I know, parallel tasks work on the same workspace by default. As you mentioned, you could modify this behaviour by performing specific actions before or inside the parallel stages.
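For reference, the stash/unstash steps mentioned above let you copy files between workspaces explicitly. A minimal sketch (file names are illustrative):

```groovy
node {
    stage ('Build') {
        sh "echo artifact > build.txt"
        // Save selected files from this workspace under a name
        stash name: 'artifacts', includes: 'build.txt'
    }
    stage ('Tests') {
        parallel 'unit': {
            node {
                // Restore the stashed files into this (possibly different) workspace
                unstash 'artifacts'
                sh "cat build.txt"
            }
        }
    }
}
```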


Just a word of caution: if you have an existing multi-platform build, running the platforms in parallel within the same Jenkins job will cause the job to aggregate e.g. test results across all of them. Blue Ocean is very nice, but has pretty much non-existent support for existing plugins other than what comes with it out of the box.


Hello Martin,

Thanks for pointing that out. As Blue Ocean is quite new, the team is focusing on Pipeline jobs at the moment and trying to be compatible with all pipeline features. It might be that it does not support other plugins yet, but I think this will change in the future. Plugins can be updated to be compatible with Blue Ocean, so little by little their developers can add this additional extension, and not the other way around.

More information here:


Great post! I like the simplicity of Blue Ocean. Very useful to test.

Just a question: I see a command called "mg2-builder" in your GitHub example. Is it an internal tool?

Thanks!


Hi Oscar,

Thanks, glad you like it.

Yes, "mg2-builder" is an internal tool for now. We will make it open source, as we did with magento2-deployment-tool. I just need to find some time for that.


The code is so pretty. Reminds me of how much I hate XAML.