Wayne Greene

Pragmatic Approach to Moving from Jenkins Pipelines to DevOps Pipelines

Contemplating moving from a Jenkins pipeline or the Jenkins Blue Ocean pipeline to something completely different? Is your colleague encouraging you to ditch Jenkins? Some 80% of development teams have multiple Jenkins instances and are struggling to find a path forward to an improved model, tool, or architecture. Why is that? This blog explains why, and shows how to move forward WITH your Jenkins to an end-to-end DevOps pipeline.

The journey from custom scripts to Jenkins, to the Blue Ocean plugin

In the beginning, there were custom scripts, and all was good. Jenkins was a great start toward Continuous Integration (CI) over the past decade and has been a workhorse for development teams. Developers could support just about any usage paradigm or corner case they wanted. Jenkins was, and still is, ubiquitous across teams small and large. Whether a product or service is a legacy app or a new cloud-native microservice, Jenkins was always there to assist. Whether you were a developer, in the newer DevOps role, in QA/Test, or in charge of deploying code to production via Continuous Deployment (CD), Jenkins pretty much enabled you to accomplish your automation.

At some point, your process expanded to multiple steps and you integrated a wide variety of tools, and Jenkins helped you stitch them together into a Jenkins Pipeline. With well over 100,000 Jenkins instances out there, there was a large community of developers you could depend on. You could even take your skills a mile further down the road and get a job doing much the same thing in slightly nicer digs.
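
To make that concrete, here is a minimal sketch of the kind of multi-stage declarative Jenkinsfile this describes. The stage names, Gradle commands, and deploy script are illustrative placeholders rather than any particular team's setup.

```groovy
// Minimal declarative Jenkinsfile sketch: build, test, and deploy stages.
// Commands and script paths are hypothetical placeholders.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew assemble'        // hypothetical build command
            }
        }
        stage('Test') {
            steps {
                sh './gradlew test'            // hypothetical test command
            }
        }
        stage('Deploy') {
            when { branch 'main' }             // only deploy from the main branch
            steps {
                sh './scripts/deploy.sh'       // hypothetical deploy script
            }
        }
    }
}
```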

Inevitably, Jenkins was so popular that everybody wanted one. And guess what, they got it. I have heard of organizations with over 1,000 Jenkins instances, each one doing exactly what its owner wanted. Just imagine the potential nightmare of pipelines crossing instances, owners, and teams.

While Jenkins pipelines fit right into a coding-centric point of view, many developers building that automation wanted a visual representation of the pipeline. Blue Ocean was an attempt at that, born over five years ago.

“Blue Ocean is a project that rethinks the user experience of Jenkins, modeling and presenting the process of software delivery by surfacing information that's important to development teams with as few clicks as possible, while still staying true to the extensibility that is core to Jenkins.” - https://www.jenkins.io/blog/2016/05/26/introducing-blue-ocean/

From the Jenkins Blue Ocean plugin to DevOps pipelines

Five and a half years later, we still don't have an easy-to-use, highly functional graphical pipeline in Jenkins. There are many issues with Jenkins that make a built-in graphical pipeline covering CI/CD a tall task; look at this blog from Mark Hornbeek. The developer, test, and release management communities have moved toward a DevOps pipeline that supports the major principles of DevOps. In the multi-year gap since Blue Ocean, the need developed for a DevOps pipeline with new and improved constructs. This has led to the rise of other tools to solve for CI, Continuous Test (CT), and CD. The rise of ReleaseIQ, CircleCI, Harness, Spinnaker, Atlassian BitBucket and Bamboo pipelines, GitLab, and Microsoft tooling has begun to fill some of these gaps.

Build a new DevOps pipeline for your business unit with 2 easy purchase orders!

Say what? Yes, you heard me, this is what many shops are hearing (or rather, being told): start over, get rid of your trusted old friend Jenkins, and rip and replace it with a new tool. Hey, it only takes 30 days to migrate! In those 30 days your competitors just ate your lunch, four times over. Who can absorb the time, resources, and opportunity cost of doing this? Did the cloud-native/microservices/K8s triad replace all the existing development paradigms and platforms out there? You see where this is going. We can make major changes in architecture and implementation over time, but it takes time and a pragmatic approach to piecewise migration: many hops from rock to rock across the DevOps pipeline river chasm.

Rip and replace?

I recently spoke to a Fortune 20 software leader who was adamant that the best teams just retire the old stuff as quickly as possible. In my younger days at HP, I whacked the enterprise management toolset from 500 tools down to 150. It took four years, plus the alchemy of reduced funding and layoffs, to get the "buy-in" to do it. Rip and replace in the real world doesn't happen that often. Ripping off the bandaid can disrupt an organization in ways that Agile methods can't really fix. Few organizations can make wholesale changes in tools because the tools are inexorably intertwined with both the process and the culture.

Benefits of modern DevOps pipeline tools

  • Speed - Yes, Jenkins is very extensible, and with the plugin community you might find a solution, or you might not. When the pipeline has to move as fast as the code is changing, relying on Jenkins may not give you the speed you need. Remember, Jenkins was designed for on-premises, traditional applications and architectures. New pipeline automation tools can provide the agility for very fast configuration changes to a new pipeline architecture or tooling.

  • Reduced maintenance - Hands down, Jenkins is notable for its very high maintenance requirements. This is a great example of free open-source software carrying hidden maintenance costs. Supporting and maintaining your own infrastructure, the Jenkins application, plug-ins, and everything else makes Jenkins very sticky. Once you get it working, it works well; make a change and you will be dealing with issues until you can lock it down again.

  • Plugin sprawl - No doubt about it, pluggable architectures are a key design win for any system. However, Jenkins takes that pluggability very far: a typical pipeline can touch a dozen plug-ins. Like stone balancing, who wants to maintain that, or rely on it, when you have a mission-critical hotfix? Modern pipeline tools reduce the complexity of the installation through native support.

  • Debugging - Visibility when something goes wrong in your pipeline is a must-have. While all the logs may be present, modern tools provide greatly increased visibility when the inevitable happens.

  • Notification - Getting notified (via Slack or another channel) when something is amiss is critical in an always-on world. When you get those Jenkins plugins dialed in, that is good. But what about when you have different tools for CI vs. CT vs. CD? How does that all work together? (A sketch of a pipeline failure notification follows this list.)

  • Built-in analytics and dashboards - Jenkins is an automation framework; obtaining the analytics layer and visualization in Jenkins is hard enough as it is. Now consider gaining that visualization across multiple tools.

  • Tribal knowledge and excessive customization - I have bumped into a few shops that had very little idea how their CI pipeline worked or what it actually did. The people who built that heavily customized solution are long gone. Sound familiar?

  • High availability and scale-out - It is important that your team owns its code, services, and architectures. On top of that, larger installations need to support bigger compute grids with scale-out and HA. Add even more time to supporting Jenkins at scale. Modern SaaS tools handle this for you.

  • Manage, coordinate, and orchestrate across multiple Jenkins - How many complex applications use just a single Jenkins instance for CI, running tests, and CD? Teams typically have different Jenkins instances for different purposes: one for Dev, one for Production, and so on. Different teams across development, test, and DevOps each run their own Jenkins. How can you see across all of this? This is where modern CI tools shine, bridging and orchestrating the gaps and providing the visibility modern teams need. (A sketch of one Jenkins triggering a job on another also follows this list.)
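
To illustrate the notification point above, here is a hedged sketch of a Jenkinsfile that posts to Slack when a build fails. It assumes the Slack Notification plugin (which provides the slackSend step) is installed and configured; the channel name and build command are placeholders.

```groovy
// Sketch: notify a Slack channel when the pipeline fails.
// Assumes the Slack Notification plugin is installed and configured.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make build'   // hypothetical build command
            }
        }
    }
    post {
        failure {
            // Channel name is a placeholder; env.* values are standard Jenkins build variables.
            slackSend channel: '#ci-alerts',
                      color: 'danger',
                      message: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER} (${env.BUILD_URL})"
        }
    }
}
```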
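
Likewise, to illustrate coordination across multiple Jenkins instances, here is a sketch of one pipeline triggering a job on a second Jenkins through the standard remote build API. The credentials ID, hostname, and job name are hypothetical, and it assumes an API token stored in Jenkins as a username/password credential.

```groovy
// Sketch: trigger a job on a *different* Jenkins instance via its remote build API.
// Host, job name, and credentials ID are hypothetical placeholders.
pipeline {
    agent any
    stages {
        stage('Trigger downstream Jenkins') {
            steps {
                withCredentials([usernamePassword(credentialsId: 'prod-jenkins-api-token',
                                                  usernameVariable: 'JENKINS_USER',
                                                  passwordVariable: 'JENKINS_TOKEN')]) {
                    // POST to the remote instance's build endpoint using the API token.
                    sh 'curl -fsS -X POST -u "$JENKINS_USER:$JENKINS_TOKEN" https://prod-jenkins.example.com/job/deploy-service/build'
                }
            }
        }
    }
}
```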

Pragmatic approach to moving from Jenkins pipelines to DevOps pipelines

Back to the original discussion: do you rip and replace your multi-year investment in Jenkins tooling, plugins, infrastructure, and processes that is running fairly well in your environment? You may have seen, or been pitched by, those vendors who ask you to replace that environment with their shiny new automation framework. Yes, you will get future value, but at what cost to your current situation?

A more practical and pragmatic approach is one where you and your team get end-to-end DevOps pipelines by overlaying a solution like ReleaseIQ on top of your existing tools. Take your complete pipeline tooling: Jenkins, CircleCI, Bamboo, Bitbucket Pipelines, Spinnaker, JFrog, and others. Overlay the ReleaseIQ Automation and Orchestration Platform on top of that. You get all the advantages of orchestrating the end-to-end pipeline: unified visibility across all the tools, troubleshooting when something goes wrong, and a normalized set of analytics and dashboards to improve your DevOps practices. AND THEN you can decide, over time and at the pace you're comfortable with, to change the underlying tooling. No need for a discontinuity in your pipelines and processes.

Learn more about the ReleaseIQ Essentials for Jenkins. If you are building multi-tool DevOps pipelines, check out our Premium Edition capabilities.
