The purpose of this blog post is to give you a high-level overview of what DevSecOps is, along with steps for integrating security into your Azure DevOps pipeline using readily available tasks for commonly used security scanning tools in build and release pipelines.
If you are reading this article, I’m assuming you have already encountered the terms CI and CD and have a fair understanding of them.
Let’s recap on what we mean by each of these terms.
Continuous Integration is the process of automatically building and testing the quality of the code whenever someone on the team commits code to source control. This ensures that the build compiles successfully and that a particular set of unit tests runs without any issues. If the build fails, the person committing the code is notified so they can fix the issues encountered. This is a software engineering practice where developers receive immediate feedback on newly developed code through different types of tests.
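As a minimal sketch, a CI definition in Azure Pipelines YAML that builds and runs unit tests on every commit could look like this (the branch name and project globs are placeholders for a .NET project):

```yaml
# azure-pipelines.yml - minimal CI sketch; branch and project globs are placeholders
trigger:
  branches:
    include:
      - main                         # run CI on every commit to main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: DotNetCoreCLI@2
    displayName: 'Build'
    inputs:
      command: 'build'
      projects: '**/*.csproj'

  - task: DotNetCoreCLI@2
    displayName: 'Run unit tests'
    inputs:
      command: 'test'
      projects: '**/*Tests.csproj'   # a failing test fails the build and notifies the committer
```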
Continuous Delivery vs Continuous Deployment
Both Continuous Delivery and Continuous Deployment are interesting terms. In Continuous Delivery, once CI is done and the code is integrated into your source repository, the code can be deployed automatically and seamlessly to the various stages of the pipeline, ensuring it is production-ready. However, the code is not deployed to production automatically; a manual intervention is required.
Whereas in Continuous Deployment, every build or change that is integrated passes all the quality checks and deployment gates and is deployed from the lower environments through to production automatically, without any human intervention.
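The distinction can be sketched in a multi-stage Azure Pipelines YAML. The stage and environment names below are hypothetical, and the manual approval that makes this Continuous Delivery rather than Continuous Deployment is configured as a check on the production environment in the Azure DevOps UI, not in the YAML itself:

```yaml
# Multi-stage sketch: Delivery vs Deployment differ only in whether the
# production environment carries a manual Approvals check (set in the UI).
stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        steps:
          - script: echo "build + unit tests"

  - stage: DeployToStaging
    dependsOn: Build
    jobs:
      - deployment: Staging
        environment: 'staging'        # hypothetical environment name
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "deploy to staging"

  - stage: DeployToProduction
    dependsOn: DeployToStaging
    jobs:
      - deployment: Production
        environment: 'production'     # add an Approvals check here for Continuous Delivery
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "deploy to production"
```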
CI & CD help you deliver code faster. Great!
But what about security?
DevSecOps is no longer a buzzword, or maybe it still is, but a lot of organizations are shifting gears toward including security in their software development lifecycle.
What is DevSecOps?
Security needs to shift from an afterthought to being evaluated at every step of the process. Securing applications is a continuous process that encompasses secure infrastructure, designing an architecture with layered security, continuous security validation, and monitoring for attacks.
In simple terms, the key focus of DevSecOps is ensuring that the product you are developing is secure right from the time you start coding it, and that security best practices are applied at every stage of your pipeline as an ongoing practice. In other words, security should be treated as a key element from the initial phase of the development cycle, rather than being looked at only at the end, during product sign-off or deployment. This is also called the ‘shift-left’ strategy of security: injecting security into your pipeline at each stage.
How can we achieve security at various stages of the pipeline?
There are multiple stages involved in getting code deployed to your servers or cloud-hosted solutions, right from developers writing it through to the pipeline deploying it.
Let’s now look at a few of them and see how we can integrate security into our pipelines.
Pre-commit hooks/IDE plugins are usually used to find and remediate issues in the code quickly, even before a developer commits the code to the remote repository. Some of the common issues that can be found or eliminated are credentials exposed in code, such as SQL connection strings, AWS secret keys, Azure storage account keys, API keys, etc. Finding these early in the development cycle helps prevent accidental damage.
There are multiple tools/plugins available that can be integrated into a developer’s IDE. A developer can still get around these and commit code bypassing the pre-commit hooks; they are just the first line of defense, not a full-fledged solution for identifying major security vulnerabilities. Some of the pre-commit hook tools include git-secrets and Talisman. Some of the IDE plugins include .NET Security Guard, 42Crunch, etc. You can find more about other tools here:
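As one illustrative way to wire this up locally, the pre-commit framework can run Talisman on every commit. The `rev` tag below is an example only, so pin it to a current release of the hook repository:

```yaml
# .pre-commit-config.yaml - sketch of a local secret-scanning hook
repos:
  - repo: https://github.com/thoughtworks/talisman
    rev: 'v1.32.0'            # example tag; check the repo for current releases
    hooks:
      - id: talisman-commit   # scans staged changes for secrets before each commit
```

After installing the framework, running `pre-commit install` in the repo makes the hook fire on each `git commit`.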
Using secret management for the entire code base is one of the best practices. You may already have a secret management tool such as Azure Key Vault, AWS Secrets Manager, or HashiCorp Vault built into your pipeline for accessing secure credentials. The same secret management should be used by your entire code base, not just the DevOps pipelines.
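For example, in Azure DevOps the built-in Azure Key Vault task can pull secrets into the pipeline at runtime instead of hard-coding them. The service connection, vault, and secret names below are placeholders:

```yaml
steps:
  - task: AzureKeyVault@2
    displayName: 'Fetch secrets from Key Vault'
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder service connection
      KeyVaultName: 'my-keyvault'                  # placeholder vault name
      SecretsFilter: 'SqlConnectionString,ApiKey'  # fetch only the secrets you need
      RunAsPreJob: false

  # Fetched secrets become pipeline variables, e.g. $(SqlConnectionString),
  # and are masked in the logs.
  - script: echo "Use the secret here without printing it"
```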
Software Composition Analysis:
As the name indicates, SCA is all about analyzing the software/code to determine the vulnerable open-source components and third-party libraries that your code depends on.
In the majority of software development cases, only a small portion of the code is written in-house; the rest is imported from, or depends on, external libraries.
SCA focuses not only on determining vulnerable open-source components, but also shows you if any outdated components are present in your repo and highlights issues with open-source licensing. WhiteSource Bolt is a lightweight tool that scans the code, integrates with Azure DevOps, and shares the vulnerabilities and fixes in a report.
SAST (Static Application Security Testing):
While SCA is focused on determining issues in the open-source/third-party components used in our code, it doesn’t actually analyze the code we write ourselves.
That is done by SAST. Some common issues it can find are SQL injection, cross-site scripting, insecure libraries, etc. Using these tools requires collaboration with security personnel, as the initial reports they generate can be quite intimidating and may contain false positives. Checkmarx is one of the SAST tools.
DAST (Dynamic Application Security Testing):
The key difference between SAST and DAST is that while SAST can determine vulnerabilities in the code and third-party libraries, it doesn’t scan the deployed site itself. Some vulnerabilities can’t be determined until the application is deployed to one of the lower environments, like PreProd, and scanned by providing the target site URL. You can run DAST in a passive or an aggressive mode. While a passive test runs fairly quickly, aggressive tests take more time.
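As a rough sketch, a passive baseline scan can also be driven straight from a pipeline script step using ZAP’s weekly Docker image (the target URL is a placeholder; only scan sites you own):

```yaml
steps:
  - script: |
      # Passive baseline scan: spiders the site and reports findings
      # without attacking it. Target URL is a placeholder - only scan
      # sites you own.
      docker run -t owasp/zap2docker-weekly zap-baseline.py \
        -t https://preprod.example.com \
        -m 5                  # limit spidering to 5 minutes
    displayName: 'OWASP ZAP passive baseline scan'
    continueOnError: true     # review findings first rather than failing the run
```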
In general, a manual pen test/DAST takes longer and is performed by a pentester from the security team. A manual test can’t be done every time you check in or deploy code, as pen testing itself takes a fair amount of time.
I have worked on cloud migrations to Azure & AWS, and we would usually raise a request for DAST and pen testing with the security team in the last leg of the migration lifecycle, getting sign-off from them once all identified vulnerabilities were fixed. The security team would typically take a week, sometimes more, to complete the report: they run scripts, test data, and try to break the application to see whether the application we migrated is secure enough. Once the vulnerability report is out, we look at the critical and high issues reported and start fixing them. More often than not, delivery timelines got extended based on the amount of remediation work. With DAST testing using the ZAP Auto Scanner task in Azure DevOps, we can identify and fix issues before they become a bottleneck later.
And security doesn’t just mean DAST/pen testing or code quality alone; the infrastructure you deploy should also be secure. With your environment deployed on Azure, Azure Policies/Initiatives help you govern and put guard rails around your infrastructure by auditing and enforcing the rules you specify. You can enforce policies to make sure your infrastructure meets your desired state. For example, using Azure Policies, you can enforce the use of managed Azure disks only, ensure storage accounts are not publicly accessible, ensure subnets in a particular VNet don’t allow inbound internet traffic, ensure SQL Server firewalls don’t allow internet traffic, etc. These are just a few of the things you can achieve with Azure Policies. We will take a look at how Azure Policies work in another blog post; enabling effective monitoring and alerting is another key aspect.
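As an illustration, a policy assignment can even be automated from a pipeline with the Azure CLI task. The service connection name is a placeholder, and the policy GUID below is meant to be the built-in ‘Audit VMs that do not use managed disks’ definition; verify it with `az policy definition list` before using it:

```yaml
steps:
  - task: AzureCLI@2
    displayName: 'Assign a built-in Azure Policy'
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder service connection
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        # Assign the built-in "Audit VMs that do not use managed disks"
        # policy at subscription scope (GUID assumed; confirm with
        # `az policy definition list` first).
        az policy assignment create \
          --name 'audit-managed-disks' \
          --policy '06a78e20-9358-41c9-923c-fb736d382a4d' \
          --scope "/subscriptions/$(az account show --query id -o tsv)"
```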
Azure DevOps supports integration of multiple open source and licensed tools for scanning your application as a part of your CI & CD process.
In this blog post, we’ll see how to achieve security in our Azure DevOps pipeline using following tools:
- WhiteSource Bolt extension for SCA vulnerability scanning
- Sonarcloud for code quality testing
- OWASP ZAP Scanner for passive DAST testing
1. WhiteSource Bolt for SCA:
Integrating WhiteSource Bolt into your pipeline is pretty straightforward. In this blog post, I’m going to use one of the repos from my previous blog posts.
If you would like to follow along, feel free to clone/import it into your Azure DevOps repo; the steps are in the previous blog post too.
To install WhiteSource Bolt, search for “WhiteSource Bolt” in the Marketplace and follow the series of steps to get it installed in your organization.
It’s all straightforward. I’m jumping straight to the build pipeline, into which we are going to integrate WhiteSource Bolt.
Log in to Azure DevOps, click Pipelines -> Build Pipelines, and edit your build pipeline. You can import the complete project and pipelines from my Git repo; the steps are in my previous blog post, linked above.
Once in the build pipeline, click the “+” icon to add tasks, search for “WhiteSource Bolt”, and add the task.
Leave the default settings; by default, it will scan your root directory.
Save and kick-off a new build.
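If you prefer YAML pipelines over the classic editor, the equivalent step is roughly the following. The exact task name and version come from the Marketplace extension (it has since been rebranded under Mend), so verify them against the task your organization actually installs:

```yaml
steps:
  # WhiteSource Bolt task added by the Marketplace extension; task name/version
  # are assumptions - confirm against the installed extension.
  - task: WhiteSource Bolt@20
    displayName: 'WhiteSource Bolt SCA scan'
    inputs:
      cwd: '$(System.DefaultWorkingDirectory)'   # root directory to scan (the default)
```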
In the build run, you can see the logs of the task.
In your build pipeline section, you will see a new section for WhiteSource Bolt; click on it to view the results after the build completes.
You can also see the results in the build results, under the Report tab.
Notice that it not only shows the vulnerabilities but also the fixes for each of them. Note that this has only scanned the third-party libraries and open-source components in the code, not the deployed application on the target infrastructure.
That can be achieved via DAST testing in the release pipeline using the ZAP Auto Scanner, which we’ll also cover in this blog post.
2. SonarCloud for code quality testing:
Now, let us see how to integrate SonarCloud into the Azure DevOps pipeline. Before adding tasks in Azure DevOps, we need to import our Azure DevOps project into SonarCloud.
You need a SonarCloud account to integrate it into the pipeline. Log in to https://sonarcloud.io/ with your Azure DevOps account and choose your organization.
Select import projects from Azure.
Create a personal access token in Azure DevOps, copy the token, and paste it somewhere safe; we’ll need it later.
Back on the SonarCloud site, provide the personal access token to import the projects and choose the defaults to continue.
Next, generate a token in SonarCloud that will be used in Azure DevOps: once logged in to SonarCloud, go to My Account > Security > Generate Tokens, copy the token, and paste it somewhere safe; we’ll need it later as well.
Select the application project and click ‘Administration’ -> ‘Update Key’ to find the key for the project.
Now, back in Azure DevOps, we need to add the SonarCloud tasks. Just like WhiteSource Bolt, search for SonarCloud in the Marketplace and install it in your Azure DevOps organization.
Unlike WhiteSource Bolt, we need to add three tasks to analyze the code with SonarCloud.
Note that the project I’m analyzing is .NET Core, but the process of adding the steps doesn’t vary much for other technologies.
Add the ‘Prepare analysis on SonarCloud’ task before the Build task.
Provide the following details for the task:
- SonarCloud Service Endpoint: create a new service endpoint by clicking ‘New’, paste the token generated earlier, give the service connection a name, then Save and Verify
- Select the organization
- Add the project key obtained from SonarCloud earlier.
The screenshot below shows how to add a new service connection after clicking ‘New’ in step 1.
Add the ‘Run Code Analysis’ and ‘Publish Quality Gate Result’ tasks, save, and create a build.
The ‘Publish Quality Gate Result’ task is optional, but it can be added to publish the report link and the quality gate status to the pipeline.
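If you are using YAML pipelines instead of the classic editor, the three SonarCloud tasks can be sketched roughly like this (the service connection, organization, and project key values are placeholders):

```yaml
steps:
  - task: SonarCloudPrepare@1
    displayName: 'Prepare analysis on SonarCloud'
    inputs:
      SonarCloud: 'my-sonarcloud-connection'   # placeholder service connection name
      organization: 'my-org'                   # your SonarCloud organization
      scannerMode: 'MSBuild'                   # for .NET projects
      projectKey: 'my-project-key'             # key copied from SonarCloud
      projectName: 'My Project'

  # ... your existing build and test tasks run here ...

  - task: SonarCloudAnalyze@1
    displayName: 'Run Code Analysis'

  - task: SonarCloudPublish@1
    displayName: 'Publish Quality Gate Result'
    inputs:
      pollingTimeoutSec: '300'
```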
Save and initiate a build. Once it runs, you should see logs like the following:
In the build summary, under extensions tab, you can see the link to view the results.
In the screen above, the quality gate status shows as None. The reason is that in SonarCloud, the initial quality gate status shows as “Not computed” for the project we imported.
To fix it, under the Administration tab, choose "Previous Version" and notice that it says ‘changes will take effect after the next analysis’.
Now, the status in the Overview shows “Next scan will generate a Quality Gate”.
Back in Azure DevOps, trigger another build and wait for it to complete.
Now, under the Extensions tab of the build summary, it should show the result status along with a link to view the bugs, vulnerabilities, etc. Click on the "Detailed SonarCloud Report" to view the results.
The beauty of SonarCloud is that you can integrate it into your branch policies for any new pull requests, and also use it as one of the deployment gates for deploying bug-free code to your environments.
3. ZAP Auto Scanner:
One tool to consider for penetration testing is OWASP ZAP. OWASP is a worldwide not-for-profit organization dedicated to helping improve the quality of software. ZAP is a free penetration testing tool for beginners to professionals. ZAP includes an API and a weekly docker container image that can be integrated into your deployment process.
Definition credits: owasp.org
With the ZAP scanner you can run either a passive or an active test. During a passive test, the target site is not manipulated to expose additional vulnerabilities. Passive tests usually run pretty fast and are a good candidate for the CI process. An active scan, on the other hand, simulates many techniques that hackers commonly use to attack websites.
In your release pipeline, click on add to add a new stage after PreProd stage.
Create a new stage with ‘Empty Job’.
Rename it to DAST Testing.
Click on ‘Add tasks’ and add the ‘OWASP Zap Scanner’ task from the Marketplace.
Once done, add the following tasks one after the other.
OWASP Zap Scanner:
- Leave Aggressive mode unchecked.
- Set the failure threshold to 1500 or greater. This makes sure the test doesn’t fail if your site scores higher. The default is 50.
- Root URL to begin crawling: provide the URL that the scan should run against.
- Port: the default is 80; if your site runs on a secure port, provide 443, otherwise leave it as 80.
Word of caution: don’t provide any site URL in the steps above that you don’t own. Crawling sites you don’t own is considered hacking.
NUnit template task: this is mainly used to install a template that the ZAP scanner uses to produce a report.
The inline script used is available in the description of the tool in the Azure Marketplace.
Generate NUnit type file task: this is used to publish the test results in XML format to the owaspzap directory in the default working directory.
Publish Test Results task: this publishes the test results from the previous task.
Make sure you select ‘ubuntu-18.04’ as the agent pool.
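Pulled together in YAML, the DAST stage might look roughly like this. The input names for the scanner task are taken from its Marketplace description and may differ between versions, so treat them as illustrative; the NUnit conversion script (omitted here) is the inline script mentioned above, and the target URL is a placeholder:

```yaml
pool:
  vmImage: 'ubuntu-18.04'                  # the scanner task expects an Ubuntu agent

steps:
  - task: owaspzap@1
    displayName: 'OWASP ZAP scan'
    inputs:
      aggressivemode: false                # passive scan only
      threshold: '1500'                    # failure threshold (default is 50)
      url: 'https://preprod.example.com'   # placeholder; only scan sites you own
      port: '443'

  # The inline conversion script from the task's Marketplace page runs here,
  # turning the ZAP JSON report into NUnit XML under the owaspzap directory.

  - task: PublishTestResults@2
    displayName: 'Publish ZAP test results'
    inputs:
      testResultsFormat: 'NUnit'
      testResultsFiles: 'owaspzap/test-results.xml'
```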
Once everything is done, kick off a release. Make sure the PreProd stage is deployed and the environment is ready before running the DAST Testing stage.
Once the release completes, you should be able to see the results in the Tests tab of the release you created.
With this, we have seen how to integrate security testing using WhiteSource Bolt, SonarCloud, and OWASP ZAP Scanner into our DevOps pipeline at various stages of build and release.
This brings us to the end of this blog post.
Just like DevOps, DevSecOps needs a cultural shift. It needs collaboration from all departments of an organization to achieve security at every level.
Hope you enjoyed reading it. Happy Learning!!
A couple of references I used for writing this blog post: