Regardless of how your CI/CD process is set up, you need a dedicated branch in your project to deploy from (typically the main branch), and every merge from a feature branch into it should be gated by tests and quality checks. That is where unit/integration tests and SonarQube come in.
When we talk about unit and integration testing, we're automatically talking about code coverage: the percentage of code exercised by the tests. We set a threshold for it, and any coverage below that threshold causes the merge request pipeline to fail.
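As a sketch of what such a threshold looks like in practice, coverage.py can enforce one directly through its configuration file (the 80 below is an arbitrary example value, not a recommendation):

```ini
# .coveragerc -- makes "coverage report" exit with a non-zero status
# (failing the CI job) when total coverage drops below 80%
[report]
fail_under = 80
```

The same limit can also be passed on the command line as `coverage report --fail-under=80`.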
In this tutorial, I'll show you how to use GitLab CI to run unit and integration tests, as well as SonarQube scans, when a merge request is created.
Use case
For our use case in this tutorial, we have a Python API built with FastAPI, on which we're going to run unit and integration tests using PyTest and generate the coverage report using Coverage.
PyTest: Pytest is a Python testing framework that originated in the PyPy project. It can be used to write a variety of software tests, such as unit tests, integration tests, end-to-end tests, and functional tests. It offers parametrized testing, fixtures, and assert rewriting.
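A minimal sketch of those features (the `slugify` helper and the test data are made up for illustration):

```python
import pytest

def slugify(title: str) -> str:
    """Hypothetical helper under test: lowercases and hyphenates a title."""
    return "-".join(title.lower().split())

@pytest.fixture
def sample_title():
    # A fixture provides a reusable test input to any test that asks for it
    return "Hello GitLab CI"

def test_slugify_with_fixture(sample_title):
    # Plain asserts are rewritten by pytest to produce detailed failure messages
    assert slugify(sample_title) == "hello-gitlab-ci"

@pytest.mark.parametrize("raw,expected", [
    ("One Two", "one-two"),
    ("  padded  ", "padded"),
])
def test_slugify_parametrized(raw, expected):
    # One test function, run once per parameter set
    assert slugify(raw) == expected
```

Running `pytest` discovers these `test_*` functions automatically.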
Coverage: Coverage (coverage.py) is a Python tool for measuring code coverage. It monitors your program and reports which lines of code were executed by the tests and which were not.
GitLab CI
GitLab CI is one of the most widely used DevOps tools. Built into GitLab, it is used to facilitate and accelerate application delivery through continuous integration methodologies (CI/CD: Continuous Integration, Continuous Delivery, Continuous Deployment).
Pipelines are configured in the root of a project using a version-controlled YAML file called .gitlab-ci.yml. In it you define your pipeline's parameters:
- What to execute
- What happens when a job succeeds or fails
The YAML file is organized into jobs and stages:
- Job: a set of instructions that a runner has to execute.
- Stage: a keyword that groups jobs into phases of the pipeline, such as build and deploy. Jobs in the same stage are executed concurrently.
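As a minimal illustration (the job and stage names below are invented for the example), a pipeline with two stages could look like this:

```yaml
stages:          # stages run in this order
  - build
  - deploy

build-job:
  stage: build
  script:
    - echo "Building..."

deploy-job:
  stage: deploy  # runs only after all jobs in the build stage succeed
  script:
    - echo "Deploying..."
```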
To run our pipeline we need runners: agents or servers that execute each job individually and can be spun up or down as needed.
To run the unit/integration tests, our code might need some environment variables, such as credentials to connect to a test database. To add environment variables in GitLab, follow the steps below:
- Go to the project page in GitLab
- Click Settings
- Go to CI/CD
- Go to the Variables section and click Expand
- Click Add variable; a dialog will open
- Enter the variable name in the Key field and the variable value in the Value field
- Choose whether it should be protected and/or masked, then click Add variable
- Done: the variable is now available to your pipeline jobs
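On the code side, the variable is then read from the job's environment. A sketch, assuming a variable named TEST_DB_URL (an example name, not something GitLab defines):

```python
import os

def get_test_db_url() -> str:
    # Read the CI/CD variable; fall back to a local default for dev machines.
    return os.environ.get("TEST_DB_URL", "sqlite:///./test.db")
```

During a pipeline run, GitLab injects the variable into the job's environment, so no code change is needed between local and CI runs.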
The .gitlab-ci.yml file to run tests and SonarQube checks
Here's the file we used for our use case; it can be adapted to any Python project:
# The stages section is used to specify the order of execution of our stages
stages:
  - test-runner
  - sonarqube-check

# The test-runner job is the job where our tests will be executed
test-runner:
  stage: test-runner
  image:
    name: python:3.8-slim
  before_script:
    - pip install pytest pytest-cov coverage
    - pip install --no-cache-dir -r requirements.txt
  script:
    - coverage run -m pytest
    - coverage report -m
    - coverage xml
  coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
  artifacts:
    paths:
      - coverage.xml
  allow_failure: false
  only:
    - merge_requests

# The sonarqube-check job is the job where SonarQube will check the code quality
sonarqube-check:
  stage: sonarqube-check
  image:
    name: sonarsource/sonar-scanner-cli:latest
    entrypoint: [""]
  variables:
    SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"  # Defines the location of the analysis task cache
    GIT_DEPTH: "0"  # Tells git to fetch all the branches of the project, required by the analysis task
  cache:
    key: "${CI_JOB_NAME}"
    paths:
      - .sonar/cache
  script:
    - sonar-scanner
  allow_failure: true
  only:
    - merge_requests
Now let's walk through what's happening here:
This pipeline contains two stages, each with a single job (a stage can contain one or more jobs), and each job runs in a Docker container.
The stages section: this section specifies the order in which our stages run, using their names. In our case, the pipeline executes the test-runner stage first and, once it finishes, the sonarqube-check stage.
First stage (running the tests):
- The first stage is named test-runner and contains a job with the same name
- The job runs in the python:3.8-slim Docker container, so Python is available
- Before the job runs the tests, we need to install the required packages (pytest, pytest-cov, and coverage) along with the project dependencies; that is done in the before_script section
- Once the packages are installed, we run the tests with the commands in the script section
- Then we specify where the coverage report is placed, in the artifacts section
- allow_failure is set to false, so if any test fails, the job fails
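One detail worth unpacking is the coverage keyword in the job: GitLab applies that regular expression to the job log to extract the total coverage percentage from the `coverage report -m` output. A sketch of how it matches (the report line below is a fabricated example of the TOTAL line coverage.py prints):

```python
import re

# The same pattern as in the job's coverage: keyword
pattern = re.compile(r"(?i)total.*? (100(?:\.0+)?%|[1-9]?\d(?:\.\d+)?%)$")

# Fabricated line in the shape coverage.py prints at the end of its report
report_line = "TOTAL                          120     18    85%"

match = pattern.search(report_line)
print(match.group(1))  # -> 85%
```

GitLab displays the captured percentage on the merge request and can track it over time.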
Second stage (SonarQube checks):
- The second stage is named sonarqube-check and contains a single job with the same name
- The job runs in the sonarsource/sonar-scanner-cli:latest Docker container
- In the variables section, we set the Sonar user home directory so the scanner can store its analysis cache there
- In the cache section, we specify the caching directory
- The only section specifies when this job is allowed to run: here, only on merge requests
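The scanner also needs to know what to analyze; that is usually provided in a sonar-project.properties file at the repository root (the project key below is a placeholder for your own SonarQube setup):

```properties
# sonar-project.properties (values here are placeholders)
sonar.projectKey=my-fastapi-project
sonar.sources=.
sonar.python.coverage.reportPaths=coverage.xml
```

The server URL and authentication token are typically supplied as the SONAR_HOST_URL and SONAR_TOKEN CI/CD variables rather than committed to the repository.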