
Isaac Lenton


Trying out GitHub Actions with Matlab

Recently I noticed that MathWorks has released a set of GitHub Actions for running short Matlab scripts and unit tests. Last year I considered adding CI support for one of my Matlab projects, but working out how to set up the Matlab license proved too difficult for me at the time. Now, with the new actions, I thought: why not try setting up CI again?

This post describes the steps I took to learn the basics of GitHub Actions and add CI to one of my Matlab projects. I've never used GitHub Actions before, so what better time than during the Actions hackathon! The project I chose to add Actions to is a branch of the Optical Tweezers Toolbox.

Part 1: Adding the action to the project

This step was relatively simple: I started by going to the Matlab tests example and copying the GitHub-hosted tests example into a new action on my repository. In my development branch I created a file .github/workflows/runtests.yml:

name: Unit Tests

# Controls when the workflow will run
# Add [skip ci] to the commit to skip running this job
on:
  push:
    # My development branch was called draft162
    branches: [ draft162 ]

jobs:
  test:
    name: Run MATLAB Tests and Generate Artifacts
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository
        uses: actions/checkout@v2
      - name: Set up MATLAB
        uses: matlab-actions/setup-matlab@v1

      - name: Run tests and generate artifacts
        uses: matlab-actions/run-tests@v1
        with:
          test-results-junit: test-results/results.xml
          code-coverage-cobertura: code-coverage/coverage.xml

          # I already had a folder called `tests` containing my
          # existing Matlab unit tests that I would run locally.
          # Each test file matches the format: test*.m
          select-by-folder: tests

As soon as I pushed the commit the action started running! No additional steps to set up Matlab! No additional third-party tools required!

The Action seems to use a sponsored Matlab license, although I'm not sure exactly what this means. The result was that I didn't need to point Matlab at my university's Matlab license, which made things much easier to set up and run.

After about 4 minutes my tests had run and I could view the logs to find out which tests passed and which failed. Overall the process was much easier than I expected. However, browsing through the action logs isn't very convenient for checking whether the tests succeeded, and the coverage report can only be inspected by downloading it or displaying it after the Action runs. It would be nice to have some graphical indication of the testing/coverage status, such as a badge or a success/failure count.

Part 2: Adding a badge

I'd seen testing badges displayed on other repositories that use GitHub Actions. After a quick Google search I found instructions for generating the appropriate URL. Now I can easily display the status of the action from within the project readme or elsewhere in the documentation.
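The badge URL follows a standard pattern based on the workflow file name. A sketch of the markdown for a readme (the user and repository names here are placeholders; the workflow file name matches the one created above):

```markdown
![Unit Tests](https://github.com/<user>/<repo>/actions/workflows/runtests.yml/badge.svg?branch=draft162)
```

The optional `branch` query parameter restricts the badge to a particular branch's workflow runs.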

Later I discovered that I could also have clicked the "Create Status Badge" button within the test results page to automatically format this URL:

Create status badge button location

Part 3: Saving coverage and test reports

The action included two directives to specify output files for the test results and coverage report:

          test-results-junit: test-results/results.xml
          code-coverage-cobertura: code-coverage/coverage.xml

In order to view the contents of these files I needed to create an artifact by adding the upload-artifact action immediately following my testing action, for example:

      - name: Upload results artifact
        uses: actions/upload-artifact@v2
        if: always()
        with:
          name: test-results
          path: test-results/results.xml

In this action I needed to specify the if: always() line to force the step to run even if some of the unit tests failed.

I also tried an alternative approach: adding continue-on-error: true to the testing action and adding an extra step after all the others that returns a non-zero status if the tests failed (see this SO for further details). This approach also worked, but it resulted in a slightly more verbose yaml file.
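A rough sketch of that alternative, assuming the test step is given an id so its outcome can be checked at the end (the step names and id here are my own choices, not from the original workflow):

```yaml
      - name: Run tests and generate artifacts
        id: run-tests
        uses: matlab-actions/run-tests@v1
        continue-on-error: true
        with:
          test-results-junit: test-results/results.xml

      # ... upload/publish steps run unconditionally here ...

      # Fail the job at the end if the test step failed
      - name: Check test outcome
        if: steps.run-tests.outcome == 'failure'
        run: exit 1
```

The final step is what makes the overall job report a failure, since continue-on-error hides the test step's own failure from the job status.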

The result of adding the upload-artifact action was two archives that I could now download from the Action summary page:

Artifact example

I was then able to view these XML files in my favorite viewer.

Part 4: In browser display of testing reports

Downloading the XML files is not the most convenient workflow. It would be useful to have at least a summary in the browser with the current number of passing/failing tests. Taking a quick look at the GitHub marketplace, I found an action that displays exactly this (and more):

GitHub logo EnricoMi / publish-unit-test-result-action

GitHub Action to publish unit test results on GitHub

GitHub Action to Publish Unit Test Results



This GitHub Action analyses Unit Test result files and publishes the results on GitHub. It supports the JUnit XML file format and runs on Linux, macOS and Windows.

You can add this action to your GitHub workflow for Ubuntu Linux (e.g. runs-on: ubuntu-latest) runners:

- name: Publish Unit Test Results
  uses: EnricoMi/publish-unit-test-result-action@v1
  if: always()
  with:
    files: test-results/**/*.xml

Use this for macOS (e.g. runs-on: macos-latest) and Windows (e.g. runs-on: windows-latest) runners:

- name: Publish Unit Test Results
  uses: EnricoMi/publish-unit-test-result-action/composite@v1
  if: always()
  with:
    files: test-results/**/*.xml

See the notes on running this action as a composite action if you run it on Windows or macOS.

Also see the notes on supporting pull requests from fork repositories and branches created by Dependabot.

The if: always() clause guarantees that this action always runs, even if earlier steps (e.g…

I added this action to my yaml file, just as I did for the upload action previously:

      - name: Publish Unit Test Results
        uses: EnricoMi/publish-unit-test-result-action/composite@v1
        if: always()
        with:
          files: test-results/results.xml
          report_individual_runs: "true"

Running this action adds an additional tab to the action summary which has annotations for each failed test.

Test report example

The report_individual_runs option seems to be needed to get GitHub to display an annotation for each test (as opposed to just one annotation for the whole file).

Summary

So far I've been able to set up an action to test most of the code. There are still many things that could be improved, including:

  • Work out how to display coverage reports. I tried using Code Coverage Summary but couldn't work out why it wasn't running.
  • Look into speeding up CI time by running tests only for the affected directories.
  • CI for graphical user interfaces (not yet supported by the test runner).
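For the second item, a possible starting point is GitHub's built-in path filtering, which skips the workflow entirely when no matching files change (the folder names below are taken from the toolbox layout; splitting tests into per-folder jobs would need more work than this):

```yaml
on:
  push:
    branches: [ draft162 ]
    # Only run when source or test files change
    paths:
      - '+ott/**'
      - 'tests/**'
```

This doesn't select which tests run, but it avoids spending CI minutes on documentation-only commits.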

My Workflow

Actions used:

Submission Category:

Wacky Wildcards or Maintainer Must-Haves, not sure which is the better fit.

Yaml File or Link to Code

Full yaml file (and a couple of extras):

# Action to run tests on the repository

name: Unit Tests

# Controls when the workflow will run
# Add [skip ci] to the commit to skip running
on:
  push:
    branches: [ draft162 ]

jobs:
  test:
    name: Run MATLAB Tests and Generate Artifacts
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository
        uses: actions/checkout@v2
      - name: Set up MATLAB
        uses: matlab-actions/setup-matlab@v1

      - name: Run tests and generate artifacts
        uses: matlab-actions/run-tests@v1
        with:
          test-results-junit: test-results/results.xml
          code-coverage-cobertura: code-coverage/coverage.xml

          # Exclude GUI, not yet supported by test runner
          select-by-folder: >
              tests/beam; tests/bsc; tests/drag; tests/dynamics;
              tests/examples; tests/particle; tests/shape; tests/tmatrix;
              tests/tools; tests/utils
          # Folders we should run coverage reports on
          source-folder: +ott; examples

      - name: Publish Unit Test Results
        uses: EnricoMi/publish-unit-test-result-action/composite@v1
        if: always()
        with:
          files: test-results/results.xml
          report_individual_runs: "true"

      # Not sure why, but this doesn't seem to work :(
      - name: Code Coverage Summary Report
        uses: irongut/CodeCoverageSummary@v1.2.0
        if: always()
        with:
          filename: code-coverage/coverage.xml
          format: markdown
          badge: true

      - name: Upload results artifact
        uses: actions/upload-artifact@v2
        if: always()
        with:
name: test-results
          path: test-results/results.xml

      - name: Upload coverage artifact
        uses: actions/upload-artifact@v2
        if: always()
        with:
          name: code-coverage
          path: code-coverage/coverage.xml

Additional Resources / Info

This page describes the Actions added to a branch of the Optical Tweezers Toolbox, a toolbox for modelling particles trapped in optical fields. The Actions should hopefully be merged into the master branch sometime later this year.

GitHub logo ilent2 / ott

Optical Tweezers Toolbox (Version 1)

ott - Optical Tweezers Toolbox


The optical tweezers toolbox can be used to calculate optical forces and torques of particles using the T-matrix formalism in a vector spherical wave basis. The toolbox includes codes for calculating T-matrices, beams described by vector spherical wave functions, functions for calculating forces and torques, simple codes for simulating dynamics, and examples.

We are currently working on documentation and we welcome feedback/suggestions/comments. Additional documentation can be found via the Matlab help command or in the source code.

Installation and usage

There are several methods for installing the toolbox. If using Matlab, the easiest method is to launch Matlab and navigate to Home -> Addons -> Get-Addons and search for "Optical Tweezers Toolbox". Then, simply click the "Add from GitHub" button to automatically download the package and add it to the path. Alternatively, you can download the toolbox directly from the GitHub repository or select a…
