Getting started with the Azure PowerShell Docker image and GitHub Actions

Olivier Miossec ・7 min read

You may have noticed that we now have an official Docker image for the Azure PowerShell module.
These images are updated almost every day, and the latest image contains the latest version of the Azure PowerShell module.

The base image is powershell:ubuntu-18.04 (variants based on CentOS 7 and Debian 9 also exist), so it runs only on the Linux platform. The PowerShell version installed is 6.2.4 or 7.0, depending on the image tag.
You can use it on Linux or with Docker Desktop on Windows 10.
The latest image corresponds to the latest version of the Azure PowerShell module (currently 3.6.1), but you can use tags to select other versions (currently the 3.4.0 and 3.5.0 tags).
To use it:

docker run -it mcr.microsoft.com/azure-powershell:3.4.0-ubuntu-18.04

You can also execute a script from the local machine. Suppose you have a scripts directory in your home folder; you can add it as a volume and mount it on /src:

docker run -it -v ~/scripts:/src mcr.microsoft.com/azure-powershell pwsh -file /src/test.ps1

It is possible to use a saved context from the host machine in the container. But first, the Azure Context needs to be generated on the host computer.
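With Az module versions contemporary with this article, signing in on the host is enough to create the context files; a minimal sketch (the Save-AzContext path below is illustrative):

```powershell
# Sign in on the host machine; the Az module persists the context
# (AzureRmContext.json and the token cache) under ~/.Azure.
Connect-AzAccount

# Optionally, save a context explicitly to a file of your choice:
Save-AzContext -Path ~/my-context.json   # illustrative path
```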


This command will create several files in the .Azure directory in your home folder.
You can use these files in the container by mounting a volume.

docker run -it -v ~/.Azure/AzureRmContext.json:/root/.Azure/AzureRmContext.json -v ~/.Azure/TokenCache.dat:/root/.Azure/TokenCache.dat -v ~/scripts:/src mcr.microsoft.com/azure-powershell pwsh -file /src/GetRgList.ps1

Where GetRgList.ps1 contains

Get-AzResourceGroup | Select-Object ResourceGroupName, Location

While this can be useful in some circumstances, such as testing a script in another environment or discovering breaking changes in a module update, one of the best use cases is in a CI/CD chain.

GitHub Actions lets you use Docker images to build custom tasks in a workflow. To create a workflow from the Azure PowerShell image, you need to create two directories in your GitHub repository, psaz-361 and psaz-330 (one for each version of the Azure PowerShell module).
In each folder we need to add:

  • A Dockerfile to set up the container
  • A YAML file, action.yml, to describe how to use the image in the workflow
  • A PowerShell script, entrypoint.ps1, to perform your custom actions

The Dockerfile for the psaz-361 folder:

FROM mcr.microsoft.com/azure-powershell:3.6.1-ubuntu-18.04

ENV PSModulePath /usr/local/share/powershell/Modules:/opt/microsoft/powershell/7/Modules:/root/.local/share/powershell/Modules

RUN pwsh -c Install-Module -Name Pester -Force

ADD entrypoint.ps1 /entrypoint.ps1

ENTRYPOINT ["pwsh", "/entrypoint.ps1"]

For the psaz-330 folder, you need to use the 3.4.0 image:

FROM mcr.microsoft.com/azure-powershell:3.4.0-ubuntu-18.04

ENV PSModulePath /usr/local/share/powershell/Modules:/opt/microsoft/powershell/7/Modules:/root/.local/share/powershell/Modules

RUN pwsh -c Install-Module -Name Pester -Force

ADD entrypoint.ps1 /entrypoint.ps1

ENTRYPOINT ["pwsh", "/entrypoint.ps1"]

You need to modify the PSModulePath environment variable because the Azure PowerShell module is located in /root/.local/share/powershell, but the instance runs under the GitHub context, where the default home path is /github/home/ instead of /root.

If you don't change the module path, the Azure PowerShell module will not load during the GitHub workflow.
Pester is installed in the image because we will use it later in the entrypoint script.
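To see the paths for yourself, you can run the image interactively; this quick check (not from the original article) prints where PowerShell looks for modules and where Az.Accounts is actually installed:

```powershell
# Run inside the azure-powershell container (pwsh):
$env:PSModulePath -split ':'                       # module search paths
(Get-Module -ListAvailable Az.Accounts).ModuleBase  # actual install location
```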
Now that we have our images, we need to think about how to connect to Azure from the instance in the GitHub Action. We cannot use a standard user account, as it may have restrictions like MFA that will prevent it from connecting. Instead, you need a service principal.
There are two options to create a service principal for GitHub Actions: using a service principal secret or using a certificate.
The service principal secret is the easiest way.

$ServicePrincipal = New-AzADServicePrincipal -DisplayName "GithubAction-demo" -SkipAssignment

We can now assign a role to our application:

New-AzRoleAssignment -RoleDefinitionName Reader -ServicePrincipalName $ServicePrincipal.ApplicationId

We need to extract the service principal secret to use it later.

$SecureStringBinary = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($ServicePrincipal.Secret)
$ServicePrincipalSecret = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($SecureStringBinary)

But we cannot put the password and the application ID directly into the PowerShell script; that would not be secure.
GitHub Actions supports the use of secrets in your workflows. Open one of your repositories, go to Settings and then Secrets. You can add secrets by clicking Add a new secret.
For the connection with a service principal, you need to create three secrets: APPID for the application ID, TENANTID for the tenant ID, and SERVICEPRINCIPALSECRET for the service principal password.
Now we need to pass these three values to the instance. Each workflow run must start with these parameters.

The action metadata file, action.yml, defines how the workflow should use the action: what is expected as input, what the output will be, and which action to execute, the entrypoint.ps1 script.

name: 'AzurePowerShellAction'
author: 'omiossec'
description: 'Perform Pester Tests'
branding:
  icon: 'cloud'
  color: 'blue'
inputs:
  directory:
    description: 'Directory to test'
    default: "."
    required: false
  appID:
    description: 'The Service Principal Application ID'
    required: true
  spSecret:
    description: 'The Service Principal Secret'
    required: true
  tenantID:
    description: 'The Tenant ID'
    required: true
runs:
  using: 'docker'
  image: 'Dockerfile'
  args:
    - ${{ inputs.directory }}
    - ${{ inputs.appID }}
    - ${{ inputs.spSecret }}
    - ${{ inputs.tenantID }}

The entrypoint.ps1 script is the heart of the action. The script lets you define any actions you want: deploy a template, update a web application, or simply test a PowerShell script or a module against multiple versions of the Azure PowerShell module on Linux.
The first problem to solve is how to connect to Azure. We added secrets in GitHub to store the service principal AppID and Secret along with the tenant ID. In the action.yaml file we defined these secrets as inputs and as arguments for the docker image. In other words, to run the workflow, you need to provide the AppID, the service principal secret, and the tenant ID. These values are, then, passed to the container instance as environment variables.

To access these values from PowerShell, you prefix the input name with INPUT_: appID becomes INPUT_APPID.
In the PowerShell script, you read the data with $env:INPUT_ARGNAME; the variable names are always upper-cased.
To connect to Azure with a service principal, you need to create a PSCredential object. There are two steps: transform the plain-text service principal secret into a SecureString, then create the PSCredential object.

$SpSecret = ConvertTo-SecureString  $env:INPUT_SPSECRET -AsPlainText -Force

$SpCredential = New-Object System.Management.Automation.PSCredential $env:INPUT_APPID,$SpSecret

You can now connect to Azure

Connect-AzAccount -Credential $SpCredential -Tenant $ENV:INPUT_TENANTID -ServicePrincipal

The complete script

Import-Module Az
$SpSecret = ConvertTo-SecureString $env:INPUT_SPSECRET -AsPlainText -Force
$SpCredential = New-Object System.Management.Automation.PSCredential $env:INPUT_APPID,$SpSecret

Connect-AzAccount -Credential $SpCredential -Tenant $env:INPUT_TENANTID -ServicePrincipal

$TestsDir = Join-Path -Path $env:INPUT_DIRECTORY -ChildPath "tests"
Push-Location $TestsDir
$PesterResult = Invoke-Pester -PassThru

if ($PesterResult.FailedCount) {
    Write-Output "Fail"
    throw "Unit testing not passing"
} else {
    Write-Output "Success"
}

The script connects to Azure and changes the current location to the tests folder. Pester, the PowerShell testing tool, is invoked with the PassThru switch to collect test data. If any test fails, FailedCount will be greater than zero, the script throws an error, and the GitHub workflow fails.
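If you also want the results as a file (for example, to upload as a workflow artifact in a later step), Pester 4 can emit an NUnit XML report; a sketch of the same gate with that option added:

```powershell
# Same pass/fail gate, but also write an NUnit XML report (Pester 4 syntax).
$PesterResult = Invoke-Pester -PassThru -OutputFile "TestResults.xml" -OutputFormat NUnitXml
if ($PesterResult.FailedCount -gt 0) {
    throw "$($PesterResult.FailedCount) Pester test(s) failed"
}
```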
The module tested contains only two functions: one to test whether there is a connection to an Azure subscription, and another to collect Function App data.

Function Test-MyConnection {
    try {
        $AzContext = Get-AzContext
        if ($null -eq $AzContext) {
            return $false
        }
        else {
            return $true
        }
    }
    catch [System.Management.Automation.CommandNotFoundException] {
        Write-Error "No Azure PowerShell module"
        return $false
    }
    catch {
        Write-Error -Message "Exception Type: $($_.Exception.GetType().FullName) $($_.Exception.Message)"
        return $false
    }
}

Function Get-MyFunctionsApp {
    param (
        [parameter(Mandatory = $true)]
        [string] $FunctionResourceGroup,
        [parameter(Mandatory = $true)]
        [string] $FunctionAppName
    )
    if (Test-MyConnection) {
        try {
            $FunctionAppConfig = Get-AzWebApp -ResourceGroupName $FunctionResourceGroup -Name $FunctionAppName
            $FunctionStorageConfigString = ($FunctionAppConfig.SiteConfig.AppSettings | Where-Object Name -eq "AzureWebJobsStorage").Value

            return [pscustomobject]@{
                FunctionAppName     = $FunctionAppName
                FunctionAppLocation = $FunctionAppConfig.Location
                FunctionHostName    = $FunctionAppConfig.HostNames[0]
            }
        }
        catch {
            Write-Error -Message "Exception Type: $($_.Exception.GetType().FullName) $($_.Exception.Message)"
        }
    }
    else {
        throw "Not connected to Azure, use Login-AzAccount first"
    }
}

The Pester test script:

$ModuleName = "myAzFuncModule"
$ModuleManifestPath = "./src/module/myAzFuncModule.psd1"
Get-Module -Name $ModuleName | Remove-Module

$ModuleInformation = Import-Module -Name $ModuleManifestPath -PassThru
Describe "$ModuleName Testing" {
    InModuleScope $ModuleName {
        Context "$($ModuleName) Cmdlet testing" {
            It "Test-MyConnection Should return true" {
                Test-MyConnection | Should -Be $true
            }
        }
    }
}

This loads the module and tests the first function, Test-MyConnection. If the connection opened in entrypoint.ps1 works, the function returns true and the test passes.

These files alone will not start the GitHub Action; you need to create the workflow.
For that, create a .github folder at the root of the repository. Inside it, create a folder named workflows. Any YAML file in this folder defines a GitHub Actions workflow.
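Putting everything together, the repository layout looks roughly like this (the module and test paths are the ones used in this article; the test file name is illustrative, as Pester discovers any *.tests.ps1 file):

```
.github/
  workflows/
    dockertest.yaml
psaz-361/
  Dockerfile
  action.yml
  entrypoint.ps1
psaz-330/
  Dockerfile
  action.yml
  entrypoint.ps1
src/
  module/
    myAzFuncModule.psd1
tests/
  myAzFuncModule.tests.ps1
```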
You need two tests, one for the 3.6.1 version and the other for the 3.4.0 version.
To perform the two tests, create a new YAML file called dockertest.yaml.

name: docker-wks

on: [push]

jobs:
  test-job:
    name: Test Job
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1

      - name: PesterTest-361
        uses: ./psaz-361
        with:
          appID: ${{ secrets.APPID }}
          spSecret: ${{ secrets.SERVICEPRINCIPALSECRET }}
          tenantID: ${{ secrets.TENANTID }}

      - name: PesterTest-330
        uses: ./psaz-330
        with:
          appID: ${{ secrets.APPID }}
          spSecret: ${{ secrets.SERVICEPRINCIPALSECRET }}
          tenantID: ${{ secrets.TENANTID }}

The name at the top of the file is the name of the workflow. The on clause tells GitHub which events trigger the workflow: a push, a pull request, a webhook event, a deployment, a schedule, and more.
Here we only have push, because we need to use secrets, and secrets are not available to workflows triggered by pull requests from forked repositories.

The jobs section describes how the workflow runs: each job has a name, and the runs-on parameter tells GitHub which OS to use.
The steps section lets you define how to run your two Docker instances. First, you use the checkout action to get your repository inside the workspace. Then you can describe how to run the Docker images.

For each instance, you need to give a name and the path to the Docker image you want to use. The syntax is simple: use ./MyDockerFolder if the image is within the repository, or RepoOwner/DockerActionRepo if the action lives in another repository.

Then, with the with keyword, you pass the data requested in the action.yml file.

After pushing the changes to GitHub, you can go to the Actions tab and see the result.

The Azure PowerShell Docker image can be useful for local tests, integration, and of course deployment actions. You can use it in a CI/CD chain to deploy ARM templates or execute PowerShell scripts with the exact version of the Azure PowerShell module you need.

