
Running PowerShell Scripts in Azure DevOps Pipelines (1 of 2)

This article was originally posted on the Adam the Automator blog. If you enjoy this article, be sure to come and check out this and hundreds of other sysadmin, cloud and DevOps posts!

Did you know you can natively run scripts like PowerShell and Bash in Azure DevOps (AzDo) pipelines? By using the tips and techniques you’ll learn in this article, you’ll be well on your way to scripting your way to automation greatness.

If you’re building pipelines with Azure Pipelines, you’re familiar with tasks. Tasks are the building blocks of Azure DevOps (AzDo) pipelines. AzDo has many built-in tasks and also allows you to download other tasks via an extension in the extension marketplace. But, there will inevitably come a time when you need to perform some action that doesn’t have a task available.

It’s time to break out a script.

AzDo can natively run three types of scripts: PowerShell, Bash, and batch files. Using one or more of these scripting languages/techniques, you can get just about anything done.

In this first article of a two-part series, you're going to learn how scripts work in AzDo pipelines. You'll learn how to invoke inline code and saved scripts from your source control repositories, and also how to work with pipeline variables in scripts.

If you'd like to see an example-driven, hands-on tutorial demonstrating the concepts covered here, be sure to check out the second article, Running Scripts in Azure DevOps Pipelines (2 of 2) [Ultimate Guide].

How Scripts Work in AzDo Pipelines

Inside of each AzDo pipeline is a series of tasks, each defined as a step. These tasks represent a particular action like running a .NET build, deploying a web application, or running a test. Tasks are the building blocks of a pipeline.

AzDo uses the concept of a task to run existing scripts or code in the YAML pipeline itself. More specifically, a task can run a PowerShell, Bash, or batch file script on pipeline agents like Windows, Linux, and macOS. When a task is invoked, you can specify what agent (OS) to run the script on and any parameters the code/script has.
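For example, here's a minimal sketch of a pipeline that runs one line of PowerShell on a Microsoft-hosted Windows agent (the trigger branch, image, and message are just illustrations):

trigger:
  - main

pool:
  vmImage: 'windows-latest'

steps:
  - powershell: Write-Host "Hello from a pipeline agent!"
    displayName: 'Run some PowerShell'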

You can find PowerShell or Bash script tasks in the task picker in the web interface, just like any other task.

You can run a PowerShell task on Windows with Windows PowerShell, on Linux and macOS with PowerShell (Core), and a Bash task on Linux and macOS. You can run batch files as well on Windows, but if you're doing this, I highly encourage you to use PowerShell instead.
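In YAML, each of these script types has its own step shortcut. Here's a quick sketch showing them side by side (the commands are placeholders):

steps:
  - powershell: Write-Host 'Windows PowerShell'   # Windows agents
  - pwsh: Write-Host 'PowerShell (Core)'          # Windows, Linux, and macOS agents
  - bash: echo 'Bash'                             # Linux and macOS agents
  - script: echo Batch/shell                      # cmd.exe on Windows, Bash elsewhere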

Each scripting task is defined as a step in the pipeline, and you have a few different ways to configure how a script executes, like passing in parameters, failing on error, getting the last exit code, and so on. You'll learn, in detail, how to build these tasks in the following sections.
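For instance, here's a sketch of the PowerShell@2 task with several of those knobs set (the script path and arguments are placeholders):

- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: '$(System.DefaultWorkingDirectory)/script.ps1'
    arguments: '-Name value'       # parameters passed to the script
    errorActionPreference: 'Stop'  # how PowerShell errors are treated
    failOnStderr: true             # fail the step if anything is written to stderr
    ignoreLASTEXITCODE: false      # let a non-zero exit code fail the step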

Another essential concept to learn is how pipeline variables integrate with scripts. You can still define and manage "script variables" like $var = 123 in PowerShell or var=123 in Bash, and maintain environment variables without AzDo being involved. However, AzDo allows you to set and reference pipeline variables in scripts too.

Inline Code vs. Scripts

Throughout this article, you'll see references to running "scripts." Although accurate, it reads like you have to create your own text file, insert the code you'd like to execute, and only then can the pipeline run that script. That's not true.

You can also run inline code. Inline code runs directly in the YAML pipeline. You don’t have to create a script ahead of time to run PowerShell or Bash code. Instead, you can insert the code directly in YAML. There’s no reference to a PS1 or SH file. AzDo creates a temporary script when the pipeline runs.

One-Line Code

You can run inline code one of two ways in a pipeline: via a single line or multiple lines. If you have a short code snippet as a single line, you can specify the task type followed by the code in quotes, as shown below.

- powershell: "Write-Host 'I am running PowerShell on a pipeline agent. Woohoo!!'"

Multi-Line Code

If you have a code snippet that spans a few lines or perhaps you don’t want to use quotes, you can define a multi-line code snippet by using a pipe (|) symbol followed by one or more lines right below it as shown below.

- powershell: |
    Write-Host "This is script line 1"
    Write-Host "This is script line 2"

It’s typically best to only use inline code for small tasks with less than five lines or so. Any other task that requires more than that should probably go in a script in your source control repository.

The upside of using inline code is keeping all functionality in a single place, making it easier to see everything that’s going on. But, this approach can soon get confusing if you have a large pipeline.

Where to Store your Scripts

If you're not using an inline code task to run a script, you'll have to save a script somewhere. You've got a few options for where to store scripts executed via the pipeline.

A Source Control Repository

If you’re using a pipeline trigger from a GitHub or AzDo source control repository running a CI pipeline, try to store your scripts in the same repo. It’s easier to manage all files related to a project this way. By default, AzDo will check out all code in the source repo. Checking out the code will download all files from the repo onto the pipeline agent, making them immediately available for execution.

If you have scripts located in another GitHub repo, you can also check out multiple repos to download and run scripts stored in other repos too.
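Here's a hedged sketch of a multi-repo checkout, assuming a GitHub service connection named my-github and a hypothetical shared-scripts repo:

resources:
  repositories:
    - repository: scripts        # alias used by the checkout step below
      type: github
      endpoint: my-github        # assumed service connection name
      name: myorg/shared-scripts

steps:
  - checkout: self
  - checkout: scripts

With multiple checkout steps, each repo lands in its own subfolder under $(Build.SourcesDirectory) rather than at the root.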

Once the scripts are downloaded to the pipeline agent, you can then reference them in a task via the [System.DefaultWorkingDirectory predefined variable](https://docs.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#system-variables).

For example, if you have a PowerShell script called script.ps1 stored in the root of your source repo, AzDo will check out the file, placing it in the System.DefaultWorkingDirectory folder path. It can then be referenced using the PowerShell task, as shown below.

- task: PowerShell@2
  inputs:
    filePath: '$(System.DefaultWorkingDirectory)/script.ps1'

Somewhere Else

If a script isn’t located in a source control repo, you can still run it in a pipeline. How you do this, though, highly depends on the situation. For example, you could use a PowerShell task to download another script and run it.

steps:
  - powershell: |
      Invoke-WebRequest -Uri https://somesite.com/script.ps1 -OutFile script.ps1
      .\script.ps1

It doesn’t matter where the script is stored. As long as you can use a task or run a script to authenticate (if necessary) and download the script, the PowerShell or Bash task will run it for you.
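As a hedged sketch, here's a download from a private URL, assuming a secret pipeline variable named apiToken holds the credential (the URL and variable name are placeholders):

steps:
  - powershell: |
      $headers = @{ Authorization = "Bearer $env:API_TOKEN" }
      Invoke-WebRequest -Uri 'https://somesite.com/private/script.ps1' -OutFile script.ps1 -Headers $headers
      .\script.ps1
    env:
      API_TOKEN: $(apiToken)   # secret variables must be mapped in explicitly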

Pipeline Variables in Scripts

Just like you have variables in a script, you also have variables in a pipeline. Variables can be defined in a few different ways, and their values can be accessed differently depending on the context.

Not only can you define and read variable values in the YAML pipeline, you can also do so within scripts.

Reading Pipeline Variables

When a script is run via a pipeline, the pipeline exposes all currently-defined variables (except secret variables, which must be mapped in explicitly) as environment variables on the pipeline agent. This means that you can read pipeline variables in a script just like any other environment variable.

For example, perhaps you have defined a variable called foo under the variables section of the pipeline.

variables:
  - name: foo
    value: 'bar'

Since the pipeline exposes this variable as an environment variable (with the name upper-cased), you can then reference its value in your scripts like usual.

## PowerShell
$env:FOO

## Bash
$FOO

For pipeline variables defined with a dot (.), AzDo automatically converts the dots to underscores (and upper-cases the name) in the environment variable. If a pipeline variable is defined as foo.bar, for example, the environment variable will be FOO_BAR.
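For example, a small sketch (the values are illustrative):

variables:
  - name: foo.bar
    value: 'baz'

steps:
  - powershell: Write-Host $env:FOO_BAR   # prints 'baz'
  - bash: echo $FOO_BAR                   # prints 'baz'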

Setting Pipeline Variables

Setting pipeline variables isn't quite as straightforward as reading them. To set a pipeline variable via script, you must use a logging command. A logging command is how a script communicates with the pipeline agent.

To set pipeline variables via a script, you must output a specifically-crafted string to standard out in the script. The string must have the format ##vso[task.setvariable variable=<variable_name>;]<variable_value>.

For example, to set a pipeline variable foo to the value of bar, a PowerShell or Bash script would have to output a string like below.

echo "##vso[task.setvariable variable=foo;]bar"
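One caveat worth knowing: a variable set this way becomes available to subsequent steps, not to the step that set it. A quick sketch:

steps:
  - powershell: |
      echo "##vso[task.setvariable variable=foo;]bar"
  - powershell: |
      Write-Host "foo is now $(foo)"   # 'bar': the value is available one step later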

Custom Script Extensions

AzDo provides the PowerShell and Bash script tasks by default. But you can also download or even build your own script-based tasks in the form of an extension.

Using the standard script tasks, you’re writing all of the code yourself and invoking it in one shot. There’s no real “interface” to the code. AzDo gives you a box saying, “Insert code here,” you put in the code, and then the pipeline executes it. There’s no real structure around it.

Notice that when you create a PowerShell task, you don't have many options. The parameters for this task are minimal. There's an Arguments field to pass parameters to the script, but wouldn't it be easier to understand if Arguments weren't some generic name?

Maybe you have a script you regularly use for querying a set of machines and returning some report. The script is called Get-AcmeServerReport. It has parameters like ServerName to specify the servers to run against and ReportFilePath for where to save the report.
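With the built-in PowerShell task, those parameters would all get crammed into the generic Arguments field, something like this (the server names and report path are illustrative):

- task: PowerShell@2
  inputs:
    filePath: '$(System.DefaultWorkingDirectory)/Get-AcmeServerReport.ps1'
    arguments: '-ServerName SRV01,SRV02 -ReportFilePath C:\reports\acme.csv'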

Wouldn't having fields in the web UI that match the script parameters be a lot more intuitive?

This article isn’t going to cover building custom AzDo extensions, but you should know this is possible. You can move the “interface” up to the pipeline level instead of down at the script level, allowing you to reuse existing scripts easily.

Summary

In this first article in a two-part series, you learned about executing scripts in Azure Pipelines. We covered the basics of how Azure Pipelines invokes scripts, how pipeline variables are used within scripts, where you can store scripts, and more.

Be sure to keep your momentum up by continuing to the second article in this series where you’ll get hands-on and learn via lots of examples!
