Azure Functions v2 Trigger and Binding with PowerShell Core

Olivier Miossec ・ 7 min read

This post is about PowerShell in Azure Functions v2: a how-to for using basic triggers and bindings with PowerShell.

With PowerShell Core, managed identities, and the integration of the Az module, PowerShell Azure Functions can be used as an event-based serverless automation tool.

Why event-based? Azure Functions code is triggered by an event. It can be an HTTP call, a log message in Event Grid, a message in a queue, a new blob in a container...

The event data doesn’t even need to be known by the function (although it is useful most of the time); the fact that the event occurred is enough to run the code.

This is the list of triggers you can use with Azure Functions:

  • Blob
  • Queue
  • HTTP
  • Cosmos DB
  • Event Grid
  • Event Hub
  • MS Graph Event
  • Service Bus
  • Timer

In a function, the code can only be triggered by one event; there is only one trigger. If the same code needs to run for two different events, say a new message in a queue and a new blob in a container, you need two functions.

In addition to triggers, there are bindings. A binding associates a resource with the function. Bindings let you use Azure resources in your function without managing the connection to the resource in your code.

Bindings are available for input, for output, or both, depending on the resource.

| Binding | Direction |
| --- | --- |
| Blob | Input and Output |
| Queue | Output |
| Table | Input and Output |
| Cosmos DB | Input and Output |
| Event Hubs | Output |
| HTTP | Output |
| Graph Excel | Input and Output |
| Graph OneDrive | Input and Output |
| Graph Email | Output |
| Graph Event | Input and Output |
| Notification Hubs | Output |
| SendGrid | Output |
| Service Bus | Output |
| SignalR | Input and Output |

To access an input object from the code in run.ps1, the name of the binding in the function.json file must be added to the param section of run.ps1. To push data to a binding, the name of the binding must be used in the -Name parameter of the Push-OutputBinding cmdlet.

Take a look at a function.json:

```json
{
  "bindings": [
    {
      "name": "TriggerBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "workitems/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "blob",
      "name": "inputBlob",
      "path": "incontainer/{name}.json",
      "connection": "AzureWebJobsStorage",
      "direction": "in"
    },
    {
      "type": "blob",
      "name": "outputBlob",
      "path": "outcontainer/{rand-guid}.json",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    }
  ]
}
```
And the related run.ps1:

```powershell
# Input bindings passed via param
param([byte[]] $TriggerBlob, [string[]] $InputBlob, $TriggerMetadata)

# Output passed via Push-OutputBinding
Push-OutputBinding -Name OutputBlob -Value $SomeValue
```

Each trigger and binding family has its own requirements and syntax. Each uses an extension (Microsoft.Azure.WebJobs.Extensions.xxxx) that needs to be installed first.

Let’s take a look at some triggers and bindings: HTTP, queue, table, and blob.


HTTP Trigger

This trigger represents what we usually think of when we talk about serverless: a web-based API. It responds to web calls on any HTTP verb.

It uses the Microsoft.Azure.WebJobs.Extensions.Http package. HTTP functions also use objects from the System.Net namespace, so it should be imported at the top of run.ps1:

```powershell
using namespace System.Net
```

In the function.json it looks like this:

```json
{
  "authLevel": "function",
  "name": "req",
  "route": "",
  "type": "httpTrigger",
  "direction": "in",
  "methods": [
    "get",
    "post"
  ]
}
```

authLevel

This field changes the authorisation level for the function. An Azure Function App is a WebApp, and it’s open to everyone on the Internet.

There are 3 authorisation levels:

  • Anonymous: No authentication required

  • Function: A key is needed to access the function. This key is unique to the function; other functions in the Function App will have different keys. The key needs to be included in the code parameter of the query string or in the HTTP header x-functions-key.

  • Admin: The same principle as Function, but with the master key; this key is the same for all the functions in the Function App.

There are other ways to protect the function. We can apply network restrictions by using Access Restrictions in the WebApp Platform features; it’s also possible to move the Function App from the Consumption plan to another plan to have more control over the network.
For a production API, you should also use Azure API Management and/or App Service Authentication (to use Azure AD, Twitter, Facebook, ... accounts).

route

This field lets you define a URL template describing how you expect to see the request. For example, products/{productcategory:alpha}/{id:int?} defines two parameters, ProductCategory and ID (optional).
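As a sketch, such a route could be declared like this in function.json (the route and verb values here are illustrative):

```json
{
  "authLevel": "function",
  "name": "req",
  "route": "products/{productcategory:alpha}/{id:int?}",
  "type": "httpTrigger",
  "direction": "in",
  "methods": [ "get" ]
}
```

The route parameters then show up in the Params dictionary of the request object in run.ps1.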

methods

This field lets you limit the verbs for the function. By default, all HTTP verbs can be used; use this JSON array to limit them.

direction

By default "in"; it’s a trigger.

name

The name of the binding, which will be used as a parameter in run.ps1.

This parameter in run.ps1 is an HttpRequestContext object. The properties of this object are:

  • Body: The body of a POST HTTP request
  • Headers: A dictionary of the HTTP headers
  • Method: A string representing the HTTP verb of the request
  • Params: A dictionary of the route parameters
  • Query: A dictionary of all the parameters in the query string
  • Url: A string representing the URL used to trigger the function
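As a quick sketch, a run.ps1 could read those properties like this (the binding name req and the name query parameter are assumptions):

```powershell
using namespace System.Net

# "req" must match the name of the HTTP trigger binding in function.json.
param($req, $TriggerMetadata)

# Read a hypothetical "name" parameter from the query string,
# falling back to the body of a POST request.
$name = $req.Query.name
if (-not $name) {
    $name = $req.Body
}

Write-Host "Called with verb $($req.Method) on $($req.Url)"
```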

The HTTP trigger doesn’t need to have an HTTP output. In that case the function returns a 204 No Content HTTP code.

To return a response we need an HTTP output binding.

```json
{
  "type": "http",
  "direction": "out",
  "name": "Response"
}
```

The name field will be used as the -Name parameter of the Push-OutputBinding cmdlet.

The object used as the value must be an HttpResponseContext custom type object.

It can contain a StatusCode ([HttpStatusCode]::OK by default, but it can be any status code; see the MS docs), a ContentType (text/plain by default), a boolean for content negotiation (EnableContentNegotiation), a Headers dictionary to set custom headers, and a Body object, which can be a string or an array of bytes.

```powershell
$HttpResponse = [HttpResponseContext]@{
    StatusCode  = $HttpStatus
    Body        = $ResultResponse | ConvertTo-Json
    ContentType = "application/json"
}

Push-OutputBinding -Name Response -Value $HttpResponse
```

Storage account based triggers and bindings

A storage account is linked to the Function App; it stores the Function App files in a file share. When creating a storage binding or trigger, you have to add the connection name for the storage account to the binding in function.json.

Storage bindings and triggers work with the Microsoft.Azure.WebJobs.Extensions.Storage extension.


Queue

Queue is the simplest type to use among the storage triggers and bindings.

```json
"bindings": [
  {
    "name": "QueueItem",
    "type": "queueTrigger",
    "direction": "in",
    "queueName": "mystoragequeue",
    "connection": "AzureWebJobsStorage"
  },
  {
    "type": "queue",
    "name": "outputQueueItem",
    "queueName": "vmalertequeue",
    "connection": "AzureWebJobsStorage",
    "direction": "out"
  }
]
```

The function runs when a new message is added to the queue named mystoragequeue. The message is passed to the function through the parameter named QueueItem, which is a string.

```powershell
param([string] $QueueItem, $TriggerMetadata)
```


For the queue binding, there is only one direction: out. And you can use a string for the value.

```powershell
Push-OutputBinding -Name outputQueueItem -Value "Test"
```

One last thing: if for one reason or another the function is unable to process the message, the Function App will try 4 more times to run the code. If the code is still unable to process it, the message is marked as poisoned.
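One way to work with this retry behaviour is simply to let exceptions escape the function: an unhandled throw marks the execution as failed and the host retries. A sketch (the JSON payload and its vmName property are invented):

```powershell
param([string] $QueueItem, $TriggerMetadata)

# Hypothetical contract: the message is expected to be a JSON document.
$payload = $QueueItem | ConvertFrom-Json

if (-not $payload.vmName) {
    # Throwing marks this execution as failed, so the host retries;
    # after the last attempt the message is marked as poisoned.
    throw "Message is missing the expected vmName property"
}

Write-Host "Processing alert for VM $($payload.vmName)"
```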


Table

There are only bindings, in and out, for Azure Table storage. There is no trigger when you add a new element to a table (for that, you should take a look at the Cosmos DB trigger).


    "type": "table",       
    "name": "inputTable",       
    "tableName": "inTable",       
    "take": 50,       
    "connection": "AzureWebJobsStorage",      
    "direction": "in",      
    "partitionKey": "tablein",       
    "rowKey": "1"     

When the function is triggered by another object (blob storage, queue, HTTP, …), it will look up the data in the table named by the tableName field. take limits the number of rows returned; partitionKey and rowKey restrict the query to those two elements.

tableName is the only mandatory field in the binding.

In this example, the parameter inputTable returns an array of hashtables, one hashtable per row.
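Iterating over those rows could then look like this in run.ps1 (the trigger parameter and the name column are assumptions):

```powershell
param([string] $QueueItem, $inputTable, $TriggerMetadata)

# $inputTable is an array of hashtables, one hashtable per returned row.
foreach ($row in $inputTable) {
    Write-Host "Row $($row.RowKey) in partition $($row.PartitionKey): $($row.name)"
}
```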



```json
{
  "type": "table",
  "name": "outputTable",
  "tableName": "outTable",
  "connection": "AzureWebJobsStorage",
  "direction": "out"
}
```

To send data to the table you only need to send an object with Push-OutputBinding. The object must be a hashtable or an array of hashtables.

In each hashtable there must be at least two fields: partitionKey (the identifier of the partition) and rowKey, the row identifier, which must be unique within the partition.

Be careful: property names are case sensitive and limited to 256 characters.

```powershell
$entity1 = @{
    partitionKey = 'testtable'
    rowKey       = (New-Guid).Guid
    name         = "Valeur 1"
}

$entity2 = @{
    partitionKey = 'testtable'
    rowKey       = (New-Guid).Guid
    name         = "Valeur 2"
}

$arrayOut = @($entity1, $entity2)

Push-OutputBinding -Name outputTable -Value $arrayOut
```


Blob

A blob container can serve as a trigger, an input binding, and an output binding.

The blob trigger runs the function each time a new object is added to the blob container in the storage account associated with the function connection. The object content is sent to the function as a parameter.

```json
{
  "name": "InputBlob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "triggerblob/{name}",
  "connection": "AzureWebJobsStorage"
}
```

The name is used as a parameter in run.ps1. By default it’s an array of bytes, but a string (for a JSON file, for example), a stream, or a TextReader can also be used.

path represents the path where the object will be found. We can use a filter here, for example {name}.json.

The runtime makes sure that the function runs only once for each new object added to the container.

The parameter in run.ps1 contains the content of the object. To get information about the object itself you need to use the $TriggerMetadata parameter.

This object contains:

  • BlobTrigger: the path to the object
  • Uri: the URL of the object
  • Properties: a BlobProperties object
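A sketch of combining the content parameter with $TriggerMetadata (binding name as in the example above):

```powershell
param([byte[]] $InputBlob, $TriggerMetadata)

# The blob content arrives in $InputBlob; information about the blob
# itself comes from $TriggerMetadata.
Write-Host "Blob $($TriggerMetadata.BlobTrigger) is $($InputBlob.Length) bytes"
Write-Host "Object URL: $($TriggerMetadata.Uri)"
```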

Using a blob as input is similar to the trigger, except that the data doesn’t come from a newly added blob object.

We need to give the complete path to an existing object. You can also use a pattern based on the trigger to find the object (i.e. if the message in the queue trigger contains the name of an object, you can use it as the name in the path to the file: "path": "samples-workitems/{queueTrigger}").
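Putting that together, a queue trigger driving a blob input binding could be sketched like this (the queue and container names are illustrative):

```json
"bindings": [
  {
    "name": "QueueItem",
    "type": "queueTrigger",
    "direction": "in",
    "queueName": "mystoragequeue",
    "connection": "AzureWebJobsStorage"
  },
  {
    "name": "InputBlob",
    "type": "blob",
    "direction": "in",
    "path": "samples-workitems/{queueTrigger}",
    "connection": "AzureWebJobsStorage"
  }
]
```

When a message containing file.json arrives on the queue, the runtime resolves samples-workitems/file.json and passes its content to the InputBlob parameter.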

Output Blob

```json
{
  "name": "OutputBlob",
  "type": "blob",
  "direction": "out",
  "path": "outputblob/{rand-guid}.json",
  "connection": "AzureWebJobsStorage"
}
```

Using {rand-guid} gives the object a random name when the runtime creates it in the container.

Using the blob trigger may not be the best approach: a cold start can delay the function for a few minutes. Event Grid is another way to react to blob events.

There are many other triggers and bindings available: Event Grid, Cosmos DB, …. But PowerShell support is still in preview and you may find some bugs and limitations.

Also, the default timeout for a function is 5 minutes. Even if you can change this setting, it’s not a good idea: Azure Functions is a serverless tool, and each function should perform one specific task and only that task. Queue messaging (Storage queues and Service Bus) can be used to coordinate actions between functions.

Azure Functions includes a free grant: 1 million executions and 400,000 GB-s (the number of seconds needed to run the function multiplied by the memory needed to run it).
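As a back-of-the-envelope sketch of how that grant is consumed (the numbers are invented):

```powershell
# A function that runs for 2 seconds using 512 MB of memory consumes:
$memoryGb    = 512 / 1024               # 0.5 GB
$durationSec = 2
$gbSeconds   = $memoryGb * $durationSec # 1 GB-s per execution

# The monthly free grant of 400,000 GB-s would then cover:
$freeExecutions = 400000 / $gbSeconds   # 400,000 executions of this size
```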

