Designing Azure Functions for PowerShell


This post is part of a series about Azure Functions and PowerShell. Check out the list of other posts in the series!


Azure Functions support for PowerShell has been generally available since November 4, 2019 and is working great!

In my last post on designing Azure Function Apps for production, I showed how to set up and configure a Function App to run PowerShell in Azure. In this post I’ll cover how to deploy our first function to it, and things to consider when developing the PowerShell code that will run in the Function App.

Start with an example!

There is a great tutorial on building a function in Azure using Visual Studio Code. This will get you started. Once we have a function deployed to Azure, there are some things I want to delve a little deeper into.

Set scriptFile to get rid of all those run.ps1 files

I can sometimes be a little all over the place when I’m working. I’m also one of those people who suddenly end up with 800 browser tabs or more (OK, I’m also really lousy at restarting my computer…).

Now what does this have to do with Azure Functions?! Well, it also means that when I’m developing functions, I tend to have at least five files named run.ps1 open at once. This ends up being really confusing and weird!

(Screenshot: an editor window full of tabs all named run.ps1)

Of course there is a solution to this. Each function has its own function.json. In that file we can add a property called scriptFile and give it the path to a script that should execute. The path has to be relative to function.json, but the actual file does not need to be in the same folder.

I usually just name the file the same as my function and keep it in the same folder, that way it’s easier to keep track of which file belongs where. Here is an example of a function.json pointing at a script renamed from run.ps1 to HttpTrigger1.ps1:

  {
    "scriptFile": "HttpTrigger1.ps1",
    "bindings": [
      ...
    ]
  }

Log level

Logging from PowerShell is really simple, just use the cmdlets we know and love for output: Write-Error, Write-Warning, Write-Information, Write-Host, Write-Output, Write-Debug and Write-Verbose. The Functions runtime will then pick up the different output streams and send them to Application Insights. We can choose which streams to pick up by setting the log level to one of:

  • Error: only the error stream
  • Warning: the error and warning streams
  • Information: error and warning, plus the output and information streams
  • Debug: all of the above plus the debug stream
  • Trace: everything, including the verbose stream and progress messages written using Write-Progress
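To illustrate, here is a hypothetical function body writing to several streams ($someVar is just a placeholder):

```powershell
# Hypothetical function body writing to several output streams.
# With log level 'Warning' only the first two lines reach Application Insights,
# 'Information' adds the next two, and 'Trace' captures everything.
$someVar = 'example'
Write-Error "Something went wrong"       # error stream
Write-Warning "This might be a problem"  # warning stream
Write-Information "Processing item 42"   # information stream
Write-Host "Write-Host also ends up in the information stream"
Write-Debug "Variable state: $someVar"   # debug stream
Write-Verbose "Entering main loop"       # verbose stream
```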

Log level is set in the file named host.json in the root of our Function App code. By configuring the property logging.logLevel we can specify a default value that applies to all functions in the app, and add an entry like “Function.MyFunction” to override the setting for a specific function called MyFunction.

  {
    "logging": {
      "logLevel": {
        "Function.HttpTrigger1": "Trace",
        "default": "Warning"
      }
    }
  }
Read more about logging in the documentation.

PowerShell Profile

Also in the root of our project is a file called profile.ps1. This works just like profile.ps1 on our computers: it gets dot-sourced each time a new runspace is created. By default it contains a small if-statement that checks for the existence of an environment variable called MSI_SECRET and for the Az.Accounts module. MSI_SECRET is an automagic environment variable that Azure injects into your Function App if you have enabled a managed identity for it. The MSI_SECRET variable contains a secret header that can be used to call an internal token service, which your functions can use to get access tokens for the managed identity.
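For reference, the generated profile.ps1 contains something along these lines (paraphrased from memory, check your own generated file for the exact code):

```powershell
# Paraphrased sketch of the default profile.ps1: sign in with the managed
# identity, but only when one is configured and Az.Accounts is available.
if ($env:MSI_SECRET -and (Get-Module -ListAvailable Az.Accounts)) {
    Connect-AzAccount -Identity
}
```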

Azure PowerShell has built-in support for this. If we have the Az.Accounts module installed, we can just call “Connect-AzAccount -Identity” to authenticate to Azure using the managed identity of our Function App. How cool is that?

However, if you don’t intend your functions to call the Azure API, I highly recommend removing this if-statement. Even if it only takes a few milliseconds to evaluate the condition, those are milliseconds added to the cold start of every function call. Even worse: if a managed identity is configured (which it often is) and the Az.Accounts module is available, our function will actually sign in to Azure, which can add several seconds to the cold start time.

A few simple tests I performed showed that just testing whether Az.Accounts existed took just under 2 seconds, and actually calling Connect-AzAccount took around 12 seconds! That is 12 seconds you don’t want to spend unless you really need to!

If you need to authenticate to Azure in your Function App running on the consumption tier and you have a lot of cold starts, here is a super lightweight way to get an access token to call the ARM API instead of using Azure PowerShell:

$BaseURI = 'https://management.azure.com/'
$URI = "${Env:MSI_ENDPOINT}?resource=${BaseURI}&api-version=2017-09-01"
$Response = Invoke-RestMethod -Method Get -Headers @{Secret = $Env:MSI_SECRET } -Uri $URI
$Env:Token = $Response.access_token

This usually executes in around 0.2 seconds!

This method also works for connecting to other resources, as long as your managed identity has access to them. Just replace BaseURI with the identifier of the resource you want to connect to.

Here is an example of how you can connect the Microsoft.Graph PowerShell module from an Azure Function:

$BaseURI = 'https://graph.microsoft.com/'
$URI = "${Env:MSI_ENDPOINT}?resource=${BaseURI}&api-version=2017-09-01"
$Response = Invoke-RestMethod -Method Get -Headers @{Secret = $Env:MSI_SECRET } -Uri $URI
$Env:Token = $Response.access_token
Connect-MgGraph -AccessToken $Env:Token

And while we’re at it: if it’s speed you’re after, look into calling the ARM API using Invoke-RestMethod instead of using Azure PowerShell. There are lots of precious seconds to save at the cost of a bit more coding.
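As a sketch of what that can look like, here is a call that lists resource group names straight from the ARM REST API. The subscription id is a placeholder, the token is assumed to have been acquired as shown earlier, and the call is wrapped in try/catch so it degrades gracefully when run outside a Function App:

```powershell
# Sketch: list resource group names via the ARM REST API.
# Assumes $Env:Token holds an ARM access token; the subscription id is a placeholder.
$SubscriptionId = '00000000-0000-0000-0000-000000000000'
$URI = "https://management.azure.com/subscriptions/${SubscriptionId}/resourcegroups?api-version=2020-06-01"
try {
    $Result = Invoke-RestMethod -Method Get -Headers @{ Authorization = "Bearer $($Env:Token)" } -Uri $URI
    $Result.value | ForEach-Object { $_.name }
}
catch {
    Write-Warning "ARM call failed (expected outside a Function App): $_"
}
```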

Using Modules

As always when working with PowerShell, I try to put as much of my code as I can into reusable modules. These modules that I write myself, together with any other modules that I want to use in my functions, need to be installed in my Function App.

There are essentially two ways of achieving this: deploy them myself together with my code, or let the Function App download them from the PowerShell Gallery. Letting the Function App download and install the modules for me is handled by a feature called managed dependency. Let’s look at how it works.

Managed dependency can be enabled by setting managedDependency.enabled to true in host.json. Doing this lets us create a requirements.psd1 file in the root of the project. This file is just a hashtable with module names and required versions. By default we get a dependency on the module ‘Az’ of version ‘4.*’ (at the time of writing, September 2020). Any module dependency we set in requirements.psd1 will be downloaded from the PowerShell Gallery when the Function App first starts, causing a considerable waiting time.
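The default requirements.psd1 looks roughly like this (the version constraint will of course change over time):

```powershell
# requirements.psd1 - the managed dependency manifest.
# Keys are module names, values are version constraints;
# '4.*' means the latest 4.x release from the PowerShell Gallery.
@{
    'Az' = '4.*'
}
```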

Managed dependencies sound like a tempting feature at first glance, but after a deeper look, I feel like it gives me less control. I prefer downloading the modules I need manually and deploying them together with the rest of my code. This way I can be sure of what I’m running, I don’t have a dependency on the PowerShell Gallery, and I don’t need to worry about long cold starts while the Function App downloads my required modules. As an added benefit, packaging all modules with the rest of my function code makes it easier to run and debug locally, since I have all my dependencies there as well.

So how do we install modules manually? Easy! Just create a folder called Modules in the root of our functions project. The Functions runtime will automatically look for modules here. If I want to install public modules, I just run Save-Module and point it at my Modules folder, like this:

Save-Module Az.Accounts -Path .\Modules

Simple as that!
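If reproducibility matters, you can also pin the exact version when saving (the version number below is just an example, pick whatever you have tested against):

```powershell
# Pin an exact module version so every deployment ships identical bits.
# 2.2.0 is an example version, not a recommendation.
Save-Module Az.Accounts -RequiredVersion 2.2.0 -Path .\Modules
```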

Here is an example of how host.json could look:

  {
    "version": "2.0",
    "managedDependency": {
      "enabled": false
    },
    "extensionBundle": {
      "id": "Microsoft.Azure.Functions.ExtensionBundle",
      "version": "[1.*, 2.0.0)"
    },
    "logging": {
      "logLevel": {
        "Function.HttpTrigger1": "Trace",
        "default": "Warning"
      }
    }
  }

Setting configurable values using Application Settings

When I create a Function App, it is not unusual that I need some configuration that is unique to the environment where it is running right now. It might be a tag name, a tenant id, an organization name, or even a secret like an API key, a certificate, or sometimes even a username and password. I definitely don’t want this kind of stuff hard-coded in my code!

This is where application settings can be used. If you read my previous post on Designing Azure Functions for Production you are well aware of application settings for configuring the Function App. These same settings can be used to inject configuration settings and secrets from the platform into my Function App. The value of any application setting will be available inside my Function App as an environment variable.

So, if I need to inject a secret token, I can put that token in my Key Vault, add an application setting named MyToken and have it reference my Key Vault secret. Then I can access my secret token from PowerShell just by reading the environment variable $Env:MyToken.
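For example, the application setting could be configured with a Key Vault reference (the vault and secret names below are placeholders), and the function then reads it like any other environment variable:

```powershell
# App setting 'MyToken' configured in the portal with a Key Vault reference:
#   @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/MyToken/)
# The platform resolves the reference; inside the function it is
# just an environment variable.
$Token = $Env:MyToken
if (-not $Token) {
    Write-Warning 'MyToken is not set - is the application setting configured?'
}
```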

Of course, all secrets and certificates should be stored in the Key Vault. I tend to also store configuration settings in Key Vault, just to keep everything in the same place.

More examples

Microsoft has a great repository of examples of how to build Functions on GitHub.


To summarize:

  • Make sure you use the latest version of PowerShell
  • Consider renaming run.ps1
  • Manage your own dependencies
  • Use Key Vault!

Do you have any other tips and tricks or important information you always use when writing Functions using PowerShell?
Let me know in the comments!