Deploy a Flex Consumption Function App with Terraform
As of v4.21.0 of the AzureRM provider, there is now a native azurerm_function_app_flex_consumption resource for deploying Flex Consumption Function Apps.
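As a rough sketch of how the pieces fit together (every name here is illustrative, and I’ve assumed connection-string authentication and the .NET isolated runtime, so check the provider documentation for your situation):

```hcl
# Flex Consumption needs a Linux plan on the FC1 SKU, plus a blob
# container to hold the deployment package.
resource "azurerm_resource_group" "example" {
  name     = "rg-flex-example"
  location = "uksouth"
}

resource "azurerm_storage_account" "example" {
  name                     = "stflexexample"
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_storage_container" "example" {
  name               = "deploymentpackage"
  storage_account_id = azurerm_storage_account.example.id
}

resource "azurerm_service_plan" "example" {
  name                = "asp-flex-example"
  resource_group_name = azurerm_resource_group.example.name
  location            = azurerm_resource_group.example.location
  os_type             = "Linux"
  sku_name            = "FC1"
}

resource "azurerm_function_app_flex_consumption" "example" {
  name                = "func-flex-example"
  resource_group_name = azurerm_resource_group.example.name
  location            = azurerm_resource_group.example.location
  service_plan_id     = azurerm_service_plan.example.id

  # Where the deployment package lives, and how the app authenticates to it.
  storage_container_type      = "blobContainer"
  storage_container_endpoint  = "${azurerm_storage_account.example.primary_blob_endpoint}${azurerm_storage_container.example.name}"
  storage_authentication_type = "StorageAccountConnectionString"
  storage_access_key          = azurerm_storage_account.example.primary_access_key

  runtime_name    = "dotnet-isolated"
  runtime_version = "8.0"

  site_config {}
}
```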
2023
Access Terraform Private Modules in GitHub Actions
In a recent project we used GitHub Actions to deploy our Terraform code. While GitHub Actions is arguably not the best way to deploy Terraform, we had it working nicely.
One of the biggest challenges we encountered was how to download the private Terraform modules we had created. In a GitHub Actions workflow you can specify the permissions that the runner should be granted. However, these permissions are scoped to the repository that the Action is running on, and it is not possible to add additional repos to the permission set.
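To make the problem concrete, here’s a hypothetical module call (the org and repo names are made up). `terraform init` shells out to git to clone the module, so the runner needs credentials for that second repository, which the workflow’s own GITHUB_TOKEN doesn’t provide:

```hcl
# Hypothetical private module, hosted in a different repository to the
# one the workflow runs in. Cloning it requires credentials that the
# workflow's default token does not have.
module "network" {
  source = "git::https://github.com/my-org/terraform-azurerm-network.git?ref=v1.2.0"

  # ... module inputs ...
}
```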
Authenticating Terraform with a GitHub App
In my current role, I configured Terraform to manage our GitHub organisation. As with all providers, we need to provide credentials for authentication. I didn’t want to use an access token, as tokens are tied to an individual user and will cause breakage should that user depart the organisation. Thankfully, GitHub supports using an application for authentication.
Create the GitHub Application
The first step in the process is to create a new GitHub application. While this can be done either in a personal account or within an organisation, I recommend doing this within the organisation. That way, if someone leaves the organisation the application doesn’t go with them.
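Once the application exists and has been installed on the organisation, the GitHub provider can authenticate as it. A minimal sketch, assuming the app’s credentials are passed in as variables:

```hcl
provider "github" {
  owner = "my-org" # the organisation the app is installed on

  app_auth {
    id              = var.github_app_id              # from the app's settings page
    installation_id = var.github_app_installation_id # from the organisation's installation
    pem_file        = var.github_app_pem_file        # contents of the app's private key
  }
}
```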
Grant Admin Consent for an Azure AD application with Terraform
One challenge we often run into when provisioning Azure AD applications with Terraform is the need to grant admin consent for API permissions. Sadly there is no native resource within Terraform to make this happen; however, with some creative use of provisioners (yes, I feel bad about it too) we can ensure that admin consent is granted for our applications.
To start with, we deploy our Azure AD application as normal. As part of the configuration, we also assign the required API permissions.
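A simplified sketch of the pattern (the Microsoft Graph User.Read.All permission is just an example, and in practice you may need a small delay or retry while the new service principal propagates):

```hcl
resource "azuread_application" "example" {
  display_name = "example-app"

  required_resource_access {
    resource_app_id = "00000003-0000-0000-c000-000000000000" # Microsoft Graph

    resource_access {
      id   = "df021288-bdef-4463-88db-98f22de89214" # User.Read.All
      type = "Role"
    }
  }
}

resource "azuread_service_principal" "example" {
  client_id = azuread_application.example.client_id
}

# There is no native resource for admin consent, so shell out to the
# Azure CLI once the service principal exists. Assumes `az` is installed
# and logged in as a user who is permitted to grant consent.
resource "null_resource" "admin_consent" {
  provisioner "local-exec" {
    command = "az ad app permission admin-consent --id ${azuread_application.example.client_id}"
  }

  depends_on = [azuread_service_principal.example]
}
```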
2022
Using Table Storage as an Alternative to Remote State
Terraform is a fantastic tool for Infrastructure as Code. From the YAML-like HCL syntax (no JSON!), to importing files (linting JSON files FTW!), to retrieving the results of previous runs to link resources, Terraform has made a massive difference in my work. However, like all technologies, it is not without its weaknesses. Terraform uses state files to keep track of what the world looked like when it last ran, which is wonderful for identifying drift. The default pattern is to use these state files for passing data between Terraform modules. But this is actually an anti-pattern: HashiCorp recommends not using remote state for passing data, in large part because to read the outputs from a state file the caller must have full access to read the entire remote state file, which may include secrets they shouldn’t be allowed to access.
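Instead, we can publish just the specific values we want to share into a storage table, and have consumers read only those entities. A rough sketch, assuming the storage account and table already exist and that all names are illustrative (recent azurerm versions reference the table via storage_table_id rather than account and table names):

```hcl
# Producer configuration: publish an output as a table entity instead of
# relying on consumers reading our remote state. The virtual network is
# assumed to be defined elsewhere in this configuration.
resource "azurerm_storage_table_entity" "network_outputs" {
  storage_account_name = "stterraformoutputs"
  table_name           = "outputs"
  partition_key        = "network"
  row_key              = "hub"

  entity = {
    vnet_id = azurerm_virtual_network.hub.id
  }
}

# Consumer configuration: read only the entity it needs, with no access
# to the producer's state file (or its secrets).
data "azurerm_storage_table_entity" "network_outputs" {
  storage_account_name = "stterraformoutputs"
  table_name           = "outputs"
  partition_key        = "network"
  row_key              = "hub"
}

# Values are then available as, e.g.:
#   data.azurerm_storage_table_entity.network_outputs.entity["vnet_id"]
```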
Deploying Terraform via a DevOps Pipeline
Not everyone is fortunate enough to be able to use Terraform Cloud for deploying their Terraform infrastructure. This means that many teams need to use their existing DevOps tooling to deploy their infrastructure with Terraform.
While I’ve seen many examples of pipelines for deploying Terraform code with various services, it felt like something was missing. Most example pipelines were designed to run only once a code review had occurred, and would often deploy the changed code automatically, without any intervention. That wasn’t going to fly for us in a recent project. We needed a more robust deployment process: one that catered not only for deploying the infrastructure, but also for pausing to approve a specific plan, plus checks to make sure that the newly-committed code was up to standard.