Terraform CI-CD With Azure DevOps

Let’s consider a scenario in which you are deploying your infrastructure using Terraform code (infrastructure as code) stored in a remote Git repository. Working in an organization, you need to make sure that all your deployments are tracked without exception. On top of that, you may want to know whether your Terraform code follows your security and compliance policies, what monthly cost you can expect from that infrastructure and whether it fits your budget, whether all your resources are being created in the same region… etc… etc.

Sounds magical, right !!! We all know these concerns are very important when you’re looking for a highly consistent, fully tracked, and automated approach. That’s why, in this article, we are going to look at a simple step-by-step way to automate and streamline our Terraform deployments using Azure DevOps (ADO).

Soo… Let’s Get Started !!!

First of all, we need to know what Terraform and Azure DevOps are.

Talking About Terraform: HashiCorp Terraform is an infrastructure as code tool that lets you define both cloud and on-prem resources in human-readable configuration files that you can version, reuse, and share. You can then use a consistent workflow to provision and manage all of your infrastructure throughout its lifecycle. Terraform can manage low-level components like compute, storage, and networking resources, as well as high-level components like DNS entries and SaaS features.

Terraform Workflow

If you want to learn more about Terraform, you can click here.

Talking about Azure DevOps: Azure DevOps provides developer services for allowing teams to plan work, collaborate on code development, and build and deploy applications. Azure DevOps supports a collaborative culture and set of processes that bring together developers, project managers, and contributors to develop software. It allows organizations to create and improve products at a faster pace than they can with traditional software development approaches.

DevOps lifecycle in Azure DevOps

If you want to learn more about Azure DevOps click here.

Azure Pipeline For Terraform

Okay !!! I know that’s a lot of information. Now let’s get back to the point… the question we all started with: “How are we going to achieve all of this, and that too in a single pipeline?” Well, the answer is very simple: by using different tools, each dedicated to a particular task.

This is the broad architecture of the pipeline that we are going to create.

Pre-requisites:

Whether we are deploying our infrastructure into Azure or Amazon Web Services (AWS), all we need is the following checklist:

  • Active Cloud Service (Azure/AWS)
  • Azure DevOps Account
  • Terraform Code to deploy
  • A Linux machine (VM or EC2) for agent pool
  • Docker
  • Storage Account (Azure Blob Container or AWS S3)

Tools used:

  1. TFsec

TFsec is a static analysis security scanner for your Terraform code.

TFsec takes a developer-first approach to scanning your Terraform templates; by using static analysis and deep integration with the official HCL parser, it ensures that security issues can be detected before your infrastructure changes take effect.
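As a quick illustration, a local tfsec scan of the repository might look like the sketch below; the image tag and tfvars filename are assumptions, so adjust them to your setup:

```shell
# Build the tfsec command used later in the pipeline; echo it here,
# and run it on a machine where Docker is available.
TFSEC_IMAGE="aquasec/tfsec:latest"   # assumed tag; pin a version in CI
SRC_DIR="$(pwd)"                     # directory containing the .tf files
TFSEC_CMD="docker run --rm -v ${SRC_DIR}:/src ${TFSEC_IMAGE} /src --tfvars-file /src/terraform.tfvars"
echo "${TFSEC_CMD}"
# Uncomment to actually scan:
# ${TFSEC_CMD}
```

Any findings are printed with their severity and the offending resource, which is exactly what the pipeline step later captures as a log file.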

  2. TFlint

TFlint is a pluggable framework in which each feature is provided by a plugin. The key features are as follows:

  • Find possible errors (like illegal instance types) for major cloud providers (AWS/Azure/GCP).
  • Warn about deprecated syntax and unused declarations.
  • Enforce best practices and naming conventions.
  3. InfraCost

Infracost shows cloud cost estimates for Terraform. It lets DevOps, SRE, and engineers see a cost breakdown and understand costs before making changes, either in the terminal or in pull requests. It can also show us the difference between our present state and desired state.
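The typical Infracost workflow against a Terraform plan looks roughly like the following sketch; the paths are assumptions, and `INFRACOST_API_KEY` must be set in the environment (a free key from infracost.io):

```shell
# Convert the binary plan to JSON, then feed it to Infracost for a cost
# breakdown and a diff against the current state.
PLAN_JSON="plan.json"
BREAKDOWN_CMD="docker run --rm -e INFRACOST_API_KEY -v $(pwd):/src infracost/infracost breakdown --path /src/${PLAN_JSON} --show-skipped"
DIFF_CMD="docker run --rm -e INFRACOST_API_KEY -v $(pwd):/src infracost/infracost diff --path /src/${PLAN_JSON} --show-skipped"
echo "${BREAKDOWN_CMD}"
echo "${DIFF_CMD}"
# On a machine with Terraform and Docker:
# terraform show -json plan.out > "${PLAN_JSON}"
# ${BREAKDOWN_CMD}
# ${DIFF_CMD}
```

These are the same two commands the pipeline runs in its Cost Estimation and Cost Difference tasks.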

All the remaining requirements can be fulfilled within the pipeline itself.

Build Pipeline steps

We assume that an agent has already been configured in the agent pool; it will execute all the commands needed to achieve our pipeline goals. Also, try importing your Terraform code into Azure Repos, as it will benefit you in a unique way that we’ll discover in the bonus section later in this article.

If you want to know how to configure an agent, you can click here.

You can click here to follow the steps to import a repo.

Part 1: Installing Dependencies

Steps to install dependencies

All we need to do in this part is download and install all the dependencies required by our pipeline.

Here we will use Docker images for the different tasks, e.g.:

  • Terraform security compliance: Tfsec
  • Terraform Linting: Tflint
  • Infrastructure Cost Estimation & Cost Difference: Infracost

Alongside, you can follow the YAML version of the pipeline.

  - task: Bash@3
    displayName: Install Docker
    enabled: False
    inputs:
      targetType: inline
      script: |
        sudo apt update
        sudo apt install apt-transport-https ca-certificates curl software-properties-common -y
        curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
        sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu bionic stable"
        sudo apt update
        apt-cache policy docker-ce
        sudo apt install docker-ce -y
        sudo systemctl start docker

  - task: Bash@3
    displayName: Install Azure CLI
    enabled: False
    inputs:
      targetType: inline
      script: |
        sudo apt-get update
        sudo apt-get install ca-certificates curl apt-transport-https lsb-release gnupg -y
        curl -sL https://packages.microsoft.com/keys/microsoft.asc |
            gpg --dearmor |
            sudo tee /etc/apt/trusted.gpg.d/microsoft.gpg > /dev/null
        AZ_REPO=$(lsb_release -cs)
        echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ $AZ_REPO main" |
            sudo tee /etc/apt/sources.list.d/azure-cli.list
        sudo apt-get update
        sudo apt-get install azure-cli -y
        sudo apt install unzip -y

  - task: TerraformInstaller@0
    displayName: Install Terraform 1.1.8
    inputs:
      terraformVersion: 1.1.8

  - task: Bash@3
    displayName: Pulling Required Docker Images
    enabled: False
    inputs:
      targetType: inline
      script: |
        # TFSEC
        sudo docker pull tfsec/tfsec:v1.13.2-arm64v8
        # TFLINT
        sudo docker pull ghcr.io/terraform-linters/tflint:v0.35.0
        # InfraCost
        sudo docker pull infracost/infracost:0.

For more detailed info click here.

Part 2: Terraform Initializing & Planning

This is one of the simplest and best-known steps of the whole pipeline.

Steps for Terraform init, validate & plan

In this part, we’ll initialize, validate, and plan our Terraform code, and store the plan output in a file using the -out=plan.out flag.
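Locally, the same three steps boil down to the sketch below; the backend configuration flags are omitted here because in CI they are supplied by the TerraformTaskV2 task:

```shell
# Local equivalent of the pipeline's init/validate/plan stage (a sketch).
PLAN_FILE="plan.out"
for CMD in "terraform init" "terraform validate" "terraform plan -lock=false -out=${PLAN_FILE}"; do
  echo "would run: ${CMD}"   # run these directly on a machine with Terraform installed
done
```

The -lock=false option matches the pipeline's PLAN step, since this plan is only used for cost estimation and should not hold the state lock.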

  - task: TerraformTaskV2@2
    displayName: 'Terraform : INIT'
    inputs:
      backendServiceArm: a575**********************4bcc71
      backendAzureRmResourceGroupName: ADOagent_rg
      backendAzureRmStorageAccountName: terrastoragestatesreport
      backendAzureRmContainerName: statefile
      backendAzureRmKey: terraform.tfstate
  - task: TerraformTaskV2@2
    displayName: 'Terraform : VALIDATE'
    inputs:
      command: validate
  - task: TerraformTaskV2@2
    displayName: 'Terraform : PLAN ( For Cost Optimization )'
    inputs:
      command: plan
      commandOptions: -lock=false -out=plan.out
      environmentServiceNameAzureRM: a575**********************4bcc71

For more detailed steps click here.

Part 3: Heart of Our Pipeline: Terraform Security Compliance, Linting, Cost Estimation & Cost Difference

Steps to secure, lint and estimate the cost of terraform code

Now, using the above-mentioned tools, we can carry out Terraform security compliance, linting, cost estimation & cost difference in Bash tasks.

While you don’t need custom settings for linting or cost calculation, you can definitely use a custom checks file for the Terraform compliance step.

A custom check file for Tfsec will look like this:

---
checks:
  - code: CUS001
    description: Custom check to ensure the Name tag is applied to Resource Group Module
    impact: By not having Name Tag we can't keep track of our Resources
    requiredTypes:
      - module
    requiredLabels:
      - resource_group
    severity: MEDIUM
    matchSpec:
      name: tag_map
      action: contains
      value: Name
    errorMessage: The required Name tag was missing

  - code: CUS002
    description: Custom check to ensure the Environment tag is applied to Resource Group Module
    impact: By not having Environment Tag we can't keep track of our Resources
    requiredTypes:
      - module
    requiredLabels:
      - resource_group
    severity: CRITICAL
    matchSpec:
      name: tag_map
      action: contains
      value: Environment
    errorMessage: The required Environment tag was missing

  - code: CUS003
    description: Custom check to ensure Resource Group is going to be created in Australia East region
    impact: By not having our resource in Australia East we might get some latency
    requiredTypes:
      - module
    requiredLabels:
      - resource_group
    severity: MEDIUM
    matchSpec:
      name: resource_group_location
      action: equals
      value: "Australia East"
    errorMessage: The required "Australia East" location was missing
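To make tfsec pick these checks up, they are conventionally saved in a `.tfsec` directory inside the scanned folder, in a file whose name ends with `_tfchecks.yaml`; a `--custom-check-dir` flag can also point elsewhere. A small sketch, with an assumed filename:

```shell
# Store the custom checks where tfsec conventionally looks for them.
CHECKS_DIR=".tfsec"
CHECKS_FILE="${CHECKS_DIR}/custom_tfchecks.yaml"
mkdir -p "${CHECKS_DIR}"
printf '%s\n' "---" "checks:" > "${CHECKS_FILE}"   # paste the checks shown above under this key
echo "custom checks stored in ${CHECKS_FILE}"
# Then scan as before; tfsec loads the custom checks automatically:
# docker run --rm -v "$(pwd):/src" aquasec/tfsec /src --custom-check-dir /src/.tfsec
```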

YAML Pipeline for the task:

  - task: Bash@3
    displayName: 'Terraform : TFSEC'
    condition: succeededOrFailed()
    enabled: False
    inputs:
      targetType: inline
      script: sudo docker run --rm -v "$(pwd):/src" aquasec/tfsec /src --tfvars-file /src/terraform.tfvars
  - task: Bash@3
    displayName: 'Terraform : Linting'
    condition: succeededOrFailed()
    enabled: False
    inputs:
      targetType: inline
      script: >
        sudo docker run --rm -v $(pwd):/data -t ghcr.io/terraform-linters/tflint
  - task: Bash@3
    displayName: 'Terraform : Cost Estimation'
    condition: succeededOrFailed()
    enabled: False
    inputs:
      targetType: inline
      script: |
        terraform show -json plan.out > plan.json

        sudo docker run --rm -e INFRACOST_API_KEY=$(INFRACOST_API_KEY) -v "$(pwd):/src" infracost/infracost breakdown --path /src/plan.json --show-skipped
  - task: Bash@3
    displayName: 'Terraform : Cost Difference'
    condition: succeededOrFailed()
    enabled: False
    inputs:
      targetType: inline
      script: >
        sudo docker run --rm   -e INFRACOST_API_KEY=$(INFRACOST_API_KEY)   -v "$(pwd):/src" infracost/infracost diff --path  /src/plan.json --show-skipped > $(Build.DefinitionName)-cost-diff-$(Build.BuildNumber)

Here we also need to generate an API key for the Infracost app; to learn how to do it, you can click here.

Part 4: Generating & Uploading Logs

In this step, we store the logs generated in the previous steps in a storage account, to make sure that we are not going to lose them.
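The log files below follow a simple `<definition>-<tool>-<build>` naming scheme. A sketch with illustrative values for the ADO variables:

```shell
# In the pipeline these come from $(Build.DefinitionName) and $(Build.BuildNumber).
BUILD_DEFINITION="Terraform-CI"   # illustrative value
BUILD_NUMBER="20220412.3"         # illustrative value

log_name() {  # <tool> [extension] -> log file name
  echo "${BUILD_DEFINITION}-$1-${BUILD_NUMBER}${2:-}"
}

TFSEC_LOG="$(log_name tfsec)"
COST_LOG="$(log_name cost .html)"
DIFF_LOG="$(log_name cost-diff)"
echo "${TFSEC_LOG} ${COST_LOG} ${DIFF_LOG}"
```

Because the definition name and build number appear in every file name, each pipeline run produces a unique, traceable set of reports in the blob container.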

Steps to generate & upload the logs file
  - task: Bash@3
    displayName: Generating Logs
    condition: succeededOrFailed()
    enabled: False
    inputs:
      targetType: inline
      script: >
        # Creating Logs Of Tfsec

        sudo docker run --rm -v "$(pwd):/src" aquasec/tfsec /src --tfvars-file /src/terraform.tfvars > $(Build.DefinitionName)-tfsec-$(Build.BuildNumber)

        cat $(Build.DefinitionName)-tfsec-$(Build.BuildNumber)


        #Creating Logs Of Cost Estimation

        sudo docker run --rm   -e INFRACOST_API_KEY=$(INFRACOST_API_KEY)   -v "$(pwd):/src" infracost/infracost breakdown --path  /src/plan.json --show-skipped --format html > $(Build.DefinitionName)-cost-$(Build.BuildNumber).html

        cat $(Build.DefinitionName)-cost-$(Build.BuildNumber).html


        # Creating Logs Of Cost Difference

        sudo docker run --rm   -e INFRACOST_API_KEY=$(INFRACOST_API_KEY)   -v "$(pwd):/src" infracost/infracost diff --path  /src/plan.json --show-skipped > $(Build.DefinitionName)-cost-diff-$(Build.BuildNumber)

        cat $(Build.DefinitionName)-cost-diff-$(Build.BuildNumber)
  - task: AzureCLI@2
    displayName: 'Upload tfsec file '
    condition: succeededOrFailed()
    enabled: False
    inputs:
      connectedServiceNameARM: a575*********************c71
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: >
        az storage blob upload --file $(Build.DefinitionName)-tfsec-$(Build.BuildNumber) --name $(Build.DefinitionName)-tfsec-$(Build.BuildNumber) --account-name terrastoragestatesreport --container-name report
      cwd: $(Pipeline.Workspace)/s
  - task: AzureCLI@2
    displayName: 'Upload cost file '
    condition: succeededOrFailed()
    enabled: False
    inputs:
      connectedServiceNameARM: a575*********************c71
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: az storage blob upload --file $(Build.DefinitionName)-cost-$(Build.BuildNumber).html --name $(Build.DefinitionName)-cost-$(Build.BuildNumber).html --account-name terrastoragestatesreport --container-name report
      cwd: $(Pipeline.Workspace)/s
  - task: AzureCLI@2
    displayName: Upload cost diff file
    condition: succeededOrFailed()
    enabled: False
    inputs:
      connectedServiceNameARM: a575*********************c71
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: az storage blob upload --file $(Build.DefinitionName)-cost-diff-$(Build.BuildNumber) --name $(Build.DefinitionName)-cost-diff-$(Build.BuildNumber) --account-name terrastoragestatesreport --container-name report
      cwd: $(Pipeline.Workspace)/s

For a detailed description of this part, you can click here.

Part 5: Generating Artifacts

In this step, we’ll generate two artifacts: one named Release, which will trigger the Release pipeline, and the other named Report, which will publish the reports for our output files of Compliance, Cost Estimation & Cost Difference.

Steps to generate the artifacts
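The two artifacts described above can be published with the standard copy and publish tasks. A minimal sketch; the artifact names match the text above, while the file patterns and paths are assumptions to adapt to your repo layout:

```yaml
  - task: CopyFiles@2
    displayName: Copy Terraform files for release
    inputs:
      Contents: '**/*.tf*'
      TargetFolder: $(Build.ArtifactStagingDirectory)
  - task: PublishBuildArtifacts@1
    displayName: Publish Release artifact
    inputs:
      PathtoPublish: $(Build.ArtifactStagingDirectory)
      ArtifactName: Release
  - task: PublishBuildArtifacts@1
    displayName: Publish Report artifact
    inputs:
      PathtoPublish: $(System.DefaultWorkingDirectory)
      ArtifactName: Report
```

The Release artifact is what the Release pipeline consumes in its `_IAC-CI/release` working directory below.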

For a detailed description of this part, you can click here.

Release Pipeline Steps

Our build stage will generate an artifact named Release, which will contain all the Terraform files required to apply our desired configuration. Since all the checks and validations have confirmed that there is no error in our code and it is exactly what we desired, we will enable Continuous Deployment for our Release pipeline.

Release pipeline workflow

Part 1: Auto Approval For Terraform Apply

Steps to apply the terraform code

In this step, we will simply apply our terraform code and keep this stage as Auto-Approved.

steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
  displayName: 'Install Terraform 1.1.8'
  inputs:
    terraformVersion: 1.1.8

- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV2@2
  displayName: 'Terraform : INIT'
  inputs:
    workingDirectory: '$(System.DefaultWorkingDirectory)/_IAC-CI/release'
    backendServiceArm: 'Opstree-PoCs (4c9***************************f3c)'
    backendAzureRmResourceGroupName: 'ADOagent_rg'
    backendAzureRmStorageAccountName: terrastoragestatesreport
    backendAzureRmContainerName: statefile
    backendAzureRmKey: terraform.tfstate

- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV2@2
  displayName: 'Terraform : APPLY'
  inputs:
    command: apply
    workingDirectory: '$(System.DefaultWorkingDirectory)/_IAC-CI/release'
    commandOptions: '--auto-approve'
    environmentServiceNameAzureRM: 'Opstree-PoCs (4c9***************************f3c)'

To know more about this stage click here.

Part 2: Manual Approval For Terraform Destroy

Here in our Terraform Destroy pipeline, we will configure manual approval, as this stage is very sensitive and needs to be secured. Unnecessarily or unintentionally destroyed infrastructure can cause a huge loss of time, money, resources, backups & data. So we’ll keep it highly secured and limit access to trusted users only.

Steps to destroy terraform code
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
  displayName: 'Install Terraform 1.1.8'
  inputs:
    terraformVersion: 1.1.8

- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV2@2
  displayName: 'Terraform : INIT'
  inputs:
    workingDirectory: '$(System.DefaultWorkingDirectory)/_IAC-CI/release'
    backendServiceArm: 'Opstree-PoCs (4c9***************************f3c)'
    backendAzureRmResourceGroupName: 'ADOagent_rg'
    backendAzureRmStorageAccountName: terrastoragestatesreport
    backendAzureRmContainerName: statefile
    backendAzureRmKey: terraform.tfstate

- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV2@2
  displayName: 'Terraform : DESTROY'
  inputs:
    command: destroy
    workingDirectory: '$(System.DefaultWorkingDirectory)/_IAC-CI/release'
    commandOptions: '--auto-approve'
    environmentServiceNameAzureRM: 'Opstree-PoCs (4c9***************************f3c)'

To know more about this stage click here.

Bonus: Branching Policy

We need branching policies in order to protect our main/master branch from unwanted commits. To make any change to the main/master branch, we must merge a feature branch into it via a Pull Request after committing our changes. Also, remember that our Terraform-CD (Release) pipeline will only be triggered if the Terraform-CI (Build) pipeline was triggered from the main branch.
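For reference, a minimum-reviewer policy like this can also be scripted with the Azure DevOps CLI extension; a sketch, where the organization URL, project name, and repository ID are placeholders:

```shell
# Requires: az extension add --name azure-devops, and an authenticated session.
ORG_URL="https://dev.azure.com/my-org"            # placeholder
PROJECT="my-project"                              # placeholder
REPO_ID="00000000-0000-0000-0000-000000000000"    # placeholder repository id
POLICY_CMD="az repos policy approver-count create --org ${ORG_URL} --project ${PROJECT} \
  --repository-id ${REPO_ID} --branch main --blocking true --enabled true \
  --minimum-approver-count 1 --reset-on-source-push true"
echo "${POLICY_CMD}"
# ${POLICY_CMD}   # run once authenticated
```

With --blocking true, pull requests into main cannot complete until the required number of reviewers have approved.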

Branching policy in Azure Repos

To learn how to configure the branching policy in Azure DevOps click here.

Conclusion

So in this article, we learned how to automate and streamline our Terraform code deployment using Azure DevOps, with tools like Tfsec for security and compliance, Tflint for linting Terraform code, and Infracost for cost estimation & cost difference. Along with that, we also learned how to upload our logs into Blob Containers, publish artifacts, and build a release pipeline on top of them with separate stages for Terraform apply and Terraform destroy.

Content References: Reference 1, Reference 2
Image References: Image 1, Image 2


Blog Pundit: Bhupender Rawat and Sandeep Rawat

Opstree is an End to End DevOps solution provider
