Part 5-B : Using Azure DevOps, Automate Your CI/CD Pipeline and Your Deployments

Creating our CI/CD Pipelines for Terraform

by achraf
April 20, 2023
in Azure, Blog, Cloud
8 min read

As we continue this series, we come to the part where we deploy our Azure Kubernetes Service (AKS) cluster using Terraform, and we will be using Azure DevOps to deploy it to Microsoft Azure.

This article is a part of a series:

  1. Part 1 : How to setup nginx reverse proxy for aspnet core apps with and without Docker compose
  2. Part 2 : How to setup nginx reverse proxy && load balancer for aspnet core apps with Docker and azure kubernetes service
  3. Part 3 : How to configure an ingress controller using TLS/SSL for the Azure Kubernetes Service (AKS)
  4. Part 4 : Switch to Azure Container Registry from Docker Hub
  5. Part 5-A : Using Azure DevOps, Automate Your CI/CD Pipeline and Your Deployments
  6. Part 6 : Using Github, Automate Your CI/CD Pipeline and Your Deployments
  7. Part 7 : Possible methods to reduce your costs

    1-Creating our CI Pipeline

    In this first part we will define our CI pipeline. The main role of this pipeline is to build our Terraform configuration by validating it and creating a tfplan, so that we can see exactly what we are going to build.
    Validating the configuration runs checks that verify whether it is syntactically valid and internally consistent, regardless of any provided variables or existing state. After that, we use the terraform plan command, which creates an execution plan and lets us preview the changes Terraform intends to make to our infrastructure.
    In advanced scenarios that we may see in future articles, we will add tools such as Checkov, which scans our infrastructure configuration for misconfigurations before they are deployed, and tfsec, a static analysis security scanner for our Terraform code; a rough preview of such steps follows.
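    These scan steps are not part of the pipeline defined below; they are only a minimal sketch of how Checkov and tfsec could be wired in, assuming both tools are already installed on the private agent:

    # Hypothetical extra steps (not used in this demo): scan the Terraform code
    # with Checkov and tfsec before planning. Both tools are assumed to be
    # installed on the private agent.
    - script: |
        checkov -d $(System.DefaultWorkingDirectory) --quiet
      displayName: 'Run Checkov scan'

    - script: |
        tfsec $(System.DefaultWorkingDirectory)
      displayName: 'Run tfsec scan'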

    Defining our Pipeline:

    name: $(BuildDefinitionName)_$(date:yyyyMMdd)$(rev:.r)
    
    trigger:
      branches:
        include:
        - dev
    # Define the agent pool and the private agent that we created.
    pool:
        name: demo-privateAgentDevOps
        demands: 
        - Agent.Name -equals DevopsAg01
    
    stages :
    
      - stage: terraform_plan
        displayName: Plan 
    
        jobs:
          - job: init_plan
            steps:
              - checkout: self
    
              - task: charleszipp.azure-pipelines-tasks-terraform.azure-pipelines-tasks-terraform-installer.TerraformInstaller@0
                displayName: 'Install Terraform'
                inputs:
                  terraformVersion: 'latest'
    
              - task: TerraformCLI@0
                displayName: 'Initialization'
                inputs:
                  command: 'init'
                  workingDirectory: '$(System.DefaultWorkingDirectory)/'
                  backendType: 'azurerm'
                  backendServiceArm: 'terrafromspn'
                  backendAzureRmResourceGroupName: 'azure-loves-terraform-2023'
                  backendAzureRmResourceGroupLocation: 'francecentral'
                  backendAzureRmStorageAccountName: 'mystaccountaccess2023'
                  backendAzureRmContainerName: 'terraform-states'
                  backendAzureRmKey: dev.tfstate
                  allowTelemetryCollection: true
                  
               # Validate our configuration
              - task: TerraformCLI@0
                displayName: 'Run terraform validate'
                inputs:
                     command: 'validate'
                     workingDirectory: '$(System.DefaultWorkingDirectory)'
                     commandOptions: 
                     allowTelemetryCollection: true 
                     environmentServiceName: 'terrafromspn'
                     backendType: azurerm
    
              # creates an execution plan
              - task: TerraformCLI@0
                displayName: 'Run Terraform Plan'
                inputs:
                      backendType: azurerm
                      command: 'plan'
                      commandOptions: '-input=false  -out .tfplan'
                      workingDirectory: '$(System.DefaultWorkingDirectory)/'
                      environmentServiceName: 'terrafromspn'
                      publishPlanResults: 'dev-plan'
     
              - script: |
                    cd $(Build.SourcesDirectory)/
                    # Convert the binary plan into formatted JSON
                    terraform show -json .tfplan | jq '.' > tfplan.json
                    # Show only the planned resource changes
                    cat tfplan.json | jq '[.resource_changes[] | {type: .type, name: .change.after.name, actions: .change.actions[]}]'
                displayName: Create tfplan.json
    
              - task: PublishBuildArtifacts@1
                displayName: 'Publish Build Artifacts'
                inputs:
                    PathtoPublish: './'
                    ArtifactName: 'dev-tfplan'
                    publishLocation: 'Container'
                    StoreAsTar: true
    
    As you can see, a lot of information is displayed and exposed directly in the pipeline. This is only for demo purposes; in a real project we would put this information in a Library variable group and read it from there.
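    As a minimal sketch (the group name and variable names below are placeholders, not the ones used in this demo), reading those values from a Library variable group could look like this:

    # Hypothetical example: the variable group is created under Pipelines > Library
    # and referenced at the top of the pipeline.
    variables:
      - group: 'terraform-dev-settings'

    # ...and the init task would then reference the variables instead of literals:
    #   backendAzureRmResourceGroupName: '$(backendResourceGroup)'
    #   backendAzureRmStorageAccountName: '$(backendStorageAccount)'
    #   backendAzureRmContainerName: '$(backendContainer)'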
    aks.tf
    resource "azurerm_kubernetes_cluster" "cluster01" {
      name = var.cluster_name
      #kubernetes_version  = data.azurerm_kubernetes_service_versions.current.latest_version
      location             = var.resource_group_location
      resource_group_name  = var.rg_name
      dns_prefix           = var.dns_prefix
      azure_policy_enabled = true
    
      oms_agent {
        log_analytics_workspace_id = azurerm_log_analytics_workspace.insights.id
      }
      tags = {
        Environment = var.env_name
      }
    
      default_node_pool {
        name       = var.agentpool_name
        node_count = var.agent_count
        vm_size    = var.vm_size
      }
    
      identity {
        type = var.identity
      }
    
    
    }
    
    acr.tf
    resource "azurerm_container_registry" "acr_01" {
      name                = var.container_registry_name
      resource_group_name = var.rg_name
      location            = var.resource_group_location
      sku                 = var.container_registry_sku
    }
    
    
    resource "azurerm_role_assignment" "roleforaks" {
      principal_id                     = azurerm_kubernetes_cluster.cluster01.kubelet_identity[0].object_id
      role_definition_name             = var.aks_role_assignment
      scope                            = azurerm_container_registry.acr_01.id
      skip_service_principal_aad_check = true
    }
    analytics.tf
    # resource "random_id" "log_analytics_workspace_name_suffix" {
    #   byte_length = 8
    # }
    
    resource "azurerm_log_analytics_workspace" "insights" {
      location = var.resource_group_location
      # The WorkSpace name has to be unique across the whole of azure;
      # not just the current subscription/tenant.
      name                = var.log_analytics_workspace_name
      resource_group_name = var.rg_name
      sku                 = var.log_analytics_workspace_sku
    }
    
    resource "azurerm_log_analytics_solution" "insights" {
      location              = var.resource_group_location
      resource_group_name   = var.rg_name
      solution_name         = "ContainerInsights"
      workspace_name        = azurerm_log_analytics_workspace.insights.name
      workspace_resource_id = azurerm_log_analytics_workspace.insights.id
    
      plan {
        product   = "OMSGallery/ContainerInsights"
        publisher = "Microsoft"
      }
    }
    
    
    providers.tf
    terraform {
      required_providers {
        azurerm = {
          source  = "hashicorp/azurerm"
          version = ">=3.0.0"
        }
      }
    
    
      backend "azurerm" {
        use_msi = true
      }
    }
    
    provider "azurerm" {
      features {}
      skip_provider_registration = true
    }
    variables.tf
    variable "cluster_name" {
      description = "The name for the AKS cluster"
      default     = "achrafdoingaks"
    }
    variable "env_name" {
      description = "The environment for the AKS cluster"
      default     = "dev"
    }
    
    variable "resource_group_name_prefix" {
      default     = "rg"
      description = "Prefix of the resource group name that's combined with a random ID so name is unique in your Azure subscription."
    }
    
    variable "resource_group_location" {
      default     = "francecentral"
      description = "Location of the resource group."
    }
    
    # Refer to https://azure.microsoft.com/pricing/details/monitor/ for Log Analytics pricing
    variable "log_analytics_workspace_sku" {
      default = "PerGB2018"
    }
    
    variable "log_analytics_workspace_name" {
      default = "log-dvs-aks-dev-fc-01"
    }
    
    # Refer to https://azure.microsoft.com/global-infrastructure/services/?products=monitor for available Log Analytics regions.
    variable "log_analytics_workspace_location" {
      default = "francecentral"
    }
    variable "dns_prefix" {
      default = "hostnametest"
    }
    
    variable "rg_name" {
      default = "azure-loves-terraform-2023"
    }
    
    variable "agentpool_name" {
      default = "agentpool01"
    }
    
    variable "vm_size" {
      default = "standard_b2s"
    }
    
    variable "identity" {
      default = "SystemAssigned"
    }
    
    variable "agent_count" {
      default = 1
    }
    
    variable "container_registry_name" {
      default = "crdvsaksdevfc01"
    }
    
    variable "container_registry_sku" {
      default = "Standard"
    }
    
    variable "aks_role_assignment" {
      default = "AcrPull"
    }
    

    Now our repository will look like this:

    Running our CI Pipeline

    Now that we have defined our pipeline, it's time to run it and see the result.
    Once the pipeline has finished running without any errors, we can see the results in the Terraform plan task: 6 new resources to create, 0 to change, and 0 to destroy, which is exactly what we expect.
    Even when the pipeline reports no errors, I always take a look at what we are creating, changing, or destroying.
    Now that our CI pipeline is in place, it's time to create our CD pipeline, which will be responsible for deploying our infrastructure.

    2-Creating our CD Pipeline

    An Azure DevOps CD pipeline for Terraform is a continuous delivery (CD) pipeline that automates the deployment of the infrastructure managed by Terraform to Microsoft Azure.

    Our CD pipeline will download the artifact that we created and deploy it to Azure.
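    In this article the CD side is built as a classic release pipeline, so there is no YAML to show for it. Purely as a hedged sketch, if the same thing were rebuilt as an extra stage in the multi-stage YAML pipeline, the apply step could look roughly like this (the extraction step and paths are assumptions, since the artifact was published with StoreAsTar):

      # Hedged sketch only: a YAML stage equivalent of the release pipeline used here.
      - stage: terraform_apply
        displayName: Apply
        dependsOn: terraform_plan
        jobs:
          - job: apply
            steps:
              # Download the dev-tfplan artifact produced by the plan stage
              - download: current
                artifact: 'dev-tfplan'

              # The artifact was published with StoreAsTar: true, so unpack it
              # first (the target folder name is arbitrary)
              - script: |
                  mkdir -p $(Pipeline.Workspace)/tf
                  tar -xf $(Pipeline.Workspace)/dev-tfplan/*.tar -C $(Pipeline.Workspace)/tf
                displayName: 'Extract tfplan artifact'

              - task: TerraformCLI@0
                displayName: 'Run Terraform Apply'
                inputs:
                  command: 'apply'
                  commandOptions: '.tfplan'
                  workingDirectory: '$(Pipeline.Workspace)/tf'
                  environmentServiceName: 'terrafromspn'
                  backendType: 'azurerm'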

    Now, we will just add triggers.
    So what is a trigger in Azure DevOps?

    In Azure DevOps, a trigger is a mechanism that automatically starts a build or release pipeline in response to a specific event, such as a code commit, a pull request, a code merge, or a new artifact version. Triggers are essential to achieve continuous integration (CI) and continuous delivery (CD) in modern software development.
    Now, every time a new build finishes and generates an artifact, a new release pipeline will be created. In my case, however, I do not like fully automatic deployments, so I always choose to add pre-deployment approvals.
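    For reference only, if the CD side were a YAML pipeline rather than a classic release, the equivalent trigger would be a pipeline resource; a minimal sketch, where the source name is a placeholder:

    # Hypothetical YAML equivalent of the continuous deployment trigger: start this
    # pipeline whenever the CI pipeline named in "source" completes on dev.
    resources:
      pipelines:
        - pipeline: ci_build          # alias for referencing the triggering run
          source: 'terraform-ci'      # name of the CI pipeline (placeholder)
          trigger:
            branches:
              include:
                - dev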

    Pre-deployment approvals are a way to ensure that the right people have reviewed and approved the changes before they are deployed, which can help you prevent mistakes, reduce risks, and improve the overall quality of your software.

    Here are some reasons why you should use pre-deployment approvals in Azure DevOps:

    1. Compliance: Pre-deployment approvals can help you comply with regulatory requirements, industry standards, or internal policies that mandate a formal approval process for production changes. Pre-deployment approvals provide an audit trail of who approved the changes, when, and why, which can help you demonstrate compliance and reduce the risk of non-compliance.
    2. Risk management: Pre-deployment approvals can help you mitigate the risks associated with deploying code changes to production or critical environments, such as data loss, service interruption, or security breaches. Pre-deployment approvals enable you to review the changes and assess their impact on the environment, identify potential issues, and take appropriate actions before the changes are deployed.
    3. Quality assurance: Pre-deployment approvals can help you improve the overall quality of your software by ensuring that the changes are properly tested, reviewed, and validated before they are deployed. Pre-deployment approvals can help you catch defects, errors, or vulnerabilities that may have been missed during development or testing, and ensure that the changes are aligned with the business requirements and user expectations.
    4. Collaboration: Pre-deployment approvals can help you foster collaboration and communication among the different teams involved in the software delivery process, such as developers, testers, operations, and stakeholders. Pre-deployment approvals provide a centralized place to discuss the changes, share feedback, and resolve conflicts, and can help you build trust, accountability, and transparency across the teams.

    Overall, pre-deployment approvals in Azure DevOps are a best practice in modern software delivery that can help you ensure the reliability, security, and performance of your applications in production. By using pre-deployment approvals, you can reduce the risk of downtime, data loss, or security breaches, and improve the user experience and business outcomes.
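    Here the approval is added directly on the release stage in the classic editor. As a hedged aside, the YAML-pipeline equivalent would be an approval check attached to an environment (configured under Pipelines > Environments), with a deployment job targeting it, roughly like this (the environment name and step are placeholders):

          # Sketch: a deployment job pauses until the approval check on the
          # referenced environment is signed off.
          - deployment: apply_infra
            displayName: 'Apply Terraform plan'
            environment: 'dev-infra'      # approvals are configured on this environment
            strategy:
              runOnce:
                deploy:
                  steps:
                    - script: echo "terraform apply would run here"
                      displayName: 'Placeholder deployment step'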
    Now let's see what we have:
    A new release pipeline was launched; let's approve it in order to deploy our infra.
    As you can see, we have now deployed our resources and are prepared to use Kubernetes.
    At the conclusion of this blog article, I realized that I had forgotten to describe how to automatically deploy our application to Kubernetes. This will be covered in a subsequent blog post.