Data Factory CI/CD

In this article we’ll review how to implement continuous integration and continuous delivery (CI/CD) for Azure Data Factory using Azure DevOps. We’ll cover how to set up source control and how to build and deploy ARM templates, including the auto-validation and auto-publishing capabilities that Microsoft recently introduced.

We’ll start with the source control setup, continue with continuous integration (the build pipeline) and continuous delivery (the release pipeline), and finish with a test run.

Set up source control

Log in to Azure DevOps, open your project and navigate to Project settings > Repositories > Create.

Leave the repository type as “Git”, type a name for the repository that makes sense to you and click “Create”.

Now log in to Azure Data Factory Studio and navigate to Manage > Git configuration > Configure. Select “Azure DevOps Git” and your Azure Active Directory, and click “Continue”.

Then select the DevOps organization, project name and repository name. Leave the collaboration branch as “main” and the publish branch as “adf_publish”. You can also import existing resources into the repository in case you already have objects you want to bring in. Click “Apply” when ready; it’ll take a few seconds to set up.

And now you have source control enabled for your Data Factory.

Set up Continuous Integration

Go to your main branch and create a new file “package.json” under a “build” directory.

Then commit the following code, which enables the auto-validation and auto-publishing functionality:

{
    "scripts":{
        "build":"node node_modules/@microsoft/azure-data-factory-utilities/lib/index"
    },
    "dependencies":{
        "@microsoft/azure-data-factory-utilities":"^0.1.5"
    }
}

Now go to Pipelines > Pipelines > New folder, type “ADF” (or something that makes sense to you; you can also skip this step if you don’t want to create the pipeline under a folder) and click “Create”.

Now click on “Create a new pipeline here.”

Select “Azure Repos Git”

Select “data_factory”

Select “Starter pipeline”

And finally type the following code:

trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- checkout: self
  path: src

- checkout: git://IvoTalksTech/IvoTalksTech

- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- task: Npm@1
  inputs:
    command: 'install'
    workingDir: '$(Build.Repository.LocalPath)/build'
    verbose: true
  displayName: 'Install npm package'

- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.Repository.LocalPath)/build'
    customCommand: 'run build validate $(Build.Repository.LocalPath) /subscriptions/XXXX/resourceGroups/ivotalkstech-rg/providers/Microsoft.DataFactory/factories/ivo-adf-dev'
  displayName: 'Validate'

- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.Repository.LocalPath)/build'
    customCommand: 'run build export $(Build.Repository.LocalPath) /subscriptions/XXXX/resourceGroups/ivotalkstech-rg/providers/Microsoft.DataFactory/factories/ivo-adf-dev "ArmTemplate"'
  displayName: 'Validate and Generate ARM template'

- task: AzurePowerShell@5
  displayName: 'Override ADF Parameters'
  inputs:
    azureSubscription: 'ivo-devops-sp'
    ScriptType: 'FilePath'
    ScriptPath: 'IvoTalksTech/devops_scripts/adf_override_params_all.ps1'
    errorActionPreference: 'continue'
    azurePowerShellVersion: 'LatestVersion'
    workingDirectory: '$(Build.Repository.LocalPath)/build/ArmTemplate'

- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.Repository.LocalPath)/build/ArmTemplate'
    artifact: 'ArmTemplates'
    publishLocation: 'pipeline'

- task: UniversalPackages@0
  displayName: 'Universal publish'
  inputs:
    command: 'publish'
    publishDirectory: '$(Build.ArtifactStagingDirectory)'
    feedsToUsePublish: 'internal'
    vstsFeedPublish: 'XXXXX/XXXXX'
    vstsFeedPackagePublish: 'drop-adf-prod'
    versionOption: 'patch'
    packagePublishDescription: 'ver'
  • The trigger section starts the build pipeline every time a pull request is completed into, or a push is made to, the main (master) branch.
  • For pool we’re using a regular Ubuntu image for the virtual machine.
  • For checkout we’re getting the branch that is currently in use, plus a second repository where the DevOps scripts are located. (This will become clear in the next step.)
  • The first task installs Node.js, which is needed for the auto-validation and auto-publishing tasks.
  • The second task validates the objects in the Data Factory. The functionality is the same as “Validate all” in the Data Factory user interface. Edit the resource ID to match your subscription, resource group and factory.
  • The third task generates the ARM template. Again, edit the resource ID to match your environment.
  • The PowerShell task runs a script that overrides the ARM template parameters (more on this later).
  • The next task publishes the artifact in the pipeline.
  • The last task publishes the artifact to the artifacts feed. Edit “vstsFeedPublish” to point to your artifact feed. If you don’t know the value, click “Settings” and select it from the dropdown menu. If you don’t have an artifact feed, you can create one by going to Artifacts > Create Feed.
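If you want to sanity-check the two npm commands before wiring them into the pipeline, you can run the same utilities locally. This is a sketch, assuming Node.js is installed, the build/package.json from earlier is in place, and you’re signed in (e.g. via az login) with access to the factory; the resource ID uses the same placeholders as above:

```powershell
# Run from the repository root; assumes build/package.json from the earlier step
cd build
npm install

# Same as the 'Validate' task: validates all objects in the repo against the factory
npm run build validate .. /subscriptions/XXXX/resourceGroups/ivotalkstech-rg/providers/Microsoft.DataFactory/factories/ivo-adf-dev

# Same as the 'Validate and Generate ARM template' task: writes templates to build/ArmTemplate
npm run build export .. /subscriptions/XXXX/resourceGroups/ivotalkstech-rg/providers/Microsoft.DataFactory/factories/ivo-adf-dev "ArmTemplate"
```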

Now for the PowerShell task, you can either add the script inline or store it in a different repo. For easier maintenance we’re storing it in a different repo, where the script overrides the ARM template parameters (i.e. values that need to be changed to match the target environment before we deploy). There are several approaches here. One is to use regular expressions, so all matching values are changed:

(Get-Content ARMTemplateParametersForFactory.json) | ForEach-Object {
    $_ -replace 'dev', 'prod'
} | Set-Content P_ARMTemplateParametersForFactory.json

Another is to edit each value in the template specifically:

$json = Get-Content -Raw "ARMTemplateParametersForFactory.json" | ConvertFrom-Json
$json.parameters.factoryName.value = "ivo-adf-prod"
# -Depth keeps ConvertTo-Json from truncating nested objects (default depth is 2)
$json | ConvertTo-Json -Depth 10 | Set-Content "P_ARMTemplateParametersForFactory.json"

You can also combine the two. The last approach is not to override the parameters in the build at all, but to edit them in the release pipeline instead. I don’t recommend that, because it requires editing the release pipeline each time a new ARM template parameter value needs to be overridden, which is suboptimal from a maintenance perspective. Feel free to pick whichever approach works best for you.
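As a sketch of the combined approach (file and factory names follow the examples above; treat the specific overrides as hypothetical and adjust them to your own parameters):

```powershell
# Combined override: a regex pass for the environment token,
# then targeted edits for values the naming pattern can't derive.
$json = (Get-Content -Raw "ARMTemplateParametersForFactory.json") -replace 'dev', 'prod' |
    ConvertFrom-Json

# Targeted override for a specific parameter (example value)
$json.parameters.factoryName.value = "ivo-adf-prod"

# -Depth keeps ConvertTo-Json from truncating nested objects (default depth is 2)
$json | ConvertTo-Json -Depth 10 | Set-Content "P_ARMTemplateParametersForFactory.json"
```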

From an authorization standpoint, “ivo-devops-sp” is a service connection to a service principal with the Contributor RBAC role.

You can also rename the build pipeline; by default its name is the same as the repository’s.

Type something that makes sense to you and click “Save”.

Set up Continuous Delivery

Go to Pipelines > Releases > New folder, type “ADF” and click “OK”.
Then click “New release pipeline”.

Select “Empty job”:

Change the Pipeline name and the Stage name:

For the artifact, choose the build pipeline we’ve just created.

For the second artifact, choose the Azure repository where you’ll add more PowerShell scripts for the deployment operations.

Enable the continuous deployment trigger for the artifact from the build pipeline. This will start a deployment each time the build pipeline completes.

If needed, we can also add pre-deployment conditions like a manual approval step, assigned either to specific people or to a whole team.

Now, in the repo where we store the DevOps scripts, we add the following:

adf_trigger_listing.ps1 – This will get the state of the Data Factory triggers.

param(
    [parameter(Mandatory = $true)] [String] $resourcegroup,
    [parameter(Mandatory = $true)] [String] $datafactory
)

Get-AzDataFactoryV2Trigger -ResourceGroupName $resourcegroup -DataFactoryName $datafactory | Where-Object {$_.RuntimeState -eq "Started"} | ConvertTo-Json | jq -r ".[].Name"  | Out-File TriggersStarted.txt
Get-AzDataFactoryV2Trigger -ResourceGroupName $resourcegroup -DataFactoryName $datafactory | Where-Object {$_.RuntimeState -eq "Stopped"} | ConvertTo-Json | jq -r ".[].Name"  | Out-File TriggersStopped.txt

echo "Started Triggers:"
foreach ($triggerStart in Get-Content "TriggersStarted.txt") {echo $triggerStart}
echo "Stopped Triggers:"
foreach ($triggerStop in Get-Content "TriggersStopped.txt") {echo $triggerStop}
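Note that the jq step assumes jq is available on the agent and that more than one trigger is returned (for a single result, ConvertTo-Json emits a plain object rather than an array, so ".[].Name" fails). If you’d rather avoid that dependency, the same listing can be done in pure PowerShell; a sketch using the same parameters and output files:

```powershell
# Pure-PowerShell alternative to the ConvertTo-Json | jq pipeline above
$triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $resourcegroup -DataFactoryName $datafactory

# Write one trigger name per line, matching the jq output format
$triggers | Where-Object { $_.RuntimeState -eq "Started" } |
    Select-Object -ExpandProperty Name | Out-File TriggersStarted.txt
$triggers | Where-Object { $_.RuntimeState -eq "Stopped" } |
    Select-Object -ExpandProperty Name | Out-File TriggersStopped.txt
```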

adf_trigger_toggling.ps1 – This will start the triggers that were active before the deployment.

param(
    [parameter(Mandatory = $true)] [String] $resourcegroup,
    [parameter(Mandatory = $true)] [String] $datafactory
)

#Start active triggers - after cleanup efforts
Write-Host "Starting active triggers"
foreach ($triggerStart in Get-Content "TriggersStarted.txt") {
    $state = Get-AzDataFactoryV2Trigger -ResourceGroupName $resourcegroup -DataFactoryName $datafactory -TriggerName $triggerStart
    if ($state.Properties -like "*events*") {
        Write-Host "Subscribing" $triggerStart
        $status = Add-AzDataFactoryV2TriggerSubscription -ResourceGroupName $resourcegroup -DataFactoryName $datafactory -Name $triggerStart
        while ($status.Status -ne "Enabled") {
            Start-Sleep -s 15
            $status = Get-AzDataFactoryV2TriggerSubscriptionStatus -ResourceGroupName $resourcegroup -DataFactoryName $datafactory -Name $triggerStart
        }
    }
    Write-Host "Starting" $triggerStart
    Start-AzDataFactoryV2Trigger -ResourceGroupName $resourcegroup -DataFactoryName $datafactory -Name $triggerStart -Force
}

pre-post-deployment-script.ps1 – This is the script from Microsoft for pre- and post-deployment operations; it accounts for related resources and resource references. I changed it not to include integration runtimes and triggers, because I don’t want to deploy the trigger states from the source factory; I only want to start the triggers that were already active in the target Data Factory. (The original script is available in Microsoft’s Data Factory CI/CD documentation.)

param(
    [parameter(Mandatory = $false)] [String] $armTemplate,
    [parameter(Mandatory = $false)] [String] $ResourceGroupName,
    [parameter(Mandatory = $false)] [String] $DataFactoryName,
    [parameter(Mandatory = $false)] [Bool] $predeployment=$true,
    [parameter(Mandatory = $false)] [Bool] $deleteDeployment=$false
)

function getPipelineDependencies {
    param([System.Object] $activity)
    if ($activity.Pipeline) {
        return @($activity.Pipeline.ReferenceName)
    } elseif ($activity.Activities) {
        $result = @()
        $activity.Activities | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        return $result
    } elseif ($activity.ifFalseActivities -or $activity.ifTrueActivities) {
        $result = @()
        $activity.ifFalseActivities | Where-Object {$_ -ne $null} | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        $activity.ifTrueActivities | Where-Object {$_ -ne $null} | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        return $result
    } elseif ($activity.defaultActivities) {
        $result = @()
        $activity.defaultActivities | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        if ($activity.cases) {
            $activity.cases | ForEach-Object{ $_.activities } | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        }
        return $result
    } else {
        return @()
    }
}

function pipelineSortUtil {
    param([Microsoft.Azure.Commands.DataFactoryV2.Models.PSPipeline] $pipeline,
    [Hashtable] $pipelineNameResourceDict,
    [Hashtable] $visited,
    [System.Collections.Stack] $sortedList)
    if ($visited[$pipeline.Name] -eq $true) {
        return;
    }
    $visited[$pipeline.Name] = $true;
    $pipeline.Activities | ForEach-Object{ getPipelineDependencies -activity $_ } | ForEach-Object{
        pipelineSortUtil -pipeline $pipelineNameResourceDict[$_] -pipelineNameResourceDict $pipelineNameResourceDict -visited $visited -sortedList $sortedList
    }
    $sortedList.Push($pipeline)
}

function Get-SortedPipelines {
    param(
        [string] $DataFactoryName,
        [string] $ResourceGroupName
    )
    $pipelines = Get-AzDataFactoryV2Pipeline -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $ppDict = @{}
    $visited = @{}
    $stack = new-object System.Collections.Stack
    $pipelines | ForEach-Object{ $ppDict[$_.Name] = $_ }
    $pipelines | ForEach-Object{ pipelineSortUtil -pipeline $_ -pipelineNameResourceDict $ppDict -visited $visited -sortedList $stack }
    $sortedList = new-object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSPipeline]
    while ($stack.Count -gt 0) {
        $sortedList.Add($stack.Pop())
    }
    return $sortedList
}

function triggerSortUtil {
    param([Microsoft.Azure.Commands.DataFactoryV2.Models.PSTrigger] $trigger,
    [Hashtable] $triggerNameResourceDict,
    [Hashtable] $visited,
    [System.Collections.Stack] $sortedList)
    if ($visited[$trigger.Name] -eq $true) {
        return;
    }
    $visited[$trigger.Name] = $true;
    if ($trigger.Properties.DependsOn) {
        $trigger.Properties.DependsOn | Where-Object {$_ -and $_.ReferenceTrigger} | ForEach-Object{
            triggerSortUtil -trigger $triggerNameResourceDict[$_.ReferenceTrigger.ReferenceName] -triggerNameResourceDict $triggerNameResourceDict -visited $visited -sortedList $sortedList
        }
    }
    $sortedList.Push($trigger)
}

function Get-SortedTriggers {
    param(
        [string] $DataFactoryName,
        [string] $ResourceGroupName
    )
    $triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName
    $triggerDict = @{}
    $visited = @{}
    $stack = new-object System.Collections.Stack
    $triggers | ForEach-Object{ $triggerDict[$_.Name] = $_ }
    $triggers | ForEach-Object{ triggerSortUtil -trigger $_ -triggerNameResourceDict $triggerDict -visited $visited -sortedList $stack }
    $sortedList = new-object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSTrigger]
    while ($stack.Count -gt 0) {
        $sortedList.Add($stack.Pop())
    }
    return $sortedList
}

function Get-SortedLinkedServices {
    param(
        [string] $DataFactoryName,
        [string] $ResourceGroupName
    )
    $linkedServices = Get-AzDataFactoryV2LinkedService -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName
    $LinkedServiceHasDependencies = @('HDInsightLinkedService', 'HDInsightOnDemandLinkedService', 'AzureBatchLinkedService')
    $Akv = 'AzureKeyVaultLinkedService'
    $HighOrderList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]
    $RegularList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]
    $AkvList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]

    $linkedServices | ForEach-Object {
        if ($_.Properties.GetType().Name -in $LinkedServiceHasDependencies) {
            $HighOrderList.Add($_)
        }
        elseif ($_.Properties.GetType().Name -eq $Akv) {
            $AkvList.Add($_)
        }
        else {
            $RegularList.Add($_)
        }
    }

    $SortedList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]($HighOrderList.Count + $RegularList.Count + $AkvList.Count)
    $SortedList.AddRange($HighOrderList)
    $SortedList.AddRange($RegularList)
    $SortedList.AddRange($AkvList)
    $SortedList
}

$templateJson = Get-Content $armTemplate | ConvertFrom-Json
$resources = $templateJson.resources

Write-Host "Getting triggers"
$triggersInTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/triggers" }
$triggerNamesInTemplate = $triggersInTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}

$triggersDeployed = Get-SortedTriggers -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName

$triggersToStop = $triggersDeployed | Where-Object { $triggerNamesInTemplate -contains $_.Name } | ForEach-Object {
    New-Object PSObject -Property @{
        Name = $_.Name
        TriggerType = $_.Properties.GetType().Name
    }
}
$triggersToDelete = $triggersDeployed | Where-Object { $triggerNamesInTemplate -notcontains $_.Name } | ForEach-Object {
    New-Object PSObject -Property @{
        Name = $_.Name
        TriggerType = $_.Properties.GetType().Name
    }
}
$triggersToStart = $triggersInTemplate | Where-Object { $_.properties.runtimeState -eq "Started" -and ($_.properties.pipelines.Count -gt 0 -or $_.properties.pipeline.pipelineReference -ne $null) } | ForEach-Object {
    New-Object PSObject -Property @{
        Name = $_.name.Substring(37, $_.name.Length-40)
        TriggerType = $_.properties.type
    }
}

if ($predeployment -eq $true) {
    #Stop all triggers
    Write-Host "Stopping deployed triggers`n"
    $triggersToStop | ForEach-Object {
        if ($_.TriggerType -eq "BlobEventsTrigger") {
            Write-Host "Unsubscribing" $_.Name "from events"
            $status = Remove-AzDataFactoryV2TriggerSubscription -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
            while ($status.Status -ne "Disabled"){
                Start-Sleep -s 15
                $status = Get-AzDataFactoryV2TriggerSubscriptionStatus -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
            }
        }
        Write-Host "Stopping trigger" $_.Name
        Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
    }
}
else {
    #Deleted resources
    Write-Host "Getting pipelines"
    $pipelinesADF = Get-SortedPipelines -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $pipelinesTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/pipelines" }
    $pipelinesNames = $pipelinesTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    $deletedpipelines = $pipelinesADF | Where-Object { $pipelinesNames -notcontains $_.Name }
    Write-Host "Getting dataflows"
    $dataflowsADF = Get-AzDataFactoryV2DataFlow -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $dataflowsTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/dataflows" }
    $dataflowsNames = $dataflowsTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    $deleteddataflow = $dataflowsADF | Where-Object { $dataflowsNames -notcontains $_.Name }
    Write-Host "Getting datasets"
    $datasetsADF = Get-AzDataFactoryV2Dataset -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $datasetsTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/datasets" }
    $datasetsNames = $datasetsTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    $deleteddataset = $datasetsADF | Where-Object { $datasetsNames -notcontains $_.Name }
    Write-Host "Getting linked services"
    $linkedservicesADF = Get-SortedLinkedServices -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $linkedservicesTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/linkedservices" }
    $linkedservicesNames = $linkedservicesTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    $deletedlinkedservices = $linkedservicesADF | Where-Object { $linkedservicesNames -notcontains $_.Name }
    #Write-Host "Getting integration runtimes"
    #$integrationruntimesADF = Get-AzDataFactoryV2IntegrationRuntime -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    #$integrationruntimesTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/integrationruntimes" }
    #$integrationruntimesNames = $integrationruntimesTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    #$deletedintegrationruntimes = $integrationruntimesADF | Where-Object { $integrationruntimesNames -notcontains $_.Name }

    #Delete resources
    Write-Host "Deleting triggers"
    $triggersToDelete | ForEach-Object {
        Write-Host "Deleting trigger " $_.Name
        $trig = Get-AzDataFactoryV2Trigger -name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName
        if ($trig.RuntimeState -eq "Started") {
            if ($_.TriggerType -eq "BlobEventsTrigger") {
                Write-Host "Unsubscribing trigger" $_.Name "from events"
                $status = Remove-AzDataFactoryV2TriggerSubscription -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
                while ($status.Status -ne "Disabled"){
                    Start-Sleep -s 15
                    $status = Get-AzDataFactoryV2TriggerSubscriptionStatus -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
                }
            }
            Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
        }
        Remove-AzDataFactoryV2Trigger -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting pipelines"
    $deletedpipelines | ForEach-Object {
        Write-Host "Deleting pipeline " $_.Name
        Remove-AzDataFactoryV2Pipeline -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting dataflows"
    $deleteddataflow | ForEach-Object {
        Write-Host "Deleting dataflow " $_.Name
        Remove-AzDataFactoryV2DataFlow -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting datasets"
    $deleteddataset | ForEach-Object {
        Write-Host "Deleting dataset " $_.Name
        Remove-AzDataFactoryV2Dataset -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting linked services"
    $deletedlinkedservices | ForEach-Object {
        Write-Host "Deleting Linked Service " $_.Name
        Remove-AzDataFactoryV2LinkedService -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    #Write-Host "Deleting integration runtimes"
    #$deletedintegrationruntimes | ForEach-Object {
    #    Write-Host "Deleting integration runtime " $_.Name
    #    Remove-AzDataFactoryV2IntegrationRuntime -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    #}

    if ($deleteDeployment -eq $true) {
        Write-Host "Deleting ARM deployment ... under resource group: " $ResourceGroupName
        $deployments = Get-AzResourceGroupDeployment -ResourceGroupName $ResourceGroupName
        $deploymentsToConsider = $deployments | Where { $_.DeploymentName -like "ArmTemplate_master*" -or $_.DeploymentName -like "ArmTemplateForFactory*" } | Sort-Object -Property Timestamp -Descending
        $deploymentName = $deploymentsToConsider[0].DeploymentName

        Write-Host "Deployment to be deleted: " $deploymentName
        $deploymentOperations = Get-AzResourceGroupDeploymentOperation -DeploymentName $deploymentName -ResourceGroupName $ResourceGroupName
        $deploymentsToDelete = $deploymentOperations | Where { $_.properties.targetResource.id -like "*Microsoft.Resources/deployments*" }

        $deploymentsToDelete | ForEach-Object {
            Write-Host "Deleting inner deployment: " $_.properties.targetResource.id
            Remove-AzResourceGroupDeployment -Id $_.properties.targetResource.id
        }
        Write-Host "Deleting deployment: " $deploymentName
        Remove-AzResourceGroupDeployment -ResourceGroupName $ResourceGroupName -Name $deploymentName
    }

    ##Start active triggers - after cleanup efforts
    #Write-Host "Starting active triggers"
    #$triggersToStart | ForEach-Object {
    #    if ($_.TriggerType -eq "BlobEventsTrigger") {
    #        Write-Host "Subscribing" $_.Name "to events"
    #        $status = Add-AzDataFactoryV2TriggerSubscription -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
    #        while ($status.Status -ne "Enabled"){
    #            Start-Sleep -s 15
    #            $status = Get-AzDataFactoryV2TriggerSubscriptionStatus -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
    #        }
    #    }
    #    Write-Host "Starting trigger" $_.Name
    #    Start-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
    #}
}

We’re doing this because Data Factory can’t be deployed while there are active triggers. Now let’s start adding tasks to the job. First, we want to capture the trigger states: add an Azure PowerShell task, set the Script path to adf_trigger_listing.ps1, and for Script arguments pass -resourcegroup and -datafactory for the target factory. For “Azure subscription”, use the same type of service connection as in the build pipeline (i.e. a service principal with the Contributor RBAC role), and use it for all subsequent tasks.

Next, add another Azure PowerShell task that will point to “pre-post-deployment-script.ps1” with script arguments for the target Data Factory and “-predeployment $false -deleteDeployment $true”.

After that, we can add the deployment task. Search for “ARM template deployment”.

Here select “Create or update resource group”, then choose your resource group and location. For Template, set the path to the template (e.g. $(System.DefaultWorkingDirectory)/_CI_ADF_ITT/ArmTemplates/ARMTemplateForFactory.json), for Template parameters, the path to the parameters file we generated in the build pipeline (e.g. $(System.DefaultWorkingDirectory)/_CI_ADF_ITT/ArmTemplates/P_ARMTemplateParametersForFactory.json), and for Deployment mode make sure it’s “Incremental”.
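For reference, this task does the equivalent of an incremental New-AzResourceGroupDeployment. A local sketch, using the same resource group and file names as the examples above:

```powershell
# Local equivalent of the "ARM template deployment" task.
# Incremental mode updates/creates resources in the template and
# leaves resources missing from the template in place.
New-AzResourceGroupDeployment `
    -ResourceGroupName "ivotalkstech-rg" `
    -Mode Incremental `
    -TemplateFile "ARMTemplateForFactory.json" `
    -TemplateParameterFile "P_ARMTemplateParametersForFactory.json"
```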

Next, add another Azure PowerShell task that will point to “pre-post-deployment-script.ps1” with script arguments for the target Data Factory and “-predeployment $true -deleteDeployment $false”.

Finally, we must add an Azure PowerShell task that starts the triggers again, pointing to “adf_trigger_toggling.ps1”.

If you have a pre-production environment, you can repeat these steps for it.

CI/CD test run

Let’s start by creating a new branch. You can either do this in Azure DevOps (go to your project’s repo, click “New branch”, add a name and click “Create”) or via Azure Data Factory Studio.

Then, for the test, let’s just add some objects to the branch, such as a new pipeline.

Then create and complete a pull request from the branch to the main branch.

This will automatically start the build pipeline (the continuous integration part).

You can check the pipeline by clicking on it.

By clicking on the published artifact we can see all the generated files.

Clicking on the job shows all the steps. We can see that everything completed successfully and quickly.

In the artifact feed we also store a link to the pipeline.

After the build pipeline completes (the continuous integration part), the release pipeline starts (the continuous delivery part). Since we’ve added a manual pre-deployment condition, it waits for approval.

Clicking on the pipeline shows the whole deployment process, and we can see that everything passed successfully.

Going to Azure Data Factory Studio, we can also see that the new object is there.

Enjoy the CI/CD setup!

Stay Awesome,
