Export and Import Azure Data Factory

Azure Data Factory is an Azure cloud ETL service for serverless data integration and orchestration. This article describes how to export and import Azure Data Factory between different Azure resource groups or environments.

Prerequisites
1. Azure subscription with access to deploy Azure resources.
2. Az PowerShell module installed on the local machine.

Export Azure Data Factory

Let’s first export the ARM template of an existing data factory. Follow the steps below.

  1. Connect to the Azure portal and open the source Azure Data Factory Studio.
  2. Navigate to the Manage tab.
  3. Click the Export button under ‘Export ARM Template’.
  4. This downloads the Azure Data Factory code in a file named arm_template.zip. Unzip the file. The files of interest are:
    • ARMTemplateForFactory.json – This file contains the entire Data Factory code.
    • ARMTemplateParametersForFactory.json – This file contains parameters for the Data Factory.
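The downloaded archive can also be extracted from PowerShell. A minimal sketch, assuming the browser saved the file to your Downloads folder (adjust the paths to your machine):

```powershell
# Assumed download location - change to where arm_template.zip actually landed
$zipPath = "$HOME\Downloads\arm_template.zip"
$outDir  = "$HOME\Downloads\arm_template"

# Extract the ARM template files from the archive
Expand-Archive -Path $zipPath -DestinationPath $outDir -Force

# List the two files we will work with in the next steps
Get-ChildItem -Path $outDir -Filter "ARMTemplate*ForFactory.json"
```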

Transform ARM template

Let’s change the factory parameter values in the ARMTemplateParametersForFactory.json file to match the target environment. The mandatory change in the parameter file is factoryName. It is also a good idea to review the other factory parameters in this file and adjust them for the target environment.


Note: The ‘Include global parameters in ARM template’ option is deprecated and global parameters are only exported through Git integration. You may need to manually recreate global parameters in the target data factory once the import is done.
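For illustration, a trimmed parameter file might look like the following after editing. The factory name and the linked-service connection string parameter are hypothetical examples; your file will list the parameters your factory actually uses:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": {
      "value": "adf-target-dev"
    },
    "AzureSqlLinkedService_connectionString": {
      "value": "Server=tcp:target-sql.database.windows.net;Database=targetdb;"
    }
  }
}
```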

Import Azure Data Factory

  1. Open PowerShell ISE and copy the script below into a new window. Ensure you have installed the Az module as mentioned in the prerequisites.
$templateFile = "Full path of ARMTemplateForFactory.json"
$parameterFile = "Full path of ARMTemplateParametersForFactory.json"
$deploymentName = "Name of the deployment"
$tenantId = "Tenant ID"
$subscriptionName = "Subscription name"
$resourceGroupName = "Name of the resource group"
$factoryName = "Target factory name"
$location = "Location of the resource group"

#Connect to Azure account
Connect-AzAccount -Tenant $tenantId -Subscription $subscriptionName

#Deploy Data Factory
New-AzResourceGroupDeployment `
  -Name $deploymentName `
  -ResourceGroupName $resourceGroupName `
  -TemplateFile $templateFile `
  -TemplateParameterFile $parameterFile `
  -Tag @{"key1"="value1"; "key2"="value2";}

#Generate the system-assigned managed identity for the Data Factory
Set-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $factoryName -Location $location -Force

  2. Provide values for the variables:
    • $templateFile and $parameterFile – full paths of the files we downloaded and transformed in the steps above.
    • $deploymentName – any name to identify this deployment.
    • $tenantId – tenant ID of the target subscription.
    • $subscriptionName – subscription name of the target resource group.
    • $resourceGroupName – target resource group.
    • $factoryName – name of the target Data Factory. Ensure that the name is the same one we used in the ARMTemplateParametersForFactory.json file.
    • $location – location of the target resource group.
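If you are unsure of the tenant or subscription values, you can look them up with the Az module before running the deployment. A quick sketch:

```powershell
# Sign in interactively, then inspect the current context
Connect-AzAccount
Get-AzContext | Select-Object Tenant, Subscription

# List all subscriptions (with their tenant IDs) your account can access
Get-AzSubscription | Select-Object Name, Id, TenantId
```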


That’s it! You should see the Data Factory in the target resource group if the above script executes without error.

We have seen how to export and import Azure Data Factory using PowerShell ISE and Azure Data Factory Studio in the Azure portal.

Pro tips:
1. Azure Data Factory export from the Azure portal may not include global parameters in the ARM template unless you have Git integration enabled.
2. If you want to edit data factory triggers before the deployment, you can edit them in the ARMTemplateForFactory.json file under the resources node.
3. If your organization has a policy requiring tags on Azure resources, include them through the -Tag parameter of the New-AzResourceGroupDeployment command.
4. You can refer to this post if you want to enable automatic publishing and deployment of the Data Factory code with Azure DevOps.
5. Learn how to export and import PowerApps package in just a few clicks.
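Regarding the tip on editing triggers: a trigger in ARMTemplateForFactory.json appears as a resource entry similar to the one below (the trigger and pipeline names are illustrative). You can, for example, change runtimeState between Started and Stopped before deploying:

```json
{
  "name": "[concat(parameters('factoryName'), '/DailyLoadTrigger')]",
  "type": "Microsoft.DataFactory/factories/triggers",
  "apiVersion": "2018-06-01",
  "properties": {
    "runtimeState": "Stopped",
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "DailyLoadPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```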

See more

Kunal Rathi

With over 13 years of experience in data engineering and analytics, I've assisted countless clients in gaining valuable insights from their data. As a dedicated supporter of Data, Cloud and DevOps, I'm excited to connect with individuals who share my passion for this field. If my work resonates with you, we can talk and collaborate.
