Dynamic Pipelines in Azure Data Factory

This article focuses on creating dynamic Azure Data Factory pipelines by parameterizing the static information embedded in them. With multiple environments such as Development, Staging, and Production, migrating pipelines between environments becomes difficult if they are statically designed. The process below shows how to make the data pipelines dynamic.

Note: To follow along with this article, you must have an Azure Data Factory instance and an Azure SQL Database created in your resource group.

Step 1: Create a configuration table in the Azure SQL Database.

  • Create a new table in the database, e.g. “dbo.configuration_dbtable”, with two columns: “config_key” and “config_value”.

  • Insert a row for each config_key/config_value pair into the table.

Configuration table in Azure SQL DB
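As a sketch, the configuration table from Step 1 might be created like this. The table and column names come from the article; the sample keys and values are illustrative assumptions only:

```sql
-- Configuration table holding key/value pairs that the pipeline reads at runtime
CREATE TABLE dbo.configuration_dbtable (
    config_key   NVARCHAR(100) NOT NULL PRIMARY KEY,
    config_value NVARCHAR(400) NOT NULL
);

-- Illustrative entries (assumed names, not from the article):
INSERT INTO dbo.configuration_dbtable (config_key, config_value)
VALUES
    ('blob_container',   'landing-zone'),
    ('dimension1_table', 'dbo.dim_customer');
```

Keeping one row per configuration item means adding a new environment-specific value later is an INSERT, not a pipeline change.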

Step 2: Create a new Pipeline.

  • Name of Pipeline: Config_test_pipeline

  • Add a Lookup activity to the blank canvas and give it a meaningful name, e.g. Lookup_config_table.

  • Configure the linked service and dataset for the Lookup activity.

  • Select the table “dbo.configuration_dbtable” created in Step 1.

Configuring Lookup Activity
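For reference, a Lookup activity configured as described above appears in the pipeline JSON roughly as follows. The dataset name is an assumption; note that firstRowOnly is set to false so that every row of the configuration table is returned, not just the first:

```json
{
  "name": "Lookup_config_table",
  "type": "Lookup",
  "typeProperties": {
    "source": { "type": "AzureSqlSource" },
    "dataset": {
      "referenceName": "ds_configuration_dbtable",
      "type": "DatasetReference"
    },
    "firstRowOnly": false
  }
}
```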

Step 3: Create Filter activities for the Blob container and Dimension1 sources, and connect the Lookup activity to the filters. Then add Copy activities and configure their source and sink datasets to connect to the Blob and Dimension1 sources.

Each filter has two settings: Items, which references the output of the Lookup activity over the configuration table created in the SQL database, and Condition, which selects the configuration rows that the filter passes through.
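As a sketch, the two settings of the Blob-container filter could look like this in the pipeline JSON. The filter name and the config_key value 'blob_container' are illustrative assumptions; the expressions use the Lookup activity name from Step 2:

```json
{
  "name": "Filter_blob_container",
  "type": "Filter",
  "typeProperties": {
    "items": {
      "value": "@activity('Lookup_config_table').output.value",
      "type": "Expression"
    },
    "condition": {
      "value": "@equals(item().config_key, 'blob_container')",
      "type": "Expression"
    }
  }
}
```

The Dimension1 filter would be identical except for its name and the config_key value it matches in the condition.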