
Azure Data Factory Pipeline Trigger


Azure Data Factory (ADF) is a fully managed, serverless data integration service designed to handle the ingestion, preparation, and transformation of large volumes of data. Within a single Azure subscription, you can have one or more instances of Azure Data Factory, referred to as data factories. Azure Data Factory is built from several fundamental components, one of which is the pipeline. In Azure Data Factory and Azure Synapse, a single execution of a pipeline is referred to as a pipeline run.

Pipeline runs are typically initiated by passing arguments to the parameters defined within the pipeline. You can trigger the execution of a pipeline either manually or by using a trigger mechanism, and this article covers both methods. In our example, a Web MVC application shows a list of pipelines obtained from the Azure API, and each of them can be executed by name. When the “Run” button is clicked, the status goes through a series of changes: initially it displays as “InProgress,” then transitions to “Queued,” and finally shows as “Succeeded.” It can also end in the “Failed” status.

Now, let’s discuss the underlying logic used to initiate the pipeline and update its status. Prior to triggering the pipeline, we retrieve all the pipeline names from the Azure API within the controller.

In this case, all the data will be populated within the Pipeline entity to be displayed in the grid.
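A minimal controller sketch of this step might look as follows. This is an illustration, not the original code: the controller name `PipelineController` and the exact `ADFServices` method signature are assumptions.

```csharp
using Microsoft.AspNetCore.Mvc;

public class PipelineController : Controller
{
    private readonly ADFServices _adfServices;

    public PipelineController(ADFServices adfServices)
    {
        _adfServices = adfServices;
    }

    // Fetch all pipeline names from the Azure API and
    // populate the Pipeline entities shown in the grid.
    public async Task<IActionResult> Index()
    {
        List<Pipeline> pipelines = await _adfServices.GetPipelines();
        return View(pipelines);
    }
}
```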

Following that, when the “Run” button is clicked, the pipeline is triggered and a “runId” is generated. The controller then uses this “runId” to fetch the pipeline’s updated status. The pipelineId passed as a parameter determines which pipeline name to execute.
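Sketched as a controller action (the action shape and the `RunPipeline`/`GetPipelineStatus` method names are assumptions for illustration):

```csharp
// Triggers the selected pipeline, then reads back its status by runId.
[HttpPost]
public async Task<IActionResult> Run(string pipelineName)
{
    // Trigger the pipeline; Azure returns a runId for this execution.
    string runId = await _adfServices.RunPipeline(pipelineName);

    // Use the runId to read the current status
    // (InProgress, Queued, Succeeded, or Failed).
    string status = await _adfServices.GetPipelineStatus(runId);

    return Json(new { runId, status });
}
```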

In the above, we rely on the ADFServices class, a service class exposing functions such as GetPipelines and GetToken, together with the Pipeline entity and PipelineParam models that support the implementation. Feel free to adapt these to your own preferences or requirements.

We will now create a Demo Web MVC application to demonstrate how to trigger a pipeline.

Configure the Demo Web App:

First, we need to install a few NuGet packages.
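At minimum, the example needs the RestSharp client and a JSON serializer; the exact package set in the original may differ:

```shell
dotnet add package RestSharp
dotnet add package Newtonsoft.Json
```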

For a clearer picture of the configuration, here is the relevant <ItemGroup> section of the “Web” project’s .csproj file.
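An illustrative fragment (the version numbers are examples, not taken from the original project):

```xml
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  <PackageReference Include="RestSharp" Version="110.2.0" />
</ItemGroup>
```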

Make sure to configure all your pipeline credentials in the appsettings.json file by including the following information:
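A sketch of the settings section. The section name PipelineConfigure comes from the article; the individual key names are assumptions covering the values the Azure Data Factory REST API requires:

```json
{
  "PipelineConfigure": {
    "TenantId": "<your-tenant-id>",
    "ClientId": "<your-client-id>",
    "ClientSecret": "<your-client-secret>",
    "SubscriptionId": "<your-subscription-id>",
    "ResourceGroup": "<your-resource-group>",
    "FactoryName": "<your-data-factory-name>"
  }
}
```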

To retrieve these JSON values from appsettings.json, we need a class with matching properties. Define it as follows:
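Assuming the appsettings keys above, the binding class would be:

```csharp
// Bound to the "PipelineConfigure" section of appsettings.json.
public class PipelineConfigure
{
    public string TenantId { get; set; }
    public string ClientId { get; set; }
    public string ClientSecret { get; set; }
    public string SubscriptionId { get; set; }
    public string ResourceGroup { get; set; }
    public string FactoryName { get; set; }
}
```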

Next, let’s register the application services. Below is the corresponding configuration:
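A minimal Program.cs sketch for a .NET 6+ MVC app, assuming the PipelineConfigure settings class described above:

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllersWithViews();

// Bind the "PipelineConfigure" section of appsettings.json.
builder.Services.Configure<PipelineConfigure>(
    builder.Configuration.GetSection("PipelineConfigure"));

// Register the Azure Data Factory service class.
builder.Services.AddScoped<ADFServices>();

var app = builder.Build();
app.MapDefaultControllerRoute();
app.Run();
```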

To execute an Azure pipeline through the Azure Data Factory REST API, we first need to generate a token. This is done with an HTTP request via RestSharp. For this purpose, we use a common utility method that wraps client.ExecuteAsync(request), so that every call goes through the same asynchronous execution and error handling instead of calling client.ExecuteAsync(request) directly.
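One plausible shape for such a wrapper (the extension-method name is an assumption; the original utility may differ):

```csharp
using RestSharp;

public static class RestClientExtensions
{
    // Executes the request asynchronously and fails loudly
    // instead of silently returning a null or error body.
    public static async Task<string> ExecuteRequestAsync(
        this RestClient client, RestRequest request)
    {
        RestResponse response = await client.ExecuteAsync(request);
        if (!response.IsSuccessful || response.Content is null)
        {
            throw new HttpRequestException(
                $"Request failed: {response.StatusCode} {response.ErrorMessage}");
        }
        return response.Content;
    }
}
```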

Next, we will generate a service class ADFServices responsible for executing various Azure Data Factory pipeline APIs. Additionally, we will incorporate the PipelineConfigure dependency property into this class by means of dependency injection.
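A skeleton of the class, assuming the PipelineConfigure settings are injected via the options pattern:

```csharp
using Microsoft.Extensions.Options;

public class ADFServices
{
    // Holds the pipeline credentials bound from appsettings.json.
    private readonly PipelineConfigure _config;

    public ADFServices(IOptions<PipelineConfigure> options)
    {
        _config = options.Value;
    }

    // GetToken, GetPipelines, and the run/status methods
    // are filled in over the following sections.
}
```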

Next, to make requests to the Azure Data Factory REST API, we need a separate token for authenticating all Azure pipeline API calls. Let’s create a model for receiving the token response.
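A sketch of the model, matching the fields the Azure AD token endpoint returns:

```csharp
using Newtonsoft.Json;

// Maps the JSON body returned by the Azure AD token endpoint.
public class TokenResponse
{
    [JsonProperty("token_type")]
    public string TokenType { get; set; }

    [JsonProperty("expires_in")]
    public string ExpiresIn { get; set; }

    [JsonProperty("access_token")]
    public string AccessToken { get; set; }
}
```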

Now, let’s proceed by following a few steps:

Generate Token:

We use the RestSharp RestClient library to call the Azure REST API and obtain a token using the pipeline credentials.
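A sketch of the method as a member of ADFServices, assuming a `_config` field holding the bound PipelineConfigure settings. It posts a client-credentials grant to the Azure AD token endpoint:

```csharp
// Requests a service-principal token for the Azure management API.
public async Task<string> GetToken()
{
    var client = new RestClient(
        $"https://login.microsoftonline.com/{_config.TenantId}/oauth2/token");

    var request = new RestRequest("", Method.Post);
    request.AddParameter("grant_type", "client_credentials");
    request.AddParameter("client_id", _config.ClientId);
    request.AddParameter("client_secret", _config.ClientSecret);
    request.AddParameter("resource", "https://management.azure.com/");

    RestResponse response = await client.ExecuteAsync(request);
    var token = JsonConvert.DeserializeObject<TokenResponse>(response.Content);
    return token.AccessToken;
}
```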

The response looks like:
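An illustrative body (the token value is a truncated placeholder):

```json
{
  "token_type": "Bearer",
  "expires_in": "3599",
  "resource": "https://management.azure.com/",
  "access_token": "eyJ0eXAiOiJKV1QiLCJhbGciOi..."
}
```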

Retrieve Pipelines:

Once the token is generated, the next step is to retrieve the list of pipelines; each of these pipelines can then be executed using that token. Let’s create a model for receiving the response.
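A sketch of the models. The Pipeline entity is named in the article; the exact set of properties is an assumption based on the fields the list endpoint returns:

```csharp
// Wrapper for the "value" array in the list-pipelines response.
public class PipelineListResponse
{
    public List<Pipeline> Value { get; set; }
}

public class Pipeline
{
    public string Id { get; set; }
    public string Name { get; set; }
    public string Type { get; set; }
}
```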

Now, let’s make a request to the Azure API using RestSharp to retrieve the list of pipelines.
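A sketch of the method, again assuming the `_config` field. It calls the documented Azure Data Factory “Pipelines – List By Factory” endpoint with the bearer token:

```csharp
// Lists all pipelines in the configured data factory.
public async Task<List<Pipeline>> GetPipelines()
{
    string token = await GetToken();

    string url = $"https://management.azure.com/subscriptions/{_config.SubscriptionId}" +
                 $"/resourceGroups/{_config.ResourceGroup}" +
                 $"/providers/Microsoft.DataFactory/factories/{_config.FactoryName}" +
                 "/pipelines?api-version=2018-06-01";

    var client = new RestClient(url);
    var request = new RestRequest();
    request.AddHeader("Authorization", $"Bearer {token}");

    RestResponse response = await client.ExecuteAsync(request);
    var result = JsonConvert.DeserializeObject<PipelineListResponse>(response.Content);
    return result.Value;
}
```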

The response looks like:
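An illustrative body, trimmed to the fields used here:

```json
{
  "value": [
    {
      "name": "Get Contacts",
      "type": "Microsoft.DataFactory/factories/pipelines"
    }
  ]
}
```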

Here, “name”: “Get Contacts” is the pipeline name.

Run Pipeline:

To trigger a pipeline, we first need to obtain a runId. To achieve this, we make a request using the following syntax. Let’s also create a model for receiving the response.
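A sketch of the model and method, assuming the `_config` field. It calls the documented “Pipelines – Create Run” endpoint, which returns the runId:

```csharp
// Maps the JSON body returned by the createRun endpoint.
public class RunResponse
{
    public string RunId { get; set; }
}

// Triggers a pipeline run and returns the generated runId.
public async Task<string> RunPipeline(string pipelineName)
{
    string token = await GetToken();

    string url = $"https://management.azure.com/subscriptions/{_config.SubscriptionId}" +
                 $"/resourceGroups/{_config.ResourceGroup}" +
                 $"/providers/Microsoft.DataFactory/factories/{_config.FactoryName}" +
                 $"/pipelines/{pipelineName}/createRun?api-version=2018-06-01";

    var client = new RestClient(url);
    var request = new RestRequest("", Method.Post);
    request.AddHeader("Authorization", $"Bearer {token}");

    RestResponse response = await client.ExecuteAsync(request);
    var result = JsonConvert.DeserializeObject<RunResponse>(response.Content);
    return result.RunId;
}
```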

The response looks like:
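An illustrative body (the GUID is a placeholder):

```json
{
  "runId": "2f7fdb90-5df1-4b8e-ac2f-064cfa58202b"
}
```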

Get Pipeline Status:

After triggering the pipeline, we retrieve the updated status of the pipeline run. This can be done by making a request using the following syntax:
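A sketch of the method and its response model, assuming the `_config` field. It calls the documented “Pipeline Runs – Get” endpoint with the runId:

```csharp
// Maps the fields of the pipeline-run response used here.
public class PipelineRun
{
    public string RunId { get; set; }
    public string PipelineName { get; set; }
    public string Status { get; set; }
}

// Reads the current status of a pipeline run by its runId.
public async Task<string> GetPipelineStatus(string runId)
{
    string token = await GetToken();

    string url = $"https://management.azure.com/subscriptions/{_config.SubscriptionId}" +
                 $"/resourceGroups/{_config.ResourceGroup}" +
                 $"/providers/Microsoft.DataFactory/factories/{_config.FactoryName}" +
                 $"/pipelineruns/{runId}?api-version=2018-06-01";

    var client = new RestClient(url);
    var request = new RestRequest();
    request.AddHeader("Authorization", $"Bearer {token}");

    RestResponse response = await client.ExecuteAsync(request);
    var run = JsonConvert.DeserializeObject<PipelineRun>(response.Content);
    return run.Status;
}
```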

The response looks like:
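An illustrative body, trimmed to the fields used here (the runId is a placeholder):

```json
{
  "runId": "2f7fdb90-5df1-4b8e-ac2f-064cfa58202b",
  "pipelineName": "Get Contacts",
  "status": "Succeeded"
}
```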

This retrieves the status of the pipeline run, which can be one of the following: Queued, InProgress, Succeeded, Failed, Canceling, or Cancelled.

Conclusion:

Azure Data Factory pipelines can be scheduled to run at specific intervals or triggered based on events. Pipeline runs provide monitoring and logging capabilities, allowing you to track the progress, performance, and status of your data integration workflows.

By Vivasoft Team