Deploy Azure Data Factory using PowerShell

Azure Data Factory is a cloud-based, scalable orchestration service. Currently I am using Azure Data Factory (ADF) to coordinate and schedule a large-scale ETL process. Building a data factory is a fairly easy process: it consists of various JSON definition files representing linked services, data sets, and pipelines that are connected together to perform an action. For simple processes it is not very difficult to create the JSON definition files directly in the Azure Portal. However, when you are dealing with very large, complex processes that you are trying to orchestrate together, it can become a bit unruly to manage and deploy all of those files manually.

Visual Studio

Thankfully you can install the Azure Data Factory Tools for Visual Studio. Installing this extension gives you a data factory project template and the ability to publish directly to the portal.

With that being said, I have occasionally had issues publishing from Visual Studio, which is the reason for writing this post. Just like everything else, we can use some simple PowerShell scripts to deploy ADF JSON definition files. These are very basic scripts that push updates to the data factory; they will not validate the JSON or perform any of the reference validation that the Data Factory tools do. I mainly use the data set portion of the script because that is where the largest number of files exists.

I still use Visual Studio and an ADF project to author my JSON scripts; the script files are organized into LinkedService, Pipeline, and Table folders.

One drawback of the ADF project template is that the file layout on disk does not mirror the folder structure visible in Visual Studio. When you look in the actual project folder on disk, you will see all of the JSON scripts mixed together. With every script landing in a single folder, I have found that following a verbose naming convention is very helpful.

I use a naming convention as follows:

  • LinkedService_[Type of Service Abbreviation]_[Service Name]
    • For example, if I have an Azure Data Lake Storage account called SampleDataLake, the file name would be "LinkedService_ADLS_SampleDataLake"
  • DataSet_[Type of Data Set]_[DataSet Name]
    • For example, if I have an Azure SQL Data Warehouse with an Employee dimension table named Employee_D, the file name would be "DataSet_ASDW_Employee_D"
  • Pipeline_[Description]
    • For example, if I have a pipeline which creates my employee dimension, the file would be named "Pipeline_Employee_Dimension_ETL"

As you can see, this is a very explicit naming convention that makes it easy to tell what each JSON script is meant to do. It is also very helpful when writing the PowerShell that loops over specific types of files to deploy.
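
Because every file name encodes its type in the prefix, you can sanity-check the project folder before deploying anything. Here is a minimal sketch, assuming the scripts sit in the same placeholder folder used in the deployment loops later in this post:

# Count the JSON scripts by their naming-convention prefix (LinkedService, DataSet, Pipeline)
Get-ChildItem "[ProjectFolder]" -Filter "*.json" |
    Group-Object { $_.BaseName.Split("_")[0] } |
    Select-Object Name, Count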

PowerShell Scripts

The first step is to authenticate your PowerShell session with the Azure account you would like to deploy your data factory to.

Add-AzureRmAccount
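
If your account has access to more than one subscription, it can also help to explicitly select the subscription you intend to deploy to. A minimal sketch (the subscription name is a placeholder):

# List the subscriptions available to the signed-in account
Get-AzureRmSubscription

# Select the subscription the data factory should be deployed to
Select-AzureRmSubscription -SubscriptionName "MySubscription"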

After you are authenticated, you can now create your actual data factory.

New-AzureRmDataFactory -ResourceGroupName "ADF" -Name "SampleFactory" -Location "WestUS"
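
New-AzureRmDataFactory assumes the resource group already exists. If it does not, you can create it first with the same name and location; this extra step is my own addition rather than part of the original deployment script:

# Create the resource group first if it does not already exist
New-AzureRmResourceGroup -Name "ADF" -Location "WestUS"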

Once you have created the data factory, you can use that same resource group name and factory name to deploy the rest of the files. Remember to deploy them in the following order: linked services first, then data sets, and finally pipelines. Because the files follow the naming convention above, the scripts below can use simple filtering on the file name to deploy the JSON files based on the type they define. The following three loops, beginning with the linked services, deploy each file from the project folder to the newly created data factory.

# Deploy linked services first
foreach($file in Get-ChildItem "[ProjectFolder]" -Filter "LinkedService*")
{
    New-AzureRmDataFactoryLinkedService -ResourceGroupName "ADF" -DataFactoryName "SampleFactory" -Name $file.BaseName -File $file.FullName -Force | Format-List
}

# Then deploy data sets
foreach($file in Get-ChildItem "[ProjectFolder]" -Filter "DataSet*")
{
    New-AzureRmDataFactoryDataset -ResourceGroupName "ADF" -DataFactoryName "SampleFactory" -Name $file.BaseName -File $file.FullName -Force | Format-List
}

# Finally deploy pipelines
foreach($file in Get-ChildItem "[ProjectFolder]" -Filter "Pipeline*")
{
    New-AzureRmDataFactoryPipeline -ResourceGroupName "ADF" -DataFactoryName "SampleFactory" -Name $file.BaseName -File $file.FullName -Force | Format-List
}
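
After the loops complete, you can do a quick sanity check by listing what actually landed in the factory. A minimal sketch using the matching Get cmdlets:

# Confirm the deployed linked services, data sets, and pipelines
Get-AzureRmDataFactoryLinkedService -ResourceGroupName "ADF" -DataFactoryName "SampleFactory"
Get-AzureRmDataFactoryDataset -ResourceGroupName "ADF" -DataFactoryName "SampleFactory"
Get-AzureRmDataFactoryPipeline -ResourceGroupName "ADF" -DataFactoryName "SampleFactory"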

Conclusion

Azure Data Factory is a great service for orchestrating large-scale jobs. I would absolutely recommend getting the ADF tooling for Visual Studio; it is definitely a step up from trying to do all the authoring within the Azure Portal. The goal of this post is not to discourage use of the tooling, but to provide a quick alternative for deploying the scripts to Azure.

Resolving Syntax Error Executing U-SQL on Azure Data Lake Analytics from Azure Data Factory

Problem

While working with Azure Data Lake Analytics (ADLA) and Azure Data Factory (ADF), I ran into a strange syntax error when trying to run a DataLakeAnalyticsU-SQL activity in ADF. Before integrating a script with ADF, I always test it first by submitting the job directly to ADLA. From Visual Studio I created a new U-SQL project, wrote the desired script, and then submitted the job to ADLA. The script ran correctly, and at this point I was ready to run the script with ADF (run the U-SQL script on ADLA from ADF).
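
If you prefer not to go through Visual Studio for that initial test, the same script can also be submitted to ADLA from PowerShell. A minimal sketch (the account name and script path are placeholders, and this is my own addition rather than the workflow described above):

# Submit the U-SQL script directly to the Data Lake Analytics account for a quick test run
Submit-AzureRmDataLakeAnalyticsJob -Account "sampledatalakeanalytics" -Name "TestRun" -ScriptPath "C:\Scripts\SampleScript.usql"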

The syntax error I received looked something like this:

[Screenshot: the syntax error returned by the DataLakeAnalyticsU-SQL activity]

Solution

Fortunately, I have been in contact with some of the ADF and ADL team members, who have been excellent in helping to resolve any issues I run into.

Continue reading “Resolving Syntax Error Executing U-SQL on Azure Data Lake Analytics from Azure Data Factory”

Querying Azure SQL Database from an Azure Data Lake Analytics U-SQL Script

Introduction

With the public preview of Azure Data Lake Analytics and the U-SQL scripting language now available, big data processing has become simpler and less daunting for the non-Hadooper. Beyond the simplicity, the ability to query external data sources such as Azure SQL Database, Azure SQL Data Warehouse, and SQL Server running on an Azure VM from within a U-SQL script makes a very compelling case to jump on board. The focus of this post is my experience getting that external connection up and running, with Azure SQL Database serving as my external source.

Prerequisites

Continue reading “Querying Azure SQL Database from an Azure Data Lake Analytics U-SQL Script”

PowerShell Validation Script for Azure RemoteApp Image

Azure RemoteApp allows you to host Windows apps on any device. I recently had to set up a proof of concept using Azure RemoteApp and an internal client-server application. The actual process is pretty straightforward if you are familiar with capturing an image of an Azure VM or you already have an image you would like to use.

If you are starting this process from scratch, make sure you use the Windows Server 2012 Remote Desktop Session Host image; it is a base image valid for use with Azure RemoteApp. This image also contains the RemoteApp PowerShell validation script, which must be run before sysprepping the VM and capturing the image. Also note that this image seems to be available only if you create the Azure VM from the old Azure Portal.

Continue reading “PowerShell Validation Script for Azure RemoteApp Image”