There are a variety of methods for creating an instance of Azure Data Factory. This blog explores how you can create an ADF instance using the Azure portal, PowerShell, and Azure Resource Manager (ARM) templates.
The first step is to create the Azure Data Factory instance. This can be performed within the Azure portal: click the New icon, point to Databases, and then click Data Factory.
At this point the provisioning blade for ADF appears. It is in this blade that you define a name for the Data Factory instance. You then assign the instance to a subscription that you own.
The Resource Group option lets you decide whether your ADF instance will reside in a resource group that already exists or in a new resource group that you create. Resource groups are important containers: billing can be viewed at the resource group level, you can define access control on a resource group, and the resources held within it can communicate with each other without the need to write complex scripts to further define the communication between services.
The resource group can include all the resources for the solution, or only those resources that you want to manage as a group. You decide how to allocate resources to resource groups based on what makes the most sense for your organization. Therefore, it is important to understand the objective for creating a resource group and to plan your groups appropriately.
Finally, assign the Data Factory instance to a region of your choice.
You can also deploy Azure Data Factory instances using PowerShell. This requires that Azure PowerShell is installed on your computer. Once it is set up, you can create a Data Factory instance from the command line: set the Azure context to a subscription, then create the Data Factory instance by specifying the resource group that hosts it, followed by the name and the location.
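As a sketch of those steps, using the current Az PowerShell module (the cmdlet names are from the Az module; the subscription placeholder, resource group name, factory name, and region are assumptions, so adapt them to your environment):

```powershell
# Sign in and set the Azure context to the subscription you want to use
Connect-AzAccount
Set-AzContext -SubscriptionId "<your-subscription-id>"

# Create (or reuse) the resource group that will host the instance
New-AzResourceGroup -Name "adf-demo-rg" -Location "North Europe"

# Create the Data Factory instance: resource group, then name, then location
New-AzDataFactoryV2 -ResourceGroupName "adf-demo-rg" `
                    -Name "adf-demo-factory" `
                    -Location "North Europe"
```

If you installed the older AzureRM module instead, the equivalent cmdlets carry the `AzureRm` prefix (for example `Set-AzureRmContext`).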
Azure Resource Manager templates
There is an excellent article from the product group that will enable you to use ARM templates in a variety of Data Factory scenarios. I would highly recommend that you read this.
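To illustrate the shape of such a template, a minimal ARM template that deploys an empty Data Factory instance might look like the following (the parameter names are my own, and the `2018-06-01` API version is an assumption; refer to the article above for complete scenarios):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": { "type": "string" },
    "location": { "type": "string", "defaultValue": "[resourceGroup().location]" }
  },
  "resources": [
    {
      "type": "Microsoft.DataFactory/factories",
      "apiVersion": "2018-06-01",
      "name": "[parameters('factoryName')]",
      "location": "[parameters('location')]",
      "properties": {}
    }
  ]
}
```

Saved as a file, this template can be deployed to a resource group with `New-AzResourceGroupDeployment -ResourceGroupName <group> -TemplateFile <path>`.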
With the Data Factory instance created, it is then time to create the relevant linked services, datasets, and pipelines that perform the data orchestration activities. These will be covered individually in other blog posts.