Azure Data Factory is a cloud-based ETL and data integration service that lets you create data-driven workflows for orchestrating data movement and transforming data at scale. With it, you can design and schedule workflows (called pipelines) that ingest data from disparate data stores, then process and transform that data using compute services such as HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning.
For example, a Copy activity specifies a source dataset and a sink (destination) dataset, such as an Azure SQL Database, to which the data is copied.
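To make the Copy activity concrete, here is a minimal sketch of the kind of JSON definition such a pipeline uses, expressed as a Python dict. The pipeline name, dataset names ("BlobInput", "SqlOutput"), and the exact source/sink type strings are illustrative assumptions, not taken from a real factory:

```python
# Illustrative sketch of a Copy-activity pipeline definition.
# Names such as "BlobInput" and "SqlOutput" are hypothetical dataset references.
copy_pipeline = {
    "name": "CopyBlobToSqlPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSql",
                "type": "Copy",  # the Copy activity moves data from source to sink
                "inputs": [{"referenceName": "BlobInput", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SqlOutput", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "BlobSource"},  # where data is read from
                    "sink": {"type": "SqlSink"},       # where data is written to
                },
            }
        ]
    },
}

activity = copy_pipeline["properties"]["activities"][0]
print(activity["type"])                          # → Copy
print(activity["outputs"][0]["referenceName"])   # → SqlOutput
```

In practice a definition like this is authored in the Data Factory UI or deployed through its REST API or SDKs; the key idea is simply that the activity names a source and a sink dataset.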