How do I access data by using the other 80 dataset types in Data Factory?

For source and sink, the Mapping Data Flow feature now supports Azure SQL Database, Azure SQL Data Warehouse, delimited text files from Azure Blob storage or Azure Data Lake Storage Gen2, and Parquet files from Azure Blob storage or Data Lake Storage Gen2.

Use the Copy activity to stage data from any of the other connectors, and then use a Data Flow activity to transform the data after it's staged. For example, your pipeline might copy data into Blob storage first, and then transform it with a Data Flow activity whose data flow uses the staged Blob storage dataset as its source.
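If you author pipelines programmatically, the stage-then-transform pattern can be sketched with the azure-mgmt-datafactory Python SDK roughly as below. This is a minimal, illustrative sketch, not a complete implementation: the subscription, resource group, factory, dataset, and data flow names (for example, `SalesforceDataset`, `StagingBlobDataset`, and `TransformStaged`) are placeholders you would replace with your own resources, and Salesforce stands in for any connector that Mapping Data Flow doesn't read directly.

```python
# Sketch: stage data with a Copy activity, then run a Data Flow activity on it.
# Assumes azure-identity and azure-mgmt-datafactory are installed and that the
# referenced datasets and data flow already exist in the factory.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ActivityDependency,
    BlobSink,
    CopyActivity,
    DataFlowReference,
    DatasetReference,
    ExecuteDataFlowActivity,
    PipelineResource,
    SalesforceSource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# 1. Copy activity: stage data from a connector that Mapping Data Flow doesn't
#    support natively (Salesforce, as an example) into Blob storage.
copy_to_staging = CopyActivity(
    name="CopyToStaging",
    inputs=[DatasetReference(reference_name="SalesforceDataset")],
    outputs=[DatasetReference(reference_name="StagingBlobDataset")],
    source=SalesforceSource(),
    sink=BlobSink(),
)

# 2. Data Flow activity: transform the staged data. The referenced data flow
#    uses the Blob storage dataset as its source.
transform_staged = ExecuteDataFlowActivity(
    name="TransformStagedData",
    data_flow=DataFlowReference(reference_name="TransformStaged"),
    depends_on=[
        ActivityDependency(activity="CopyToStaging", dependency_conditions=["Succeeded"])
    ],
)

# Publish a pipeline that runs the copy first, then the transformation.
pipeline = PipelineResource(activities=[copy_to_staging, transform_staged])
adf_client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "StageThenTransform", pipeline
)
```

The same two-activity structure can be built in the Data Factory UI or expressed as pipeline JSON; the key design point is the dependency on the Copy activity succeeding before the Data Flow activity reads the staged data.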