The simplest ETL stack in Azure — Data Engineering
<p>Starting a Data Engineering project can be daunting, especially for newcomers. One of the first challenges is understanding the flow of activities needed to start processing data effectively. This post walks through an example of a basic ETL process that ingests Excel files, orchestrated with Azure Data Factory, Azure Databricks, and Azure Data Lake Storage.</p>
<p><img alt="" src="https://miro.medium.com/v2/resize:fit:700/1*6kUmwAAEjMD7clBmb3dr2Q.png" style="height:541px; width:700px" /></p>
<p><a href="https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/azure-databricks-modern-analytics-architecture" rel="noopener ugc nofollow" target="_blank">Source</a></p>
<p>I’m not going to explain how to create the resources in Azure, since creating them in the Azure Portal is mostly self-explanatory; instead, we’ll look at the configuration needed in each resource.</p>
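<p>To give an idea of what the Databricks step in this kind of pipeline typically looks like, here is a minimal sketch that reads an Excel file from the Data Lake and writes it out as a Delta table. The storage account, container, and file paths are placeholders, and it assumes the <code>com.crealytics:spark-excel</code> library is installed on the cluster; this is an illustrative sketch, not the exact notebook used in the post.</p>
<pre>
# Minimal sketch (assumptions: placeholder account/container names, spark-excel installed on the cluster).
# In a Databricks notebook the `spark` session is already available.

storage_account = "mystorageaccount"   # hypothetical storage account
raw_container = "raw"                  # hypothetical landing container
curated_container = "curated"          # hypothetical curated container

input_path = f"abfss://{raw_container}@{storage_account}.dfs.core.windows.net/sales/sales_2023.xlsx"
output_path = f"abfss://{curated_container}@{storage_account}.dfs.core.windows.net/delta/sales"

# Read the Excel file using the spark-excel data source.
df = (
    spark.read.format("com.crealytics.spark.excel")
    .option("header", "true")        # first row contains column names
    .option("inferSchema", "true")   # let Spark infer column types
    .load(input_path)
)

# Write the data to the curated zone of the Data Lake as a Delta table.
df.write.format("delta").mode("overwrite").save(output_path)
</pre>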
<p><a href="https://medium.com/@jogacolhue/the-most-basic-etl-stack-in-azure-data-engineering-6bf5a7197ee1"><strong>Website</strong></a></p>