How to pass parameters between Data Factory and Databricks
<p>When working with data in Azure, running a Databricks notebook as part of a Data Factory pipeline is a common scenario. There are several reasons to include Databricks as an activity in your Data Factory flow, for example when the built-in Data Factory activities do not meet your data pre-processing or transformation requirements. In this blog I explain how to pass parameters between your Data Factory pipeline and Databricks notebook, so you can easily use variables from your Data Factory pipeline in your Databricks notebooks and vice versa, and integrate these components in your data workflow.</p>
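<p>As a minimal sketch of the Databricks side of this exchange (the parameter name <code>input_path</code>, the result key <code>rowCount</code>, and the placeholder transformation are hypothetical), a notebook can read a value supplied by the Data Factory Notebook activity through <code>dbutils.widgets</code> and hand a result back to the pipeline with <code>dbutils.notebook.exit</code>:</p>
<pre><code>
# Databricks notebook cell (Python).
# `dbutils` is provided by the Databricks runtime; no import is needed for it.
import json

# Declare a text widget so the notebook also runs standalone with a default value.
# When Data Factory runs the notebook, the value set under the activity's
# "Base parameters" overrides this default. "input_path" is a hypothetical name.
dbutils.widgets.text("input_path", "")
input_path = dbutils.widgets.get("input_path")

# ... read and transform the data found at input_path ...
row_count = 42  # placeholder result for illustration only

# Return a string value to the Data Factory pipeline; serializing a small JSON
# object is a convenient way to pass back more than one field.
dbutils.notebook.exit(json.dumps({"rowCount": row_count}))
</code></pre>
<p>On the Data Factory side, the parameter is supplied in the Notebook activity's <em>Base parameters</em> section, and the value returned by <code>dbutils.notebook.exit</code> can be read in a subsequent activity with an expression such as <code>@activity('Run Notebook').output.runOutput</code> (where 'Run Notebook' stands in for the name of your Notebook activity).</p>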
<p><a href="https://medium.com/azure-tutorials/how-to-pass-parameters-between-your-data-factory-pipeline-and-databricks-notebook-e59b4ca6b015"><strong>Read the full walkthrough on Medium</strong></a></p>