Converting Stored Procedures to Databricks
<p>Special thanks to co-author <a href="https://medium.com/u/5a65aefd1fac" rel="noopener" target="_blank">Kyle Hale</a>, Sr. Specialist Solutions Architect at Databricks.</p>
<h1>Introduction</h1>
<p>A stored procedure is an executable set of commands recorded as an object in a relational database management system. More generally, it is simply code that can be triggered on demand or executed on a cadence. Once defined, the routine can be referenced as an object within the host system.</p>
<p>Databricks does not have an explicit “stored procedure” object; however, the concept is fully supported, and in a way that gives engineers more flexibility than cloud data warehouses. Between notebooks, JARs, wheels, scripts, and the full power of SQL, Python, R, Scala, and Java, Databricks is well suited to making your stored procedures lakehouse-friendly. These executables should be run as tasks within a Databricks job. Jobs can be triggered by external orchestrators, executed with the built-in CRON scheduler, or run continuously with Spark Structured Streaming or Delta Live Tables.</p>
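<p>As a minimal sketch of what this can look like in practice, the hypothetical Python snippet below packages a typical stored-procedure workload (aggregate a day of raw data and merge it into a target Delta table) as a function in a notebook, which a Databricks job task can then call on a schedule. The table names, column names, and function name are illustrative assumptions, not part of the original article.</p>
<pre>
# Hypothetical sketch: stored-procedure-style logic as a notebook function
# that a Databricks job task can run. Table and function names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided as `spark` in a Databricks notebook

def refresh_daily_sales(run_date: str) -> None:
    """Equivalent of a warehouse stored procedure: read, transform, merge."""
    daily = (
        spark.table("sales_raw")
        .where(F.col("order_date") == run_date)
        .groupBy("store_id")
        .agg(F.sum("amount").alias("total_amount"))
    )
    daily.createOrReplaceTempView("daily_sales")

    # Upsert into the target Delta table, much like a MERGE inside a procedure body.
    spark.sql("""
        MERGE INTO sales_clean AS t
        USING daily_sales AS s
        ON t.store_id = s.store_id
        WHEN MATCHED THEN UPDATE SET t.total_amount = s.total_amount
        WHEN NOT MATCHED THEN INSERT *
    """)

# A job task (or an orchestrator-triggered run) would simply call:
refresh_daily_sales("2023-01-15")
</pre>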