Spark-related quickies
<p>The answer is no. It lasts only for the duration of the Spark application's run (in Databricks, only while the cluster is up). As soon as the cluster terminates, the setting is lost, and you can't access it after the cluster restarts.</p>
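<p>As a minimal PySpark sketch of this behavior, a property set at runtime with <em>spark.conf.set</em> is scoped to the current session only. The property and value below are illustrative, not a recommendation:</p>
<pre>
# PySpark sketch: runtime configuration is scoped to the current session.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("config-demo").getOrCreate()

# Set a SQL property for this session only (illustrative key/value).
spark.conf.set("spark.sql.shuffle.partitions", "64")

# Reading it back within the same session works.
print(spark.conf.get("spark.sql.shuffle.partitions"))  # 64

# Once the application (or the Databricks cluster) stops, this setting
# is gone; a new session starts again from the defaults.
</pre>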
<h2>How do you work around this constraint and set a configuration property permanently for all Spark applications that run on a cluster?</h2>
<p>You can do this by modifying the <em>spark-defaults.conf</em> file, located in Spark's configuration directory (<em>$SPARK_HOME/conf</em>).</p>
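<p>As a sketch, entries in <em>spark-defaults.conf</em> are whitespace-separated key-value pairs; the properties below are illustrative examples, not recommended values:</p>
<pre>
# $SPARK_HOME/conf/spark-defaults.conf
# Illustrative entries -- each line is "property value".
spark.sql.shuffle.partitions   64
spark.serializer               org.apache.spark.serializer.KryoSerializer
spark.executor.memory          4g
</pre>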
<p>When you're working in <em>Databricks</em>, it's even easier: set the properties in the cluster configuration, under Advanced options, in the Spark tab. Refer to the red sequential numbers in Figure 1 for a step-by-step walkthrough.</p>
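<p>One way to confirm the property survived a cluster restart is to read it back in a fresh session. In this sketch the property key is assumed to match whatever you set in the cluster's Spark config; <em>spark</em> is the session Databricks predefines in notebooks:</p>
<pre>
# PySpark sketch: verify a cluster-level property in a new session.
# Assumes "spark.sql.shuffle.partitions" was set in the cluster's
# Spark config (or spark-defaults.conf) -- adjust to your own key.
print(spark.conf.get("spark.sql.shuffle.partitions"))
</pre>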