Spark-related quickies

<p>The answer is No. A property set at runtime lasts only for the duration of the Spark application&rsquo;s run (in Databricks, until the cluster terminates). Once the cluster shuts down, the setting is lost and is not available when the cluster is restarted.</p> <h2>How do you overcome this constraint and set a configuration property permanently for all Spark applications run on a cluster?</h2> <p>You can do this by adding the property to the <em>spark-defaults.conf</em> file, located in the Spark configuration directory.</p> <p>When you&rsquo;re working in <em>Databricks</em>, it&rsquo;s even easier: open the cluster configuration, expand Advanced settings, and enter the properties under the Spark tab. Figure 1 shows these steps, marked with sequential numbers in red.</p>
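To make this concrete, here is a minimal sketch of what entries in <em>spark-defaults.conf</em> look like. The property names are real Spark settings; the values shown are illustrative placeholders, not recommendations:

```properties
# Example spark-defaults.conf entries (illustrative values)
# Format: one "key value" pair per line, whitespace-separated
spark.executor.memory            4g
spark.sql.shuffle.partitions     200
spark.serializer                 org.apache.spark.serializer.KryoSerializer
```

Properties defined here are picked up by every Spark application started on the cluster. By contrast, a property set at runtime with `spark.conf.set("key", "value")` applies only to the current SparkSession, which is exactly the transient behavior described above.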
Tags: quickies Spark