How to unit test PySpark programs in a Databricks notebook?
<p>Unit testing is a software development process in which the smallest testable parts of an application, called units, are individually and independently scrutinized for proper operation.</p>
<p><img alt="" src="https://miro.medium.com/v2/resize:fit:640/1*_9T-LtDnnzKkp82y4jbLtQ.jpeg" style="height:426px; width:640px" /></p>
<p>Photo by <a href="https://www.pexels.com/@startup-stock-photos?utm_content=attributionCopyText&utm_medium=referral&utm_source=pexels" rel="noopener ugc nofollow" target="_blank">Startup Stock Photos</a> from <a href="https://www.pexels.com/photo/man-wearing-black-and-white-stripe-shirt-looking-at-white-printer-papers-on-the-wall-212286/?utm_content=attributionCopyText&utm_medium=referral&utm_source=pexels" rel="noopener ugc nofollow" target="_blank">Pexels</a></p>
<p><a href="https://github.com/microsoft/nutter" rel="noopener ugc nofollow" target="_blank">Nutter framework</a> from Microsoft makes it easy to create unit test cases in Databricks notebook.</p>
<p>It's simple to use and works directly in notebooks.</p>
<p><strong>Step 1</strong>: Install the nutter library from PyPI (in Databricks, run <code>%pip install nutter</code> in a notebook cell).</p>
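<p>Nutter discovers tests by naming convention: a fixture class exposes <code>run_&lt;name&gt;</code> methods that execute the code under test and matching <code>assertion_&lt;name&gt;</code> methods that verify the result. The toy harness below sketches that pattern using only the Python standard library so the idea is clear before installing anything; the class names <code>SimpleFixture</code> and <code>MyDataTests</code> are hypothetical, while in real Nutter you would instead inherit from <code>NutterFixture</code> and call <code>execute_tests()</code> on it.</p>

```python
# Toy illustration of Nutter's run_/assertion_ naming convention.
# Real Nutter provides NutterFixture (from runtime.nutterfixture);
# this standalone sketch only mimics the discovery pattern.

class SimpleFixture:
    def execute_tests(self):
        results = {}
        # Discover every test by its "run_" prefix
        names = [n[len("run_"):] for n in dir(self) if n.startswith("run_")]
        for name in sorted(names):
            getattr(self, "run_" + name)()            # arrange / act
            try:
                getattr(self, "assertion_" + name)()  # assert
                results[name] = "PASSED"
            except AssertionError:
                results[name] = "FAILED"
        return results


class MyDataTests(SimpleFixture):
    def run_double_values(self):
        # In a real notebook test this would be a PySpark transformation
        self.output = [x * 2 for x in [1, 2, 3]]

    def assertion_double_values(self):
        assert self.output == [2, 4, 6]


print(MyDataTests().execute_tests())  # {'double_values': 'PASSED'}
```

<p>With the actual library, the same test would subclass <code>NutterFixture</code> and the framework would collect and report the results for you.</p>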
<p><a href="https://ganeshchandrasekaran.com/how-to-unit-test-pyspark-programs-in-databricks-notebook-c148ec060348"><strong>Read More</strong></a></p>