Automatic Differentiation with Python and C++ for Deep Learning
<p>This story explores automatic differentiation, a feature of modern Deep Learning frameworks that automatically calculates parameter gradients during the training loop. The story introduces the technique alongside practical examples in Python and C++.</p>
<p><img alt="" src="https://miro.medium.com/v2/resize:fit:538/1*tNiTg16kPYutjQhjwbdsOw.png" style="height:345px; width:538px" /></p>
<p><strong>Figure 1: </strong>Coding Autodiff in C++ with Eigen</p>
<h1>Roadmap</h1>
<ul>
<li>Automatic Differentiation: what it is, the motivation behind it, and more</li>
<li>Automatic Differentiation in Python with TensorFlow</li>
<li>Automatic Differentiation in C++ with Eigen</li>
<li>Conclusion</li>
</ul>
<h1>Automatic Differentiation</h1>
<p>Modern frameworks such as PyTorch or TensorFlow provide a feature called automatic differentiation [1], or <strong>autodiff</strong> for short. As the name suggests, autodiff automatically computes the derivatives of functions, relieving developers of the burden of implementing those derivatives by hand.</p>
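<p>Before turning to the framework examples below, the core idea can be sketched in a few lines of plain Python using forward-mode autodiff with dual numbers. This is only an illustrative sketch, not code from any framework; the <code>Dual</code> class and <code>deriv</code> helper are hypothetical names introduced here.</p>
<pre><code>
# Minimal forward-mode autodiff sketch using dual numbers.
# A Dual carries a value together with its derivative, and each
# arithmetic operation propagates both via the usual calculus rules.

class Dual:
    """A number paired with its derivative (a 'dual number')."""
    def __init__(self, val, dot=0.0):
        self.val = val   # function value
        self.dot = dot   # derivative value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u * v)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__


def deriv(f, x):
    """Evaluate f and its derivative at x in a single forward pass."""
    out = f(Dual(x, 1.0))  # seed the derivative of the input with 1
    return out.val, out.dot


# d/dx (3x^2 + 2x) = 6x + 2; at x = 2.0 the value is 16.0 and the slope 14.0
value, slope = deriv(lambda x: 3 * x * x + 2 * x, 2.0)
</code></pre>
<p>Frameworks such as TensorFlow and PyTorch instead use reverse-mode autodiff, which is far more efficient when a function has many inputs (the weights) and one output (the loss), but the principle of mechanically propagating derivative information through each operation is the same.</p>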