It Took Me 10 Years to Understand Entropy, Here is What I Learned.
<p>Entropy was originally introduced by Clausius in the early 1850s to describe energy loss in <strong>irreversible processes</strong>, and it turned out to be very useful for predicting the spontaneous evolution of systems (e.g., chemical reactions, phase transitions, etc.). At the time, however, entropy was more of an abstract mathematical construct: there was no formalism explaining what it <em>fundamentally</em> represents. It was in 1877 that Boltzmann, the founder of statistical thermodynamics, proposed an elegant formalization of entropy. Put simply, he defined the entropy <strong><em>S</em></strong> of a system as a measure of the number of possible microscopic arrangements (microstates) <strong>Ω</strong> that are compatible with the system’s observed macroscopic condition (its macrostate), e.g., temperature, pressure, and energy:</p>
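<p>In its standard modern form, with <strong><em>k<sub>B</sub></em></strong> denoting Boltzmann’s constant (≈ 1.38 × 10⁻²³ J/K), the relation reads:</p>

$$S = k_B \ln \Omega$$

<p>In other words, the more microscopic arrangements <strong>Ω</strong> are compatible with what we observe macroscopically, the larger the entropy. The logarithm also makes entropy additive: combining two independent systems multiplies their microstate counts (Ω = Ω₁Ω₂) but adds their entropies (S = S₁ + S₂).</p>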