It Took Me 10 Years to Understand Entropy, Here is What I Learned.

Entropy was originally introduced by Clausius in the early 1850s to describe energy loss in **irreversible processes**, and it quickly proved useful for predicting the spontaneous evolution of systems (e.g., chemical reactions, phase transitions). At the time, however, it remained an abstract mathematical artifact: there was no formalism explaining what entropy *fundamentally* represents. It was in 1877 that Boltzmann, the founder of statistical thermodynamics, proposed an elegant formalization. Put simply, he defined the entropy ***S*** of a system as a measure of the number of possible microscopic arrangements (microstates) **Ω** compatible with the system's macroscopic condition (the observed macrostate), e.g., its temperature, pressure, and energy:
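$$S = k_B \ln \Omega$$

where $k_B$ is the Boltzmann constant. The more microstates $\Omega$ are compatible with the observed macrostate, the higher the entropy $S$.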