Entropy was originally introduced by Clausius in the early 1850s to describe energy loss in irreversible processes, and it proved very useful for predicting the spontaneous evolution of systems (e.g., chemical reactions, phase transitions). At the time, however, it remained something of an abstract mathematical construct: there was no formalism explaining what entropy fundamentally represents. It was in 1877 that Boltzmann, the founder of statistical thermodynamics, proposed an elegant formalization of entropy. Put simply, he defined the entropy S as a measure of the number of possible microscopic arrangements (microstates) Ω of a system that are consistent with its observed macroscopic condition (macrostate), e.g., its temperature, pressure, or energy: S = k_B ln Ω, where k_B is the Boltzmann constant.
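To make Boltzmann's definition concrete, here is a minimal Python sketch (an illustration, not code from the article) that evaluates S = k_B ln Ω for a toy system where the microstates can be counted directly:

```python
import math

# Boltzmann constant in J/K (CODATA 2018 value)
K_B = 1.380649e-23

def boltzmann_entropy(omega: int) -> float:
    """Entropy S = k_B * ln(Omega) for a macrostate with Omega
    equally likely microstates."""
    if omega < 1:
        raise ValueError("Omega must be a positive integer")
    return K_B * math.log(omega)

# Toy example: 10 coins, each heads or tails, give
# Omega = 2**10 = 1024 possible microscopic arrangements.
S = boltzmann_entropy(2 ** 10)
```

Note that a macrostate compatible with a single microstate (Ω = 1) has zero entropy, since ln 1 = 0; entropy only grows as more microscopic arrangements become consistent with what we observe macroscopically.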