This project develops the mathematical background of entropy in statistical mechanics, information theory, and thermodynamics. Rudolf Clausius coined the term in 1865, together with a mathematical framework relating entropy in thermodynamic systems to energy loss. In 1877, Ludwig Boltzmann gave a statistical-mechanical formulation of entropy in terms of the number of internal microstates consistent with a macrostate of a thermodynamic system. It was not until 1948 that Claude Shannon introduced information entropy to quantify the statistical nature of information transfer. We develop entropy in each of these settings and discuss an application to the Kraft inequality.