The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the "driving force" of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.

It has been 140 years since Clausius coined the term "entropy"; almost 50 years since Shannon developed the mathematical theory of "information" - subsequently renamed "entropy". In this book, the author advocates replacing "entropy" by "information", a term that has become widely used in many branches of science.

The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but as the fundamental cornerstone concept of thermodynamics, a role held until now by the term "entropy".

The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur-Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes whose "driving force" is analyzed in terms of information.
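For readers unfamiliar with the two quantities the blurb connects, the standard textbook forms (not quoted from the book itself) are Shannon's measure of information for a discrete distribution and the Sackur-Tetrode expression for the entropy of a monatomic ideal gas:

```latex
% Shannon's missing information (uncertainty) for probabilities p_1, ..., p_n
H = -\sum_{i=1}^{n} p_i \log p_i

% Sackur-Tetrode equation for N monatomic ideal-gas particles in volume V
% at temperature T, with thermal de Broglie wavelength \Lambda:
S = N k_B \left[ \ln\!\left( \frac{V}{N \Lambda^3} \right) + \frac{5}{2} \right],
\qquad
\Lambda = \frac{h}{\sqrt{2\pi m k_B T}}
```

The formal resemblance between the statistical-mechanical entropy and Shannon's H (both are averages of a logarithm over a probability distribution) is the basis for the book's argument that "entropy" can be re-derived and reinterpreted as missing information.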