Something big happened this weekend. Everyone is talking about it, wondering what the implications will be.

That's right: I finished writing my new book "What is Entropy?" It's just 120 pages long. It has lots of short sections, mostly one page each, each based on a tweet. This is just a draft, and I'm still fixing lots of typos and other mistakes. So grab a copy - and if you catch errors, please let me know, either here or on my blog!

It is not a pop book: it's an introduction that assumes you know calculus. But it's about a lot of big, bold concepts, and I try to really get to the bottom of them:

• information
• Shannon entropy and Gibbs entropy
• the principle of maximum entropy
• the Boltzmann distribution
• temperature and coolness
• the relation between entropy, expected energy and temperature
• the equipartition theorem
• the partition function
• the relation between entropy, free energy and expected energy
• the entropy of a classical harmonic oscillator
• the entropy of a classical particle in a box
• the entropy of a classical ideal gas

I learned a lot by trying to explain in words what people often say only in equations.

johncarlosbaez.wordpress.com/2

@johncarlosbaez Very nice read. I got a small revelation from reading the first ~20 pages:
The entropy of a system is isomorphic to its storage capacity if you decide to use it as a hard disk (that's why both are measured in bits), e.g. a fair coin has 1 bit of entropy <=> the state of a coin can be used to store exactly 1 bit of info.
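The fair-coin case is easy to check numerically. A minimal sketch (not from the thread; `shannon_entropy` is a hypothetical helper name):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely outcomes -> exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```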


@abuseofnotation @johncarlosbaez

That's only true in the limit in which you have many such systems: then you can store entropy × count bits of information in them (or in the limit of the system being composed of a large number of "independent" parts). For example, the entropy of a pair of tosses of two coins that each come up heads 1/9 of the time is slightly larger than 1 bit, and yet there's no way to store the result of a fair coin toss in that system (because the most likely outcome there has probability ~0.79).
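The numbers in this counterexample are easy to verify. A quick sketch (assuming independent tosses, so entropies add):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One biased coin: heads with probability 1/9.
p = 1 / 9
h_pair = 2 * H([p, 1 - p])  # entropy of two independent tosses
print(h_pair)  # ~1.006 bits, slightly more than 1

# Yet the most likely outcome (both tails) has probability (8/9)^2 ~ 0.79,
# so no two disjoint events can each get probability 1/2, and a fair
# coin flip can't be stored losslessly in a single copy of this system.
print((8 / 9) ** 2)
```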

@robryk @johncarlosbaez Thanks, I was overlooking the fact that "storage bits" don't have the concept of uncertainty baked into them.

Now I wonder if there is a way to map the "uncertain bits" to "storage bits", e.g. in TCP/IP, when there is more uncertainty, some packets are invalid and less info is transferred.

@abuseofnotation @johncarlosbaez

I think you'll want to learn about channels, channel capacity, and the noisy-channel coding theorem (en.wikipedia.org/wiki/Noisy-ch, I learned it from inference.org.uk/itprnn/book.p). In short, data storage is the same as data transmission (except that you transmit over time). So, the storage process is something that accepts an input and, for each possible input value, produces some distribution over output values. That's a noisy channel, and one can talk about the mutual information[1] between its input and output.
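A small illustration of the channel view (my own sketch, not from the thread): for a binary symmetric channel that flips each bit with probability f, the capacity — the maximum mutual information over input distributions — is 1 - H(f), achieved by a uniform input.

```python
import math

def H2(p):
    """Binary entropy function in bits."""
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Binary symmetric channel: each stored/transmitted bit flips with prob f.
f = 0.1
capacity = 1 - H2(f)
print(capacity)  # ~0.531 bits per channel use
```

So storing one reliable bit through this channel costs, asymptotically, about 1/0.531 ≈ 1.88 raw bits.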

But, you can also think of "storage bits" that have a probability distribution: you can think of ones that _have to_ have something from that distribution assigned to them (so if you want to store something from another distribution in them, you need to map it to this distribution). In that setup:
a) my counterexample from the previous reply shows that you aren't necessarily able to store a lower-entropy variable in higher-entropy storage,
b) _if you take the limit of having many independent copies of some storage setup_, the asymptotic equipartition property[2] implies that you actually can.

[1] "How much the entropy of the input is reduced on average when we learn what the output is"

[2] Roughly: in the limit of long sequences of i.i.d. variables, nearly all of the probability weight in their joint distribution comes from outcomes of nearly the same probability. It's the basis of many information theory results (including some very counterintuitive ones), but sadly it's hard to weaken the independence requirement.
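This concentration is visible even in a quick simulation (a sketch under my own assumptions, reusing the biased coin from earlier in the thread): for long i.i.d. sequences, the per-symbol "surprisal" -log2 P(sequence)/n clusters tightly around the entropy per toss.

```python
import math
import random

random.seed(0)
p = 1 / 9       # P(heads), as in the pair-of-coins example above
n = 10_000      # sequence length
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # entropy per toss

# Sample a few long i.i.d. sequences; each one's per-symbol surprisal
# should land close to H, even though individual tosses vary a lot.
surprisals = []
for _ in range(3):
    heads = sum(random.random() < p for _ in range(n))
    log_prob = heads * math.log2(p) + (n - heads) * math.log2(1 - p)
    surprisals.append(-log_prob / n)

print([round(s, 4) for s in surprisals], "vs H =", round(H, 4))
```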
