The Shannon entropy:
H = −Σ_i p_i log(p_i)
often decreases when you learn something. If you pick a card at random from a normal deck, H is log(52), but if you learn it’s diamonds, H becomes log(13).
But a decrease isn’t guaranteed. I read this example in Peres’s QM textbook (attributed to Uffink):
If your key has a 9/10 chance of being in your pocket and a 1/1000 chance of being in each of 100 other places, H ≈ 0.7856 nats.
If you then check your pocket and find it’s not there, the remaining 100 places become equally likely, so H jumps to ln(100) ≈ 4.605 nats.
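A quick numeric sanity check of both examples (a sketch, computing entropy in nats):

```python
import math

def entropy(ps):
    """Shannon entropy in nats: H = -sum p_i ln(p_i)."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Card example: uniform over 52 cards, then over the 13 diamonds.
print(entropy([1/52] * 52))   # ln(52) ≈ 3.951
print(entropy([1/13] * 13))   # ln(13) ≈ 2.565

# Key example: 9/10 in the pocket, 1/1000 in each of 100 other places.
prior = [9/10] + [1/1000] * 100
print(entropy(prior))         # ≈ 0.7856 nats

# Pocket turns out empty: renormalize over the 100 remaining places.
posterior = [1/100] * 100
print(entropy(posterior))     # ln(100) ≈ 4.605 nats
```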
@robryk Yes, Peres mentions that.