Many fairly random and chaotic looking things are often not surprising at all.
https://x.com/keenanisalive/status/1866251675440460234
Take any sequence and count its unique values; the probability distribution (the counts, normalized) tells you how many bits per token you need to store it losslessly. A log function fits for estimation, but so does the tail of a Poisson or normal distribution.
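A minimal sketch of that estimate in Python (the function name and example strings are mine, for illustration): the empirical distribution built from value counts gives the Shannon entropy, a lower bound on bits per token for lossless storage.

```python
from collections import Counter
from math import log2

def bits_per_token(seq):
    """Shannon entropy of the empirical distribution, in bits per token.

    This is the lossless-coding lower bound implied by the value counts.
    """
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(bits_per_token("abababab"))  # 1.0: two values, equally likely
print(bits_per_token("abcdefgh"))  # 3.0: eight unique values, log2(8)
```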
As for surprise, that comes down to how much you have to memorize to not be surprised. The same training data (for machine learning) can be compressed significantly, and if you can model the relations between the unique entities in the data, fairly random and chaotic looking things take on meaning, become controllable, and are nothing surprising at all.
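To make the "model the relations, lose the surprise" point concrete, here is a toy sketch (all names mine, assuming a first-order model): an alternating sequence looks like coin flips to a memoryless model, but is perfectly predictable once you condition on the previous token.

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy (bits) of a count distribution."""
    n = sum(counts.values())
    return -sum((c / n) * log2(c / n) for c in counts.values())

seq = "abababababababab"

# Memorize only the marginal token frequencies: 1 bit of surprise per token.
marginal = entropy(Counter(seq))

# Also model the relation to the previous token:
# H(next | prev) = sum over prev of p(prev) * H(next given that prev).
pairs = Counter(zip(seq, seq[1:]))
prevs = Counter(seq[:-1])
conditional = sum(
    (prevs[p] / (len(seq) - 1))
    * entropy(Counter({nxt: c for (pv, nxt), c in pairs.items() if pv == p}))
    for p in prevs
)

print(marginal)     # 1.0  -- looks like a fair coin
print(conditional)  # 0.0  -- given the previous token, nothing is surprising
```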
Temperature is the problem, because it does not directly relate to power or energy except where people are very careful and precise, and that care and precision is rare on the Internet. Anyone invoking entropy who wants to be precise needs to show how they measure and define energy and power, and the spatial and temporal data, in their system. None of this is really hard; it is just tedious and takes great care.
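For reference, the standard thermodynamic relations this paragraph is pointing at (textbook physics, not something from the linked thread): entropy only connects to energy through temperature, and to power through time, so both have to be pinned down before the units work out.

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
\quad\Rightarrow\quad
S = \int \frac{\delta Q_{\mathrm{rev}}}{T} \;\; [\mathrm{J/K}],
\qquad
P = \frac{dE}{dt} \;\; [\mathrm{W}]
```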