Does Entropy have a unit of measure?


I start with a 1-dimensional list of a few random numbers: {6, 5, 7, 8, 2, 1, 3, 4}. This seems like maximum entropy for a 1D list of numbers.
Then I make a second set, which is a copy of the first rotated by one 'slot': {5, 7, 8, 2, 1, 3, 4, 6}. The relationship between these two is very ordered and can be shown as a 2D network.
Then I can add more chaos to the picture with connections to a third set in a completely different order, {4, 3, 8, 5, 2, 7, 1, 6}, and get a 3D network.

But exactly how much entropy did I remove and then re-install?
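On the title question, a short sketch may help: Shannon entropy does carry a unit, and the unit is set by the base of the logarithm (base 2 gives bits, base e gives nats). Treating each of the 8 symbols as equally likely is my assumption here, just to make the numbers concrete:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)).
    The unit depends on the log base: base 2 -> bits, base e -> nats."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A 1D list of 8 distinct, equally likely symbols such as {6, 5, 7, 8, 2, 1, 3, 4}
# corresponds to a uniform distribution over 8 outcomes:
uniform = [1 / 8] * 8
print(shannon_entropy(uniform, base=2))       # 3.0 bits (= log2 of 8)
print(shannon_entropy(uniform, base=math.e))  # ~2.079 nats (= ln 8)
```

Thermodynamic entropy, by contrast, gets its units of joules per kelvin from the Boltzmann constant in S = k_B ln W; the information-theoretic quantity above is dimensionless apart from the choice of bits or nats.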


Less is more. Stay pure. Stay poor.
Topology is not my thing, but you likely need some type of loss function or definition of order. My familiarity with entropy is limited to decision trees, the second law of thermodynamics, and posits on the direction of time. In tree models, different data splits are explored to see whether they yield more homogeneity in the two (or more) terminal buckets: if I change the split, do I get more 1s in one bucket and more 0s in the other? Then entropy is recalculated (based on Shannon information, which I think may be different from Boltzmann's equation, but I may be wrong). However, entropy is a construct, so you can likely define order/complexity in multiple ways.
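The split-and-recalculate idea above can be sketched in a few lines. This is a minimal illustration, not any particular library's implementation; the toy labels and the split are made up for the example:

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_gain(parent, left, right):
    """Entropy reduction from splitting `parent` into `left` and `right`,
    weighting each child bucket by its share of the samples."""
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

# A maximally mixed bucket of 0s and 1s (entropy = 1 bit) split into
# two somewhat purer buckets:
parent = [0, 0, 0, 0, 1, 1, 1, 1]
left, right = [0, 0, 0, 1], [0, 1, 1, 1]
print(information_gain(parent, left, right))  # ~0.19 bits gained
```

A tree learner tries many candidate splits and keeps the one with the largest gain; a split that produced perfectly pure buckets ([0,0,0,0] and [1,1,1,1]) would gain the full 1 bit.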

Not sure this helps much.