I start with a 1-dimensional list of a few random numbers: {6, 5, 7, 8, 2, 1, 3, 4}. This seems like maximum entropy for a 1D list of numbers.
Then I make a second set, which is a copy of the first set rotated by one 'slot': {5, 7, 8, 2, 1, 3, 4, 6}. The relationship between these two is very ordered, and can be shown as a 2D network.
Then I can add more chaos to the picture with connections to a third set by using a completely different order, {4, 3, 8, 5, 2, 7, 1, 6}, and get a 3D network.
But exactly how much entropy did I remove and then re-install?
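To put some rough numbers on this, here is a minimal sketch of my setup, assuming (and this is only one way to count it) that the entropy of an unknown ordering is log2 of the number of equally likely orderings of 8 distinct values. The variable names are just for illustration:

```python
import math

# The three orderings described above
set_a = [6, 5, 7, 8, 2, 1, 3, 4]   # original "random" ordering
set_b = set_a[1:] + set_a[:1]      # set_a rotated by one slot -> [5, 7, 8, 2, 1, 3, 4, 6]
set_c = [4, 3, 8, 5, 2, 7, 1, 6]   # a completely different ordering

# If every ordering of 8 distinct numbers is equally likely, then picking
# one carries log2(8!) bits of entropy.
bits_per_ordering = math.log2(math.factorial(len(set_a)))
print(f"Entropy of one unknown ordering: {bits_per_ordering:.1f} bits")  # ~15.3

# Under this counting, set_b is fully determined by set_a (zero extra bits),
# while the unrelated set_c would carry roughly another 15.3 bits.
```

Of course, counting log2(8!) treats every ordering as equally likely, and I'm not sure that's the right model for the 2D and 3D networks formed by the connections.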