Here's a quote from some programming book:
Every time your code references global data, it ties itself to the other components that share that data. Even globals that you intend only to read can lead to trouble (for example, if you suddenly need to change your code to be multithreaded). etc etc...
Now, please take a look at the reasoning of pages 41 and 42 of this paper:
https://github.com/Alex-Linhares/FARGonautica/blob/master/Literature/Chess-Capyblanca-2014-Linhares-Information%20Sciences.pdf
Why are we using Temperature as the basis for decision-making? Even though temperature is a wonderful metaphor, with many historical reasons for its adoption, can't we try to move to entropy? That is, can't we try out entropy measures based on the unhappiness and the relevance of particular structures, instead of having a big-brotheresque variable T that everyone has to check in order to make a decision?
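To make the proposal concrete, here is a minimal sketch of one possible entropy measure. It treats each structure's unhappiness as a weight, normalizes the weights into a distribution, and takes its Shannon entropy. The function name, the use of unhappiness as the weight, and the normalization scheme are all my own illustrative assumptions, not anything from Capyblanca or the paper:

```python
import math

def structure_entropy(unhappiness_scores):
    """Hypothetical distributed measure (not from Capyblanca): normalize
    each structure's unhappiness into a probability distribution and take
    its Shannon entropy. High entropy means disorder is spread across many
    structures; low entropy means the trouble is concentrated in a few,
    which could then be targeted individually instead of via a global T."""
    total = sum(unhappiness_scores)
    if total == 0:
        return 0.0  # every structure is perfectly happy
    probs = [u / total for u in unhappiness_scores]
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

For example, `structure_entropy([1.0, 1.0, 1.0, 1.0])` gives 2.0 bits (disorder everywhere), while `structure_entropy([1.0, 0.0, 0.0, 0.0])` gives 0.0 (one clearly identifiable troublemaker).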
Why would it be interesting?
- Because the system becomes more distributed and more psychologically (and neuroscientifically) plausible.
- Because in cases such as the "abc:abd xyz:?" snag, as T goes up the entire construction is up for grabs and open to reconfiguration; perhaps some structures can, and should, remain intact, and only the very unhappy objects should have a high probability of being destroyed.
- Because it preserves the parallel terraced scan, and it preserves the thermodynamics metaphor (the brain is, after all, using energy).
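The second bullet can be sketched in a few lines. Below, a Copycat-style global rule gives every structure the same destruction probability driven by T, while a hypothetical local rule derives each structure's risk from its own unhappiness and relevance. The function names, the logistic squashing, and the `beta` parameter are illustrative assumptions of mine, not part of any FARG implementation:

```python
import math

def p_destroy_global(temperature):
    """Global-T rule: one temperature in [0, 100] that every structure
    consults; at high T even well-built structures are up for grabs."""
    return temperature / 100.0

def p_destroy_local(unhappiness, relevance, beta=1.0):
    """Hypothetical local rule: a structure's own unhappiness raises its
    destruction probability, while its relevance protects it. Logistic
    squashing keeps the result in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-beta * (unhappiness - relevance)))

# In the "abc:abd xyz:?" snag, a global T near 100 puts everything at risk,
# whereas the local rule spares the happy, relevant structures:
structures = {
    "a->x bridge (happy, relevant)":   dict(unhappiness=0.5, relevance=4.0),
    "c->z mapping (the unhappy snag)": dict(unhappiness=5.0, relevance=0.5),
}
T = 90.0
for name, s in structures.items():
    print(f"{name}: global risk {p_destroy_global(T):.2f}, "
          f"local risk {p_destroy_local(**s):.2f}")
```

Under the global rule both structures face the same 0.90 risk; under the local rule only the snagged mapping is likely to be torn down, while the rest of the construction remains intact.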