
Self-Organization & Entropy - The Terrible Twins

by Chris Lucas

"Order is not pressure which is imposed on society from without,
but an equilibrium which is set up from within."

José Ortega y Gasset, Mirabeau and Politics, 1927

"Freedom and constraint are two aspects of the same necessity,
which is to be what one is and no other."

Antoine de Saint-Exupéry, La Citadelle, 1948, ch. 43

Introduction

Many people will have heard of the Second Law of Thermodynamics - the one that states that the Universe is forever running down towards a "Heat Death". It is based on the concept of entropy, which has several definitions: the inability of a system to do work; a measure of the disorder in a system; and, the one most often used nowadays, the tendency of a system to enter a more probable state, usually described as creating chaos from order. Here we will look at the opposite idea: that order, and not chaos, is the most probable state.

Probable States

So, which states exactly are probable? To give an example, suppose we have a pack of cards and shuffle it: are we then likely to deal a sequence of four cards that are all aces? No; in fact the theory of gambling is based on the idea that a shuffle will randomise the pack, so that the cards dealt will be in no special order. Four aces are so improbable that they would be expected to occur by chance only once in about 270 thousand such deals. Order thus has a low probability, and any change to a system (such as a shuffle) will be expected to reduce its order significantly.
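The arithmetic is easy to check: there are C(52,4) = 270,725 equally likely four-card deals from a shuffled pack, and only one of them contains all four aces. In Python (standard library only):

```python
from math import comb

# Probability that the four cards dealt from a shuffled 52-card deck
# are exactly the four aces, in any order: one favourable hand out of
# C(52,4) equally likely four-card hands.
p = 1 / comb(52, 4)
print(f"P(four aces) = {p:.3e}  (about 1 in {round(1 / p):,})")
```

Running this prints "about 1 in 270,725", matching the figure above.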

Order in Context

But what exactly is this order? Essentially, it is whatever we say it is: a regularity conforming to some human definition - four of a kind in this case (and the kind could be anything: value, colour, size, age, etc.). Our classification of the world imposes order in various forms, depending upon our viewpoint; we largely see what we want to see... This means that order is contextual: it depends upon the environment in which it occurs, being essentially an interaction between an observer and the system, a correlation between object and subject.

Entropy and Gases

Does this mean that entropy is a meaningless concept? Not quite: as a measure of change it has great value in science, but it is often misused as the only measure of a dynamic system. One originator of the idea, Ludwig Boltzmann, based his work on the kinetic theory of gases, in which all the molecules can move randomly (a form of shuffle). In those circumstances the system can be proved to run down: any original order will dissipate over time until the system is homogeneous and in equilibrium - the state of maximum disorder and unchanging evenness.
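This run-down is easy to demonstrate. The following minimal sketch (box size, particle count, step count and bin count are all arbitrary illustrative choices) starts every particle at one end of a one-dimensional box, lets each take random steps, and watches the coarse-grained entropy of the distribution climb towards the homogeneous maximum:

```python
import random
from math import log

# Boltzmann-style run-down, as a toy: particles begin bunched at one
# end of a 1-D box and take random +/-1 steps (a continual "shuffle").
# The coarse-grained entropy of their distribution over BINS equal
# cells rises towards the homogeneous maximum, log(BINS).
BOX, N, BINS, STEPS = 100, 5000, 10, 2000
random.seed(1)
pos = [random.randint(0, BOX // 10) for _ in range(N)]  # all start at one end

def coarse_entropy(positions):
    counts = [0] * BINS
    for x in positions:
        counts[min(x * BINS // BOX, BINS - 1)] += 1
    return -sum(c / N * log(c / N) for c in counts if c)

for t in range(STEPS + 1):
    if t % 500 == 0:
        print(f"step {t:4d}: entropy = {coarse_entropy(pos):.3f}"
              f" (max = {log(BINS):.3f})")
    # random step with reflecting walls at the box ends
    pos = [min(BOX - 1, max(0, x + random.choice((-1, 1)))) for x in pos]
```

The entropy starts near zero (all particles in one cell) and rises steadily: the shuffle destroys the initial order, exactly as the gas theory predicts.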

Beyond Ideal Gases

Does this then apply to solids also? To see that (in one sense) it does not, let us look at an iron ball bearing in a box. Do the iron atoms expand to fill the box evenly? Certainly not, yet a Boltzmann-style gas would do so rapidly. What is the difference here? Traditional entropy, by assuming that ideal gas properties always apply, ignores many of the other constraints (or boundary conditions) that apply to real systems. In this case the attractions between the atoms are much stronger than the random (thermal motion) forces pushing them apart, so the ball bearing retains its shape as a solid. Similarly, for man-made objects, the many constraints inherent in their fabrication and assembly impose limits on their degrees of freedom (mechanical, electrical, chemical, thermal, etc.), with the result that the freedom of their parts to move is largely abolished. It is the specific boundary conditions imposed on the system that restrict the state space of the constituents, and thus compel the organization that results.

For liquids we have an intermediate case: the weaker attractions here allow for some motion, but the atoms, when moving, drag neighbouring atoms along - the liquid flows. This brings us to an interesting feature of these three states of matter. For gases the motion of the molecules is chaotic (this follows from a simple analysis of their collisions); for solids we have an essentially static system (the atoms still vibrate chaotically, but the large-scale structure is fixed and determined). Liquids are a special case, and can be regarded as collections of molecules whose interaction regime changes as they move about. There is a combination of small-scale order (local attractions) and large-scale disorder (uncorrelated over distance), and the patterns that result (for example whirlpools) are emergent, not contained within the laws of electromagnetic interaction applicable to the chemistry.

Complexity of Information

Order can also be regarded as information, so we can classify the complexity of a system by how much information we need to describe it. If we do this we find that both solids and gases have low complexity (simple descriptions), yet fully describing a whirlpool would need a very extensive description, forever changing with time - liquids have a potentially high information content. The local interactions of liquid molecules give the liquid a dynamic structure which can cause the emergence of unexpected features. These features are not predicted by traditional entropy considerations; they are too improbable...
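One crude way to make "how much description is needed" measurable is to use compressed size as a proxy. The sketch below (all three patterns are invented toys) does this with zlib; note the caveat in the comments - raw compression charges pure noise for its long literal description, whereas the point above is that a gas also admits a short statistical summary ("uniform randomness"):

```python
import random
import zlib

# Compressed size as a rough proxy for descriptive complexity.
# Caveat: a gas (pure noise) scores HIGH here, because compressed size
# measures literal description length; its short description is the
# statistical one ("featureless randomness"), which zlib cannot give.
random.seed(0)
n = 10_000
solid = b"AB" * (n // 2)                                 # fixed repeating lattice
gas = bytes(random.randrange(256) for _ in range(n))     # featureless noise
# "Liquid": alternating stretches of order and surprise, a stand-in
# for small-scale order plus large-scale disorder.
liquid = b"".join(
    (b"AB" * 8) if random.random() < 0.5
    else bytes(random.randrange(256) for _ in range(16))
    for _ in range(n // 16)
)

for name, data in [("solid", solid), ("gas", gas), ("liquid", liquid)]:
    print(f"{name:7s}: {len(zlib.compress(data)):5d} bytes compressed")
```

The solid collapses to a few dozen bytes, the noise barely compresses at all, and the part-ordered "liquid" lands in between - a hint of how mixed order and disorder carries the richest structure.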

This discrepancy is perhaps best explained by noting that it is usual, in work on equilibrium systems, to simplify the terms and use only what is better known as the 'conditional entropy'. Yet entropy overall is conserved, and to complete the picture we need to add in the 'entropy of correlation', which relates to the information known about the system by the observer. As a system 'runs down' and becomes more disorganised, the knowledge held by the observer decreases, hence the conditional entropy increases (as tradition dictates); yet in self-organizing systems this 'run-down' does not happen, so we can have either a static entropy or a decreasing one. When that occurs, the complex state is the probable one and no discrepancy exists. In essence this is an empirical question, not a theoretical difficulty.
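In information-theoretic terms this is the decomposition H(X) = H(X|Y) + I(X;Y): what the observer's data Y reveals about the system X (the correlation term, the mutual information) plus what remains unknown (the conditional entropy) always sums to the total. A small Python illustration, with an invented joint sample:

```python
from collections import Counter
from math import log2

# Decomposition H(X) = H(X|Y) + I(X;Y) on a toy joint sample:
# X is the system's state, Y the observer's reading. The sample
# below is invented purely for illustration.
pairs = [("hot", "sun"), ("hot", "sun"), ("hot", "cloud"),
         ("cold", "cloud"), ("cold", "cloud"), ("cold", "sun")]

def H(events):
    """Shannon entropy (bits) of the empirical distribution."""
    c = Counter(events)
    n = len(events)
    return -sum(k / n * log2(k / n) for k in c.values())

Hx = H([x for x, _ in pairs])               # total uncertainty about X
Hx_given_y = H(pairs) - H([y for _, y in pairs])  # H(X|Y) = H(X,Y) - H(Y)
print(f"H(X) = {Hx:.3f}, H(X|Y) = {Hx_given_y:.3f},"
      f" I(X;Y) = {Hx - Hx_given_y:.3f}")
```

The conditional entropy alone understates what is going on; only with the correlation term added back does the book-keeping balance, which is the point made above.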

Self-Organization

But where have we seen self-organization before? In the field of Artificial Life, which studies the emergent features that result from the interactions of multiple agents following their own local laws. So, what determines which emergent properties occur? That is precisely the question we are trying to answer. The phenomenon of emergence we could call Extropy - the tendency of systems to create order from chaos, the opposite of Entropy. Generally this term isn't used; instead Self-Organization is the general term employed, with other terms like Autopoiesis and Homeokinetics used in some contexts.
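The bird-flocking example can be caricatured in a few lines. Below is a minimal sketch (not any published flocking model; all parameters are arbitrary): agents on a ring each copy the majority heading of their near neighbours, with a little noise, and local patches of agreement form and grow - flock-like alignment emerging from purely local rules:

```python
import random

# Minimal self-organization toy in the ALife spirit: agents on a ring
# each adopt the majority heading (+1 or -1) of their local
# neighbourhood, with occasional noisy flips. From a random start,
# aligned patches form and merge, so global alignment typically climbs
# from near 0 towards 1.
N, RADIUS, NOISE, STEPS = 100, 2, 0.02, 200
random.seed(2)
heading = [random.choice((-1, 1)) for _ in range(N)]

def alignment(h):
    """|mean heading|: 0 = total disorder, 1 = a single flock."""
    return abs(sum(h)) / len(h)

for t in range(STEPS + 1):
    if t % 40 == 0:
        print(f"step {t:3d}: alignment = {alignment(heading):.2f}")
    new = []
    for i in range(N):
        local = sum(heading[(i + d) % N] for d in range(-RADIUS, RADIUS + 1))
        h = 1 if local > 0 else -1 if local < 0 else heading[i]
        new.append(-h if random.random() < NOISE else h)  # noisy flip
    heading = new
```

Nothing in the rules mentions flocks; the order is emergent, which is exactly the phenomenon whose general laws we are after.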

Is this phenomenon widespread? Yes, it certainly is, stretching from the organisation of galactic superclusters, via planets and all forms of life (e.g. bird flocking), to inorganic chemistry and perhaps even atomic structure. Complexity Theory searches for the laws that apply at all scales - the inherent constraints on visible order.

Laws of Organization

Do such laws actually exist? Well, if the 2nd Law (as usually outlined) is to be believed, there should be no order at all: any order of the type with which we are familiar is far too improbable to have ever come into being by chance, even over the entire age of the Universe. A totally disordered system, as implied by the big bang, cannot create order except randomly (quantum fluctuation is usually invoked), yet the tendency is then for it immediately to disintegrate again! Nevertheless, as far as we can see, the Universe has persistent order at all scales - and possibly that order is increasing rather than decreasing, at least from our own viewpoint. There is currently a law relating matter and energy (Einstein's famous E=mc²), yet information is also fundamental in the Universe - so we seem to need a law incorporating all three.

This 4th Law (as it is sometimes called) would add creativity to the destruction of the 2nd, balancing the symmetry. Given any non-zero probability of new combinations of parts (e.g. in random chemical reactions), we can say that there will be a constant drift from a zero presence of these combinations in the system to a non-zero one. This is an innovative drive, which will continue until an equilibrium state is reached (if ever). Such novelty is, in essence, an increase in dimensionality: new (emergent) variables that can then be manipulated to explore an (expanded) state space. Note that state space expands continually as these innovative combinations (new building blocks) occur, thus maximum entropy also expands. If an existing form persists, this implies a corresponding increase in self-organization, i.e. a lowering of local system entropy.
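A toy simulation of this drift (everything below - the initial kinds, the combination rule, the probability - is invented for illustration) shows the count of distinct building blocks, and with it the reachable state space, ratcheting upward:

```python
import random

# Toy "innovative drive": whenever existing building blocks can
# combine with any non-zero probability, the number of distinct kinds
# drifts up from its starting value - the state space only expands.
random.seed(3)
kinds = {"a", "b", "c"}          # initial building blocks (arbitrary)
P_COMBINE = 0.3                  # chance a chosen pair actually combines

for step in range(1, 101):
    x, y = random.sample(sorted(kinds), 2)
    if random.random() < P_COMBINE:
        kinds.add("(" + x + "+" + y + ")")   # a new emergent combination
    if step % 25 == 0:
        print(f"step {step:3d}: {len(kinds)} distinct kinds")
```

The drift is one-way here because combinations, once formed, are never removed; in a real system persistence would itself need the self-organization discussed above.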

As the number of variables increases, our observation of the system necessarily becomes more selective, less knowledgeable. Shared information is the exchange of knowledge of such variable states between agents, so we can perhaps reformulate entropy in terms of this information exchange, bringing together both sides of the entropy equation and extending it to a multi-agent scenario, rather than the over-simple 'single isolated observer' of the usual formulation.

Far-From-Equilibrium Science

Many discussions of entropy assume near-equilibrium states, yet due to the constant innovation mentioned above we can show that the Universe overall is not close to equilibrium. Non-equilibrium dynamics relates not to steady-state systems (a simplified special case) but to systems undergoing change - systems on a transient (flow) either towards equilibrium or away from it. Which direction the system takes depends on the driving forces: strong energy input, for example, will force the system far from equilibrium. For such far-from-equilibrium systems complex behaviours can set in; the stresses on the system become high and, like environmental stress, can cause breakdowns and jumps in behaviour, with the system exploring all possible ways to reduce the conflict. In fact this situation is compatible with the 2nd Law, since in such (dissipative) systems the gradients encourage the system to self-organize to an ordered state - this actually increases the rate of entropy production and thus of stress reduction. It can be shown that the greater the energy flows in such systems, the greater the order (and information) generated - some of which is employed by living organisms to do work (exergy) in order to create (temporarily) higher-level 'material' structures, the set of chosen states perhaps being those which maximise entropy production (one candidate 4th Law).

Non-Ergodic Searches

The methods available to do this will depend on the flexibility and complexity of the system interconnections. Any system comprising a large number of parts allows a vast range of possible combinations. Most of those combinations will be disordered, yet many forms of order are also possible. For a random system, all of these ordered forms should appear, each with its relevant probability (as expected from an ergodic exploration of state space) - but is this what occurs? Animals should then occur equally often with one, two, three, four or more legs (or eyes, or even heads?). The same should apply to chemical compounds and galactic forms; it should be impossible for the same ordered forms to appear constantly to the exclusion of all others, yet that is what we see. It seems clear that largely unknown constraints restrict the valid forms to a narrow subset of those possible (occupying a small region of state space, in the jargon). In other words, stressed systems follow specific paths through the immense reaches of state space: a directed, not ergodic, walk.
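The contrast is easy to see in a hedged sketch over bit-string states (the constraint used here - at most two 1-bits - is an invented stand-in for the unknown physical constraints):

```python
import random

# Ergodic versus directed exploration of a bit-string state space.
# The free walk keeps finding new states; the constrained walk is
# confined to a narrow subset (strings with at most two 1-bits) and
# revisits the same few "ordered forms" again and again.
random.seed(4)
N_BITS, STEPS = 12, 5000

def walk(constrained):
    state = [0] * N_BITS
    seen = {tuple(state)}
    for _ in range(STEPS):
        i = random.randrange(N_BITS)
        state[i] ^= 1                 # flip one bit
        if constrained and sum(state) > 2:
            state[i] ^= 1             # move rejected: stay in the subset
        seen.add(tuple(state))
    return len(seen)

print(f"free (ergodic) walk visited {walk(False):4d} distinct states")
print(f"directed walk visited       {walk(True):4d} distinct states")
```

Of the 4,096 possible states the free walk samples thousands, while the directed walk can never leave its 79-state region - a small region of state space, in the jargon.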

Specialists may argue that they already understand why each of these behaviours occurs (citing natural selection, bonding energy or gravity, perhaps), yet these are just local explanations - reductionist and specific, similar to those in vogue before Newton's time to explain mechanical phenomena. The search is now on for the general laws that are applicable at all scales and allow prediction of form on a macro scale - something not currently possible. Research in Complexity, ALife and Boolean Networks, by using carefully controlled experiments (with well-understood local interactions), allows us to probe the vastness of state space and gain a better understanding of the likely structure of these unknown laws.
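As a concrete instance of such a controlled experiment, here is a minimal Kauffman-style random Boolean network (node count, wiring and rules below are arbitrary illustrative choices): despite the 2^N possible states, a trajectory quickly falls onto a short attractor cycle - order from arbitrary local rules:

```python
import random

# Random Boolean network sketch: N nodes, each reading K = 2 randomly
# chosen nodes through a random Boolean rule. We iterate from a random
# state until a state repeats, then report the transient length and
# the attractor cycle length.
random.seed(5)
N, K = 16, 2
inputs = [random.sample(range(N), K) for _ in range(N)]      # wiring
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    """Synchronous update: each node applies its rule to its 2 inputs."""
    return tuple(tables[i][2 * state[inputs[i][0]] + state[inputs[i][1]]]
                 for i in range(N))

state = tuple(random.randint(0, 1) for _ in range(N))
seen = {}                                  # state -> time first visited
while state not in seen:
    seen[state] = len(seen)
    state = step(state)
cycle = len(seen) - seen[state]
print(f"state space: {2 ** N} states; transient: {seen[state]} steps;"
      f" attractor cycle length: {cycle}")
```

Tiny attractors in a huge state space are the Boolean-network analogue of the few persistent forms nature keeps choosing, which is why such models are used to probe the structure of the unknown laws.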

Dissipative Systems

Most research in the sciences assumes that order requires what are called dissipative systems: energy must be expended (wasted) to create the visible order or information from the chaos. This assumption leaves the Second Law of Thermodynamics intact. Yet we also claim that energy is conserved (the First Law), so the energy used to create the order still exists in the Universe. Whether this wasted energy can ever be made 'useful' again is, I think, still an open question, despite the conventional rejection of the idea (which ignores the fact that these two laws are ontologically rather incompatible: the first assumes a static universe, the second a dynamic one). There are some indications that organization itself functions by concentrating energy, by lowering barriers - and of course technology does the same, transforming low-frequency, low-energy power into high-frequency, high-energy power (albeit with some losses). If this becomes universally possible (somehow) in the future, then the "heat death" (like the "big bang") may yet prove to be just another figment of man's inadequate imagination and tendency to dogma...

Page Version 4.83 January 2004 (Paper V1.6 November 2003, original March 1997)