
Cybernetics and Stochastic Systems

by Chris Lucas

"Compared to the analytical procedure of classical science with resolution into component elements and one-way or linear causality as basic category, the investigation of organized wholes of many variables requires new categories of interaction, transaction, organization, teleology..."

Ludwig von Bertalanffy, 1901-1972

"This new development has unbounded possibilities for good and for evil."

Norbert Wiener, Cybernetics, 1948

Introduction

Many of the concepts used within the complexity sciences derive originally from work done in the mid-20th century on Cybernetics (Wiener), based on the earlier work on Information Theory (Shannon), and on General Systems Theory (von Bertalanffy). These were all attempts to treat systems rigorously and quantitatively as an interdisciplinary science; in other words, they broke with the old view that specialist subjects required specialist ideas. Additionally, cybernetics is concerned with the control of systems, the issues of regulation and stability that also face us in the treatment of complex systems. Most recent work on cybernetics has related to mechanical systems (e.g. robots and bionics), but these ideas have always been more general, and here we shall restate their relevance to biological and social situations.

Systems

Systems are the lifeblood of complexity thinking. A system can be defined as a group of interacting parts functioning as a whole and distinguishable from its surroundings by recognizable boundaries. Systems have properties that are emergent, not intrinsically found within any of the component parts; these exist only at a higher level of description (e.g. an engine isn't a feature of valves, pistons, or any set of parts in isolation - the parts have to be suitably inter-connected).

The majority of systems treated in cybernetics are deterministic. This means that the next state of the system is fully specified by the combination of the system inputs, its current state and the transformations or changes allowed - we can use a look-up table to determine the result. This is a Newtonian way of operation and the one most familiar in science and technology generally. But another mode is more common in biological and social systems, and that is the random or stochastic mode. Here the options available are probabilistic: we need to toss dice to determine which option will be chosen (e.g. proteins in a cell meet at random, organisms encounter others randomly, the phone rings at random times). This is an indeterminate, statistical mode of operation.
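To make the contrast concrete, here is a minimal sketch in Python (the state and input names are invented for illustration): a deterministic look-up table fixes the next state completely, while the stochastic rule "tosses dice" between weighted options.

    import random

    # Deterministic mode: the next state is fully specified by
    # (current state, input) via a look-up table.
    deterministic_table = {
        ("idle", "start"): "running",
        ("running", "stop"): "idle",
        ("running", "fault"): "halted",
    }

    def next_state_deterministic(state, inp):
        return deterministic_table[(state, inp)]

    # Stochastic mode: the same pair only fixes the odds; chance decides the outcome.
    stochastic_table = {
        ("searching", "encounter"): {"feeding": 0.6, "fleeing": 0.4},
    }

    def next_state_stochastic(state, inp):
        options = stochastic_table[(state, inp)]
        return random.choices(list(options), weights=list(options.values()))[0]

    print(next_state_deterministic("idle", "start"))        # always 'running'
    print(next_state_stochastic("searching", "encounter"))  # 'feeding' or 'fleeing', by chance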

Cybernetics

Cybernetics is the science of effective organization, of control and communication in animals and machines. It is the art of steersmanship, of regulation and stability. The concern here is with function, not construction, in providing regular and reproducible behaviour in the presence of disturbances. Here the emphasis is on families of solutions, ways of arranging matters that can apply to all forms of systems, whatever the material or design employed. It is the science of the black box, in which the how is irrelevant and only the what matters (similar in one way to behavioural thinking in psychology).

This science concerns the effects of inputs on outputs, but in the sense that the output state is desired to be constant or predictable - we wish the system to maintain an equilibrium state. It is applicable mostly to complex systems and to coupled systems, and uses the concepts of feedback and transformations (mappings from input to output) to effect the desired invariance or stability in the result. Multiple factors can operate here: multiple causes and multiple effects inter-relate in the operation of most cybernetic systems.

Variety and Constraint

We are concerned here with sets of possible options, rather than individual values - with the possible variety of conditions that are encountered by the system and the set of output states that are permitted. This variety can be reduced by constraints, leaving us with less information to deal with, fewer degrees of freedom and easier prediction. Lack of independence between variables will also reduce variety, since by affecting one variable we simultaneously affect others.
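A tiny illustration in Python (the variables are invented): two unconstrained binary variables give four possible joint states, while a constraint that ties them together halves the variety.

    from itertools import product

    # Two binary variables with no constraint: full variety.
    unconstrained = list(product([0, 1], repeat=2))
    print(len(unconstrained))  # 4 possible joint states (2 bits of variety)

    # A constraint (the variables must agree) removes a degree of freedom.
    constrained = [(a, b) for a, b in unconstrained if a == b]
    print(len(constrained))    # 2 possible joint states (1 bit of variety)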

This idea of correlation between parts of the system is crucial in complexity studies, since the systems we study there generally have much wider interconnections than those in artificial systems. Nevertheless, many sub-systems in machines do show this interdependence, and it is due to these constraints that we are able to treat parts of the system as independent from each other and design them as separate modules to be assembled into a full machine later (e.g. in aircraft construction).

Information

Information can be defined as the opposite of uncertainty: the more we know, the better we are able to act. It is often measured in bits, similarly to entropy, using the logarithm (base 2) of the number of states known. This figure measures the channel capacity needed to transmit the information within a system (whether a telephone line or a control system - both are equivalent). Given sufficient capacity a perfect transmission can be made, although we may need to resort to encoding and decoding techniques to overcome distortions and uncertainties due to interference or noise.
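A minimal sketch of this measure in Python: for N equally likely states, the information carried by one selection is log2(N) bits, which is the capacity needed to transmit that selection.

    import math

    def bits(num_states):
        """Information carried by one selection among equally likely states."""
        return math.log2(num_states)

    print(bits(2))   # 1.0 bit  (a coin toss)
    print(bits(8))   # 3.0 bits (one state out of eight)
    print(bits(26))  # ~4.7 bits (one letter, if all letters were equally likely)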

One important finding in information theory, of relevance to the brain, is that a single channel can contain multiple messages without mutual interference. This is related to the finding in Fourier analysis that any complex signal is made up of a number of simple sine waves; conversely, we can add multiple signals and transmit them together (as used in telephony multiplex systems), separating them again at the other end. In brain terms we can see that a single neuron need not deal with only one signal but, given enough bandwidth, can process multiple ideas at any one time. Similarly, a signal too complex for one neuron can be distributed amongst many, each dealing with one feature (harmonic) of the signal.
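A sketch of this multiplexing idea in Python using NumPy (the frequencies and detection threshold are arbitrary choices for illustration): two sine-wave "messages" are added onto one channel, and the Fourier transform at the receiving end separates them again.

    import numpy as np

    fs = 1000                                # sampling rate (samples per second)
    t = np.arange(fs) / fs                   # one second of samples
    signal_a = np.sin(2 * np.pi * 50 * t)    # 50 Hz "message"
    signal_b = np.sin(2 * np.pi * 120 * t)   # 120 Hz "message"
    channel = signal_a + signal_b            # both messages share one channel

    # The spectrum shows two clear peaks, one per message.
    spectrum = np.abs(np.fft.rfft(channel))
    freqs = np.fft.rfftfreq(len(channel), d=1/fs)
    peaks = freqs[spectrum > 100]            # a simple threshold picks out the peaks
    print(peaks)                             # [ 50. 120.]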

Transitions

The transmission line, in its cybernetic manifestation, takes the form of a transformation: a transition table that takes the input and indicates, for each possible input state, what the output will be. These are usually deterministic rule tables, with inputs from both the environment and a controller. This table is equivalent to those used in Game Theory to indicate the outcome between two players in a game; here the conflicting inputs are the disturbance of the environment and the counteraction of the controller.

We must be aware that inputs are not necessarily one-dimensional sets of information. We can have many different inputs simultaneously; indeed this is very common outside simplified human machines. These vectors (sets of inputs) may also give rise to output vectors, sets of conditions that have to be simultaneously met if we are to have adequate regulation of the system.

Homeostasis

If we wish to maintain a constant output state, then the controller must oppose any changes occasioned by a disturbance. If this is done then we have a stable system, which we call a state of homeostasis or autopoiesis. Regulation in this way is equivalent to blocking the information contained by the disturbance. If no information can get through then the system must be impervious to perturbation, and this is the meaning of stability.

It is found that in order to do this the controller has to have the same effective variety (options available) as that contained in the possible disturbances to the system. In other words, if the input to the system can have 10 states, then the regulator must possess 10 possible states also. This finding is called 'The Law of Requisite Variety' (Ashby).
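A toy demonstration of the law in Python (the outcome table is invented): with as many responses as disturbances, the regulator can always pick a response that holds the outcome at 'OK'; remove one response and some disturbances can no longer be blocked.

    # Outcome table: outcome[disturbance][response] -> resulting output state.
    outcome = {
        "d1": {"r1": "OK",  "r2": "bad", "r3": "bad"},
        "d2": {"r1": "bad", "r2": "OK",  "r3": "bad"},
        "d3": {"r1": "bad", "r2": "bad", "r3": "OK"},
    }

    def regulate(disturbance, responses):
        """Return a response keeping the outcome 'OK', or None if impossible."""
        for r in responses:
            if outcome[disturbance][r] == "OK":
                return r
        return None

    full = ["r1", "r2", "r3"]    # regulator variety matches the disturbance variety
    reduced = ["r1", "r2"]       # one response short

    print([regulate(d, full) for d in outcome])     # ['r1', 'r2', 'r3'] - always regulated
    print([regulate(d, reduced) for d in outcome])  # ['r1', 'r2', None] - 'd3' gets through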

Control

In order to oppose the input effects we must know what they are. We can measure them directly at the input to the system, calculate their effect and then implement a correction, but this assumes we can delay their effect until we are ready to act. This is rarely possible in real systems as we usually have no advance notice of the impending changes. It also requires that we monitor every possible input or disturbance type - missing just one may lead to failure.

Monitoring the output instead (the required state) is more common, but this has the disadvantage that the output must already have altered before we can see a need for correction. Thus we cannot have total regulation; we must allow some deviation due to the delay between disturbance and action. This has implications for socially regulated systems, in that totalitarian control is a myth: only by blocking all dissidence (disturbance) can a constant output state be maintained, yet this also blocks the channel by which the information gets to the censor, and is thus self-defeating.

Feedback

The control of a system requires getting information from the output back to the input, and this is called feedback. This involves the replacement of the open, linear chain of cause and effect familiar in most science by a circular causality, a closed loop that implies the merging of causes and effects. This feedback comes in two forms. The one generally used in cybernetics is negative feedback, which acts to oppose the input. For example, if the external temperature rises (the disturbance) the system will activate a cooling system to maintain the refrigerator temperature (the required output).
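In the spirit of the refrigerator example, a minimal negative-feedback sketch in Python (the gain and disturbance values are invented): the correction acts in proportion to the deviation and so opposes the external warming.

    # Negative feedback: the correction opposes the deviation from the set point.
    set_point = 4.0          # desired fridge temperature (deg C)
    temp = 4.0               # current temperature
    gain = 0.5               # strength of the corrective (cooling) action
    external_warming = 0.3   # the disturbance added each time step

    for step in range(20):
        temp += external_warming   # disturbance pushes the output up
        error = temp - set_point   # deviation from the desired state
        temp -= gain * error       # feedback acts against the deviation

    print(round(temp, 2))  # ~4.3: a small steady offset remains, echoing the
                           # earlier point that delayed regulation is never total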

Positive feedback, on the other hand, acts to reinforce the input: it amplifies it, giving a greatly increased output for any input change. Often this is undesirable and is countered by negative feedback measures, leading in complex systems to a mix of feedback influences. However this form does have its uses in ensuring a fast transition between an unwanted state and a desired one. This is seen in evolution, where fitnesses operate in this way: success breeds more success. The interactions of these types of feedback lead to self-limiting systems, and often to cycles and oscillations in nature.

Stability

Stability is closely connected with invariance, the idea that there is some property of the system that remains unchanged. For most systems this ability to maintain a certain state is limited: only for a certain range of disturbances can the control mechanism be guaranteed to be effective. Generally this will be over the range of conditions usually encountered; freak conditions may still cause system failure.

This concept reflects the complexity idea of attractors, and the stability of a system is the extent of the basin of attraction present. In other words, any disturbance that takes the system out of the basin will result in instability: the system will move to a new state. This may be another, perhaps unwanted, equilibrium state (e.g. dead) or a chaotic state (say a random oscillation).
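A small sketch of the basin idea in Python, using the simple map x -> x squared as a stand-in system (an assumption for illustration, not a model of anything in particular): starting points inside the basin settle onto the attractor at zero, while a disturbance past the boundary sends the system into a quite different regime.

    def iterate(x, steps=50):
        """Repeatedly apply the map x -> x**2 (attractor at 0, basin |x| < 1)."""
        for _ in range(steps):
            x = x * x
            if x > 1e6:             # treat runaway growth as leaving the basin
                return "diverged"
        return round(x, 6)

    print(iterate(0.9))  # 0.0      - inside the basin, the state settles to the attractor
    print(iterate(1.1))  # diverged - a disturbance past the boundary destabilises it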

Design

We have seen how information (variety) relates to regulation potential, and we can extend this idea further. The design of any system relates to the information used, and this is equivalent to the amount of selection employed. Thus we can say that the regulator selects from the available information states to produce one output. Selection is also familiar from evolution, so we can also say that genes cause a selection from a wider range of possibilities, creating a specific organism - genes thus form part of a cybernetic control system. Similarly, we (or a machine) select from other possibilities when we design any system. This is equivalent to selecting the attractor basin that the system will follow, so the homeostatic state of any system can be said to be a stable attractor.

In very complex systems the search space is vast, so such selection would seem a slow process. Yet we find that by using binary search techniques, equivalent to the schemas (Holland) used in GA studies, a very fast search is possible - providing the variables are reducible (independent). This is often the case, hence the success of much of science, yet we must question it when we talk of organic systems. In genes and societies the effect of one part often constrains the effect of another; for example, in co-evolutionary systems (and most of nature operates in this form) a choice by one organism restricts the choices of another (e.g. my selfish behaviour will remove from you the option of co-operating, in any of the thousands of ways we might choose to do so). Thus fitness space is canalized: we create constraints which prevent free choice. These constraints do have the fortuitous feature that they make prediction easier, reducing the control data needed, as seen earlier.
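A loose illustration in Python of why reducibility makes the search fast (the fitness weights are invented, and this is a per-bit shortcut rather than Holland's schema mechanism itself): when each binary variable contributes independently, eight one-bit decisions replace an exhaustive search over 2 to the power 8 combinations.

    # Hypothetical fitness: each bit contributes independently (a reducible problem).
    weights = [3, -1, 4, 2, -5, 1, 6, -2]

    def fitness(bits):
        return sum(w * b for w, b in zip(weights, bits))

    # Exhaustive search would test 2**8 = 256 combinations.
    # With independent variables, 8 one-bit decisions suffice.
    solution = [1 if w > 0 else 0 for w in weights]
    print(solution, fitness(solution))  # [1, 0, 1, 1, 0, 1, 1, 0] 16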

Markov Chains

So far we have looked at deterministic systems, where every option was either chosen or not. Now we can move on to a more realistic mode, where each option has a probability of being chosen (e.g. a coin toss has a 50% probability of coming up heads and a 50% probability of tails). Such systems are generally avoided by human designers (as they are less predictable and slower to operate) but are ubiquitous in nature and society.

They are exemplified by what are called Markov Chains. Here the transition table or transformation is made up of a matrix of probabilities, so the trajectory of the system no longer follows one determinate path towards the attractor but can take one of many, reversing direction or going sideways as it changes. Thus the time to settle to a stable equilibrium state is longer and more uncertain. The main feature of these systems is that the probabilities are fixed, so that over a long time (or over multiple instances) the behaviour of the system can be analysed and predicted statistically. We can see such things, for example, in the proportion of males and females in the population: although we can't determine the sex of any child at conception, we can predict that about 50% of the total children will be female.
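A sketch of a two-state Markov chain in Python (the transition probabilities are invented): no individual step can be predicted, yet the long-run proportions settle to values fixed by the probability matrix (here roughly 71% and 29%).

    import random

    # Transition probabilities: P[current state][next state]
    P = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.5, "rainy": 0.5},
    }

    def step(state):
        nxt, probs = zip(*P[state].items())
        return random.choices(nxt, weights=probs)[0]

    state, counts = "sunny", {"sunny": 0, "rainy": 0}
    for _ in range(100_000):
        state = step(state)
        counts[state] += 1

    # Long-run fractions approach the stationary distribution (~0.71 / ~0.29),
    # even though no single step can be predicted in advance.
    print({s: round(c / 100_000, 2) for s, c in counts.items()})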

Evolving Probabilities

Going a step beyond such fixed probabilities takes us further into the realm of stochastic systems. Here we can apply chance to the structure of the transition table itself: entries can be added, changed or removed. This is the mutation familiar from evolution, and it means that the attractors available will change with time. This can be over the long term (species) but is more familiar to us in short-term effects (the immune system, learning, new technology). Here we generally do not have enough time to determine what the actual probabilities are; the system information is too complex and ever-changing for us to instigate a controlled regulatory regime.
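A sketch in Python of this further step (the mutation rule is an invented one for illustration): the probabilities in a transition row are themselves nudged at random and renormalised each generation, so the statistical behaviour of the system drifts instead of staying fixed.

    import random

    # A single transition row: probabilities of three possible next states.
    row = {"a": 0.5, "b": 0.3, "c": 0.2}

    def mutate(row, amount=0.05):
        """Randomly nudge one entry, then renormalise so the row still sums to 1."""
        key = random.choice(list(row))
        row[key] = max(0.0, row[key] + random.uniform(-amount, amount))
        total = sum(row.values())
        return {k: v / total for k, v in row.items()}

    for generation in range(1000):
        row = mutate(row)

    print({k: round(v, 2) for k, v in row.items()})  # the probabilities have drifted,
                                                     # so the attractor structure has too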

These are multiple-parameter systems, and because of the co-evolutionary nature of complex systems we need to consider competing regulators. This is the idea that the state required by you and the state required by me are different and incompatible, thus I try to regulate the system towards my end and you try to take my control input into account in compensation, to force the system your way. Generalising, we can see that here the control inputs of one person form the disturbance inputs of the other and vice versa, and in the limit our input is the total sum of all the effects of every other influence on the planet. Thus, by the Law of Requisite Variety, we cannot oppose the action of the disturbance; we cannot prevent change in our world. What we can do, perhaps, is to stabilise selected aspects of the system and to work together, so that we do not waste time and control capacity in trying to counteract each other's actions at the expense of more important matters.

Non-Equilibrium Systems

We are considering (in all of cybernetics) dissipative systems, that is, systems that take in energy to maintain their homeostatic position; they are in a higher-level equilibrium state (rather like a permanently excited atom). Most of the systems we have so far considered have had attractors, states that the system achieves, resting points in state space. We have assumed that the change between one stable state and the next is instantaneous (e.g. a thermostat is either heating or cooling). Yet this is only valid in some instances; in many cases the transition period between states is significant (e.g. metamorphosis in insects takes time). This period, called a transient, is a non-equilibrium state (equilibrium here refers simply to a constant state, not only to the lowest-energy state familiar from physics).

These transient states can be interrupted before they reach the attractor; they can be nudged into a different transient. In cybernetic terms, the disturbance changes again before the regulator can compensate for the last one. The output is thus unstable: it tries to maintain a fixed state but can oscillate around the desired value. These oscillations can escalate (e.g. if the disturbance reverses, so that the regulator is then temporarily providing positive feedback), and perhaps we will then see the traditional power-law distribution of error that characterises self-organizing systems.
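A sketch in Python of regulation lagging behind the disturbance (the gain and one-step delay are invented values): the correction is based on the error seen one step earlier, so rather than settling, the output overshoots and the oscillation grows.

    set_point = 0.0
    x = 1.0               # initial deviation from the desired state
    previous_error = 0.0  # the regulator only sees the error one step late
    gain = 1.5            # an over-strong, delayed correction

    history = []
    for step in range(12):
        error = x - set_point
        x -= gain * previous_error   # correction based on stale information
        previous_error = error
        history.append(round(x, 2))

    print(history)  # the output overshoots and the oscillation escalates
                    # instead of settling to the set point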

Organic Machines

Large systems increase variety (they have more parts), therefore greater control complexity is necessary for regulation. Simplistic (one-dimensional) control just cannot then work, due to the Law of Requisite Variety, and this is a cybernetic finding that has yet to filter through to those people trying to control human systems by using single variables (e.g. in economics, politics, sociology). We can achieve only a partial regulation (unless all variables are correlated and thus the real variation proves less than the channel capacity of our control system), so we must decide which disturbances we are to tackle and which are not important. In biological systems, if the wrong choice is made then the organism dies (e.g. playing and never eating is a bad choice), and this danger also affects our social systems: concentrating on trivia and ignoring what is important may result in the death of society as well.

Society contains a vast amount of information, and thus a large number of possible alternative states. These exceed by far the limited states we build into our inorganic deterministic machines, so if we are to use cybernetic techniques then we need to understand that our controller channel capacity is very much less than would be needed for stability in all aspects. We are also in the realm of probabilistic control and indeterminate operation, since biological and social applications are irreducible: the parts interrelate in such a way that attempts to control one variable will have unforeseen effects on many others.

Future Developments

So far the study of such co-evolutionary systems has concentrated on simple cases. Before the advent of cheap computers it was physically impossible to take on the study of true complexity. That era is now at an end and we are creating a world richer in information than any previously seen. We need to understand the behaviour of such systems, we need to be able to control them in such a way that we preserve what we think valuable from the past yet do not prevent the values of the future from coming into being. What these will be we cannot yet say, but few of us would be happy without the advances in our standard of living created in the last 50 years.

The replacement of deterministic cybernetic methods by statistical, multidimensional ones will reflect the emphasis in the new complexity thinking on co-operation rather than control. This more organic methodology is obviously a successful one: life has existed on Earth for some four billion years. All forms of life are stochastic, all are homeostatic. They have mastered the technique of using a little information (in the genes) to achieve a lot (Homo sapiens). Once we too master this technique, we will be in a position to take that step into effective control of our own destinies that has so far eluded us with our bureaucratic obsession with total control. Nature does not control; it is opportunist, escaping through the cracks in the system, and if we are honest we find that human progress has always operated in exactly the same way...

Conclusion

We have seen the equivalence of information, regulation and selection in the enabling of communication, homeostasis and design. The same theoretical considerations and mathematical findings apply to each. The contrast between simple deterministic and complex stochastic systems should not blind us to their fundamental equivalence: they operate in the same way. What we do need to realise is that full control of complex systems is not possible; we can only achieve a partial regulation. In terms of humankind this takes the form of knowledge. Our understanding, our mental abilities, reflect the building of a cybernetic control system in our minds. We internalise the world outside us; we build a regulator in a form that enables us to maintain a stable social state in the face of multiple disturbances. Our cells and bodies achieved the same in the distant past; each level of our reality is in essence a cybernetic control system of great subtlety. In these systems we rarely compensate directly for a disturbance (resisting an approaching truck is hard!) but use indirect techniques to change the entire scenario (we step back instead). We employ, in other words, other dimensions.

Those wishing to pursue cybernetic ideas more deeply are referred to W. Ross Ashby's excellent book 'Introduction to Cybernetics' plus the other resources in Principia Cybernetica's Electronic Library.

Page Version 4.83 March 2004 (paper V1.0 June 1999)