"There is no sense being precise when you don't even know what you're talking about"
John von Neumann
"I have called this principle, by which each slight variation, if useful, is preserved, by the term Natural Selection."
Charles Robert Darwin, The Origin of Species, 1859, ch. 3
Over vast eons the processes of evolution have created a common basis for all life on Earth - that of DNA, or its simpler relative RNA. Both store information as a discrete code, just as that greatest of human inventions, the computer, does, yet the modes of operation of the two systems could hardly be more different.
In our use of computers we first decide on an overall goal, then break the problem down into manageable chunks and write step-by-step procedures (algorithms) by which to accomplish them. This mirrors our normal approach to life, in which we decide what we wish to do and plan out a way of achieving our ambition, taking into account the resources and time needed. If the plan doesn't work we modify it and try again - a serial, one-at-a-time approach to achievement.
Nature has a different way: brute force and ignorance. If there are a hundred possible options available, we'll try them all - the first to get there wins. All options are tried simultaneously - a parallel approach to achievement. But where is "there"? It just doesn't matter! At first sight, to human eyes, this is a strange concept - how can you get anywhere useful without knowing where it is?
"Ignorance is bliss" goes the saying, and we humans certainly start from a state of ignorance. Each of us can only plan for what we already know; the constraints of our limited knowledge restrict what we can imagine - we often cannot see beyond our assumptions. To overcome these limitations we must rely on being taught new ideas by someone we trust, or on stumbling across new discoveries by accident - either way we expand our mental horizons and so see new possibilities that were previously unthinkable.
By adopting nature's method and trying every possible option we can arrive at every destination simultaneously. Thus we can enjoy every possible experience. Wow, what a lifestyle! Is there a snag? Of course. We consume a hundred times as many resources for every hundred possible options - and the options increase geometrically (like sand grains doubling for every square on a chessboard); the universe just isn't big enough. Note, however, that the conformist social equivalent of everybody doing exactly the same thing (all making the same mistake...) uses just as many resources yet achieves nothing better - given the considerable waste of duplication involved, our planet isn't big enough for this either!
But nature already has the solution: tree pruning. Imagine we have two options, each of which leads to two more, and so on; drawn out, this gives a picture like a branching tree. Suppose we impose a test on each option: if the test fails, that option (branch) is deleted (in this illustration we eliminate any mouse genes that fail to find the central food). Once we have done this we have pruned the tree of all failed branches, and those left have passed all our tests (this is similar to how chess programs work). But which test does nature use? A very simple one - survival itself. Any organisms unable to survive and reproduce die out, their genes pruned from the available options.
Can computers operate as nature does? Yes. This is the general field known as Evolutionary Computation, of which the Genetic Algorithm is the basic technique; it can be further extended to remove the need for an underlying program, in the field of Genetic Programming.
How does it work? Well, we create a population of individuals, each represented by a chromosome (a collection of genes, or characteristics) appropriate to the problem we are trying to investigate. Genes are usually implemented as a series of computer bits, enough to cover all the available alternative values. The value of each gene is initially assigned at random, within the available range. The population is then evaluated to determine how well each individual performs the required task.
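A minimal sketch of such an encoding (the 5-bit gene width, three genes per chromosome and population size are all illustrative assumptions): each gene is a run of bits decoded to an integer value, and the initial population is filled at random.

```python
import random

GENE_BITS = 5      # enough bits to cover all the alternative values (0..31)
GENES = 3          # genes per chromosome (illustrative)

def random_chromosome():
    """A chromosome is just a string of randomly assigned bits."""
    return [random.randint(0, 1) for _ in range(GENE_BITS * GENES)]

def decode(chromosome):
    """Read each gene's run of bits as an integer value."""
    values = []
    for g in range(GENES):
        bits = chromosome[g * GENE_BITS:(g + 1) * GENE_BITS]
        values.append(int("".join(map(str, bits)), 2))
    return values

population = [random_chromosome() for _ in range(10)]
print(decode(population[0]))   # e.g. three values in the range 0..31
```

Evaluation then simply maps each decoded chromosome onto the task and records a fitness score for it.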
The best members of the population (typically the top 10%) then become the parents of the next generation. The offspring are created by selecting two parents at random and combining part of the chromosome of each, so parent gene combinations ABC and DEF may become two offspring ABF and DEC. This mimics the crossover, or recombination, of sexual reproduction in nature, and is repeated for further pairs until a full second population has been created. Random changes to individual bits are then made, mimicking the natural role of mutation, at some desired rate (say 1 bit in 100 changed).
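The two operators above are short enough to write out directly. A sketch (the cut-point and rate parameters are illustrative; the `cut` argument is only exposed so the ABC/DEF example can be reproduced exactly):

```python
import random

def crossover(parent_a, parent_b, cut=None):
    """Swap chromosome tails at a cut point: ABC + DEF -> ABF, DEC."""
    if cut is None:
        cut = random.randint(1, len(parent_a) - 1)   # random crossover point
    return (parent_a[:cut] + parent_b[cut:],
            parent_b[:cut] + parent_a[cut:])

def mutate(chromosome, rate=0.01):
    """Flip each bit with the given probability (say 1 in 100)."""
    return [1 - bit if random.random() < rate else bit
            for bit in chromosome]

print(crossover("ABC", "DEF", cut=2))   # ('ABF', 'DEC')
```

Crossover recombines material already present in good parents; mutation injects the small amount of novelty that crossover alone cannot supply.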
The resultant population is then evaluated as before, and the process is repeated as many times as desired, until the required performance level has been achieved (or no further improvement seems possible). This technique can rapidly cover the space of all possible options and converge on a solution beyond the ability of all but the best human programmers, in areas where no conventional solution techniques exist.
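Putting the pieces together, the whole generational loop fits in a page. This is a sketch under illustrative assumptions: the task (maximize the count of 1-bits, the classic "one-max" toy problem), the population size, the top-10% parent fraction and the mutation rate are all placeholders for whatever the real problem requires.

```python
import random

BITS, POP, TOP, MUT_RATE = 20, 100, 10, 0.01   # illustrative parameters

def fitness(chrom):
    """Illustrative task: score a chromosome by its count of 1-bits."""
    return sum(chrom)

def evolve(generations=50):
    # initial population: random bits throughout
    pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
    for _ in range(generations):
        # evaluate, then keep the best members (the top 10%) as parents
        parents = sorted(pop, key=fitness, reverse=True)[:TOP]
        next_pop = []
        while len(next_pop) < POP:
            a, b = random.sample(parents, 2)       # two parents at random
            cut = random.randint(1, BITS - 1)      # crossover point
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                # mutation: flip each bit at the desired rate
                next_pop.append([1 - g if random.random() < MUT_RATE else g
                                 for g in child])
        pop = next_pop[:POP]
        if any(fitness(c) == BITS for c in pop):   # stop when performance
            break                                  # target is reached
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "of", BITS)
```

On this toy task the loop converges in a handful of generations; on a real problem only the fitness function and the encoding change.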
In the Genetic Programming extension, the genes are the valid statements of a computer language and the chromosome is itself a program. Additional operations are allowed to join or split program segments, allowing variable-length programs to evolve. By GA techniques (largely excluding mutation) we can thus allow a program to create itself to solve a particular problem, discarding all the failed programs. This has proved a very powerful way of generating solutions to very difficult problems, but with the drawback that the resultant code is all but incomprehensible to humans!
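A heavily simplified sketch of the idea. Everything here is an illustrative assumption: the tiny three-operator "language", the target function (x² + 2x), the tree depth and the population parameters - real GP systems are far richer. Programs are nested tuples of valid statements, bred by grafting subtrees from one program into another, with failed programs discarded each generation.

```python
import random

# Tiny illustrative "language": three operators plus terminals, with a
# program held as a nested tuple, e.g. ("+", ("*", "x", "x"), 2).
OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}
TERMS = ["x", 1, 2, 3]

def random_program(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(list(OPS)),
            random_program(depth - 1), random_program(depth - 1))

def run(prog, x):
    """Interpret a program tree for a given input x."""
    if prog == "x":
        return x
    if not isinstance(prog, tuple):
        return prog                      # a numeric constant
    op, left, right = prog
    return OPS[op](run(left, x), run(right, x))

def pick(t):
    """Descend to a random subtree of t."""
    while isinstance(t, tuple) and random.random() < 0.5:
        t = random.choice(t[1:])
    return t

def graft(a, b):
    """Crossover for trees: replace a random piece of program a with a
    piece of program b, so programs can grow or shrink (variable length)."""
    if not isinstance(a, tuple) or random.random() < 0.5:
        return pick(b)
    op, left, right = a
    if random.random() < 0.5:
        return (op, graft(left, b), right)
    return (op, left, graft(right, b))

def error(prog):
    """Illustrative target: how far the program is from x*x + 2*x."""
    return sum(abs(run(prog, x) - (x * x + 2 * x)) for x in range(-5, 6))

def evolve(pop_size=200, gens=30):
    pop = [random_program() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=error)
        survivors = pop[:pop_size // 10]     # failed programs are discarded
        pop = survivors + [graft(random.choice(survivors),
                                 random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=error)
```

The best evolved tree typically tracks the target closely, but - true to the drawback noted above - its structure is rarely anything a human would have written.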
There is a difficulty here. If our test is too specific, the program may be perfect for that one task, yet fail completely on very similar tasks. How can we overcome this? Again, by copying nature. Suppose we wish to create a program to sort a series of numbers. A fixed test sequence of numbers brings the problem just mentioned, so let us have many sets of numbers, as many as we have candidate programs. Programs then encounter different tests at each stage, and only those capable of a general solution will survive. But we can do better still: let us evolve the tests by the same process, the surviving tests being the ones that defeat a program. Using this coevolution strategy the tests get harder as the programs get better, and this bootstrapping feedback technique gives a considerable improvement in speed and efficiency. If, rather than programs and tests, we have interacting agents, we enter the realm of Artificial Life.
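The shape of the coevolutionary loop can be sketched abstractly. All details here are illustrative assumptions - both "programs" and "tests" are reduced to bit-strings, and the pass/fail criterion (matching the test pattern on more than three quarters of its bits) stands in for actually running a program on a test case. What the sketch keeps is the essential structure: each side is scored against the current opposing population, and both are bred by the same GA machinery.

```python
import random

BITS, POP = 16, 30   # illustrative sizes

def passes(program, test):
    """Stand-in criterion: a program 'solves' a test if it matches the
    test pattern on more than three quarters of its bits."""
    matches = sum(p == t for p, t in zip(program, test))
    return matches > BITS * 3 // 4

def breed(parents):
    """Crossover plus light mutation, exactly as in the basic GA."""
    out = []
    while len(out) < POP:
        a, b = random.sample(parents, 2)
        cut = random.randint(1, BITS - 1)
        child = a[:cut] + b[cut:]
        out.append([1 - g if random.random() < 0.02 else g for g in child])
    return out

def coevolve(gens=40):
    rand_pop = lambda: [[random.randint(0, 1) for _ in range(BITS)]
                        for _ in range(POP)]
    programs, tests = rand_pop(), rand_pop()
    for _ in range(gens):
        # a program scores by the tests it passes;
        # a test scores by the programs it defeats
        p_score = lambda p: sum(passes(p, t) for t in tests)
        t_score = lambda t: sum(not passes(p, t) for p in programs)
        best_programs = sorted(programs, key=p_score, reverse=True)[:POP // 3]
        best_tests = sorted(tests, key=t_score, reverse=True)[:POP // 3]
        programs, tests = breed(best_programs), breed(best_tests)
    return programs, tests

programs, tests = coevolve()
```

As the programs close in on the test patterns, the surviving tests drift away from them, so each side keeps pushing the other - the bootstrapping feedback described above.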
Can we use these techniques in human thought processes? Perhaps we already do. Whenever we think of a task, do we not automatically generate many possible solutions and then evaluate them (subconsciously in most cases) against our goal, discarding all those that fail our criteria? We then refine the survivors (combining the best parts of each), and so on until we have an optimum solution. Nature has again beaten us to it...
Socially, a similar process is common. Brainstorming is a popular management technique for generating diverse pseudo-random solutions to a problem - equivalent, perhaps, to nature's mix-and-match juggling of genes. The group then evaluates (tests) the options and prunes those failing some criterion; the winners propagate to the next stage and perhaps ultimately into production. Alongside top-down (analysis) and bottom-up (synthesis) strategies - both goal-directed - we also operate a parallel evolutionary strategy: companies (alternative chromosomes) compete to grow, and ideas (memes - cultural genes) compete for attention. Evolution is embedded in our behaviour!
The technique of Genetic Algorithms is widely applicable, even in art. If we code genes to refer to features of an image or a piece of music, we can evolve extraordinary, organically based artforms. Any system of building blocks can be readily adapted to these procedures and in this way automatically transcend the imagination of its creator. By constraining the system with the desired criteria (tests) we can allow design processes to evolve as a background task - resulting in quite unexpected innovations. Nature's way.
We must, however, be clear that we are not fully imitating nature here. Our artificial genes (called G-types) have well-defined correspondences with function or form (called P-types). We also have only one chromosome, of very limited length, in our genome, whereas natural genotypes contain many chromosomes, each with thousands of genes. In nature, genes translate to body form (phenotype, or morphology) via a very complex interactive process; the form is itself an emergent phenomenon, dependent on the interaction of the genes with themselves, with the cell chemistry and with the surrounding environment. We cannot predict this process adequately without understanding the nature of emergence. Genetic engineering, whilst highly useful, is a very dangerous process: the side effects of any changes to such a complex and highly nonlinear process as life are, to say the least, unpredictable.