"Our impression that life evolves towards greater complexity is probably only a bias inspired by parochial focus on ourselves." (Stephen Jay Gould) 

The modern awakening of interest in complexity as a science began in Vienna in 1928, with Ludwig von Bertalanffy's largely descriptive graduate thesis on living organisms as systems. A few years earlier, Alfred North Whitehead had described a similar vision of a "philosophy of organism" in Science and the Modern World. Whitehead describes his theory of the organic conception of nature as based on "self-knowledge of our bodily event." This total bodily event is on the same level as all other events, except for an unusual complexity and stability of inherent pattern. (Science and the Modern World, p. 73)

Today the study of complex, self-organizing systems is the "related opposite" of the study of chaos: in self-organizing systems, orderly patterns emerge out of lower-level randomness; in chaotic systems, unpredictable behavior emerges out of lower-level deterministic rules.
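That contrast can be made concrete with a standard toy example (a sketch of mine, not drawn from the text): the logistic map is a fully deterministic one-line rule, yet two trajectories started a hair apart quickly diverge until they are effectively unrelated.

```python
# A deterministic one-line rule whose iterates are effectively
# unpredictable: the logistic map x -> r*x*(1-x) at r = 4.0.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def trajectory(x0, steps, r=4.0):
    """Iterate the map `steps` times, returning the whole orbit."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

a = trajectory(0.300000, 50)
b = trajectory(0.300001, 50)  # same rule, starting point perturbed by 1e-6

# Early on, the two orbits track each other closely; within a few dozen
# steps the tiny initial error has been amplified until the orbits bear
# no relation to one another.
print(abs(a[5] - b[5]))
print(max(abs(x - y) for x, y in zip(a, b)))
```

The rule itself contains no randomness; the unpredictability lies entirely in the exponential amplification of measurement error.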

In contemporary accounts of complex adaptive systems, a large number of independent adaptive agents interact with each other in a kaleidoscopic array of simultaneous nonlinear interactions. Chaotic behavior at one level can give rise to distinctive order at the next. Feedback and feedforward loops permeate their decentralized structures. (cf network) The richness of these interactions allows the system as a whole to undergo spontaneous self-organization.

"What is being recognized within the 'sciences of complexity' is that there are characteristic types of order that emerge from the interactions of many different components...especially in relation to the large-scale aspects of structure or morphology, and the patterns in time that constitute organismic behavior." (Goodwin, p. ix)

Economies, ecologies, immune systems, developing embryos, and the brain are all examples of complex adaptive systems. The impact of these systems usually centers on their aggregate behaviour, which often feeds back to the individual parts and evolves over time.

These systems are adaptive. They are not chaotic, nor are they merely complex in a static sense. According to some authors, complex systems have somehow acquired the ability to bring chaos and order into a special kind of balance, often called the edge of chaos. (see phase boundary) Stuart Kauffman often characterizes the three dynamical regimes as solid or frozen (the ordered regime), as gas (the chaotic regime), and as liquid (the complex regime). These systems face perpetual novelty, and generally operate far from a global optimum or equilibrium. (cf dissipative systems)
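Kauffman's solid/liquid/gas imagery can be loosely illustrated (an illustration of mine, not drawn from the text) with Wolfram's elementary cellular automata: the same kind of deterministic local rule produces frozen order for some rule numbers and apparently random wandering for others, with rule 110 the classic in-between, "liquid" case.

```python
def step(cells, rule):
    """One synchronous update of an elementary cellular automaton on a
    ring, using Wolfram's standard 8-bit rule numbering."""
    n = len(cells)
    return tuple(
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    )

def steps_until_repeat(rule, width=31, limit=200):
    """Evolve from a single live cell; return the first step at which a
    previously seen state recurs, or None if none recurs within `limit`."""
    cells = tuple(1 if i == width // 2 else 0 for i in range(width))
    seen = {cells}
    for t in range(1, limit + 1):
        cells = step(cells, rule)
        if cells in seen:
            return t
        seen.add(cells)
    return None

# Rule 250 ("solid") freezes into a fixed state almost immediately;
# rule 30 ("gas") typically shows no recurrence within the limit.
print(steps_until_repeat(250))
print(steps_until_repeat(30))
```

The analogy is only suggestive: Kauffman's regimes concern random Boolean networks rather than one-dimensional automata, but the qualitative split between frozen, chaotic, and intermediate dynamics is the same.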

In seeking to adapt to changing circumstances, complex systems anticipate. For Stuart Kauffman, "Selection achieves and maintains complex systems poised on the boundary, or edge, between order and chaos. These (poised) systems are best able to coordinate complex tasks and evolve in a complex environment." (p. xv) (see evolution) Steven Pinker echoes Darwin in claiming that "Natural selection is the only scientific explanation of adaptive complexity...", that is, of the characteristic of "any system composed of many interacting parts where the details of the parts' structure and arrangement suggest design to fulfill some function."

Murray Gell-Mann describes complex adaptive systems as "pattern-recognition devices that seek to find regularities in experience and compress them into schemata." In his book on vision, David Marr proposes that three levels of understanding are required for the analysis of a complex information-processing system. At one extreme, the top level, is the abstract computational theory of the device, in which the performance of the device is characterized as a mapping from one kind of information to another, the abstract properties of this mapping are defined precisely, and its appropriateness and adequacy for the task at hand are demonstrated. At an intermediate level is the choice of representation for the input and output, and of the algorithm to be used to transform one into the other. At the other extreme are the details of how the algorithm and representation are realized physically: the detailed computer architecture, so to speak. (pp. 24-25) For Marr, "These three levels are coupled, but only loosely."

In his book Complexification, John Casti refers to a cartoon of two scientists arguing over the meaning of complexity. In suitably dogmatic terms, the first scientist asserts that "Complexity is just what you don't understand." Responding to the claim, his colleague replies, "You just don't understand complexity." 

Casti's book provides us with formalizations for appreciating the joke. For example, we can read it as another instance of the liar paradox or of the self-amendment problem in law, understand its humor as a dimensional jump from level N to level N+1, or take it as a Gödel-like statement of the problems of formalization. (Gregory Bateson makes a similar use of the ideas of logical types and levels of abstraction in his discussion of schizophrenia.) For Casti, the modelling relation of formal systems to natural systems has come upon the problem of complexity and not yet become a science.

For Casti, complexity is an inherently subjective concept. This is not to say that it is purely subjective, but rather that the complexity of a system is a joint property of the system and its interaction with another system, most often an observer and/or controller. Thus complex systems often generate counterintuitive behaviour. He points out that human societies have evolved to the point where traditional maps no longer match our collective experience for very long. "Thus, by coming up with a workable (i.e., scientific) theory of complexity, we can hope to internally represent the experience of change by describing our collective reality as a process. This, in turn, would be a major step toward the development of a framework within which we can begin to understand how to control and manage what our maps tell us are complex processes." (Complexification, p. 273)

Casti describes objective ways to characterize some aspects of a system's complexity. Algorithmically speaking, for example, complexity is generally measured by compressibility, e.g. by the length of the shortest program that will cause a Turing machine to print out a given number. Randomness is considered to be incompressible; that is to say, the description of the object is no more compact than the object itself. Thus, to explore a complex system, there are no shortcuts to running the system itself. "Logical depth" is another way of describing this computational complexity. Logical depth is defined as the time needed (measured in number of computational steps) for the shortest possible program to generate the structure. A truly deep structure is thus characterized by the mathematical property that it cannot be generated in fewer computational steps via a simulation on any other computer.
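As a rough, computable stand-in for this idea (my illustration, not Casti's; the shortest-program measure itself is uncomputable in general), an off-the-shelf compressor already separates patterned from random data:

```python
# Using zlib compression as a crude proxy for description length:
# a highly ordered string compresses to a small fraction of its size,
# while random bytes admit no description shorter than themselves.
import os
import zlib

ordered = b"ab" * 50_000       # 100,000 bytes of pure repetition
random_ = os.urandom(100_000)  # 100,000 bytes of randomness

def ratio(data):
    """Compressed size as a fraction of original size."""
    return len(zlib.compress(data, 9)) / len(data)

print(f"ordered: {ratio(ordered):.3f}")  # a tiny fraction: the pattern is a shortcut
print(f"random:  {ratio(random_):.3f}")  # about 1.0: no shorter description found
```

A real compressor only upper-bounds description length, but the qualitative point survives: compressibility is a workable operational measure of the order/randomness distinction.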

Casti believes that the formalization of complexity requires the explicit recognition of the observer. For him, the complexity of a system as seen by an observer is directly proportional to the number of inequivalent descriptions of the system that the observer can generate, descriptions counting as inequivalent when one bifurcates from another. The choice of descriptions of complex systems depends on contexts of understanding and interpretation.

Niklas Luhmann calls autopoietic systems "complex self-referential systems." These systems make, and continue to make, a difference between the system and its environment; they are operationally closed. Luhmann describes complex self-referential systems as "hypercomplex" in that they not only make distinctions between themselves and their environments, but also internalize that distinction in a paradoxical "re-entry" of the distinction into the distinguished.