Complexity: Difference between revisions

From Wikipedia, the free encyclopedia
Ripper234 (talk | contribs)
Link to Computational Complexity
*[http://euromed.blogs.com Complexity, Knowledge and Learning] - A blog on complexity, knowledge and learning
*[http://www.santafe.edu Santa Fe Institute] - an institute dedicated to complexity research
*[http://www.swemorph.com/pdf/it-webart.pdf Modelling Complex Socio-Technical Systems using Morphological Analysis] From the [http://www.swemorph.com Swedish Morphological Society]


[[Category:Abstraction]]

Revision as of 21:55, 3 April 2006

Complexity is the opposite of simplicity.

Complexity in systems or behaviour is often described as lying "on the edge of chaos" - between order and randomness.

Study of complexity

Complexity has always been a part of our environment, and therefore many scientific fields have dealt with complex systems and phenomena. Indeed, some would say that only what is somehow complex - what displays variation without being purely random - is worthy of interest.

While this has led some fields to come up with specific definitions of complexity, there is a more recent movement to regroup observations from different fields in order to study complexity in itself, whether it appears in anthills, human brains, or stock markets.

Complex systems

Systems theory has long been concerned with the study of complex systems (in recent times, complexity theory and complex systems have also been used as names of the field). These systems can be biological, economic, technological, etc.

Complex systems tend to be high-dimensional and non-linear, but may exhibit low dimensional behaviour.

Complex mechanisms

Recent developments around artificial life, evolutionary computation and genetic algorithms have led to an increasing emphasis on complexity and complex adaptive systems.

Complex simulations

In social science, the study of the emergence of macro-properties from micro-properties is known as the macro-micro view in sociology. The topic is commonly recognized as social complexity, and is often associated with the use of computer simulation in social science, i.e. computational sociology.

Complex behaviour

The behaviour of a complex system is often said to be due to emergence and self-organization.

Chaos theory has investigated the sensitivity of systems to variations in initial conditions as one cause of complex behaviour.
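As a minimal sketch of this sensitivity, consider the logistic map x → r·x·(1 − x) in its chaotic regime (r = 4) - a standard textbook example from chaos theory, chosen here for illustration rather than taken from this article. Two trajectories that start a billionth apart quickly become completely uncorrelated:

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n), in its chaotic regime (r = 4).
def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.20)
b = logistic_orbit(0.20 + 1e-9)  # perturb the start by one part in a billion

# The gap between the two orbits grows roughly exponentially until it
# saturates at the size of the attractor itself.
for n in (0, 10, 30, 50):
    print(f"n={n:2d}  |a-b| = {abs(a[n] - b[n]):.3e}")
```

The starting values and the number of steps are arbitrary; any tiny perturbation gives the same qualitative picture.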

One of the main claims in Stephen Wolfram's book A New Kind of Science is that such behaviour can be generated by simple systems, such as the rule 110 cellular automaton.
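Rule 110 is simple enough to state in a few lines: each cell's next state depends only on its own state and its two neighbours, via a fixed 8-entry table derived from the number 110. The sketch below (grid width, initial condition, and periodic boundary are arbitrary choices, not part of Wolfram's definition) prints the automaton's characteristic irregular triangles:

```python
# Rule 110: the new state of each cell is bit i of the number 110,
# where i is the 3-bit value of the (left, centre, right) neighbourhood.
RULE = 110
TABLE = {tuple(int(b) for b in f"{i:03b}"): (RULE >> i) & 1 for i in range(8)}

def step(cells):
    n = len(cells)
    # periodic boundary: the row wraps around at the edges
    return [TABLE[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

width = 31
row = [0] * width
row[-1] = 1  # a single live cell as the initial condition
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Despite the trivial update rule, rule 110 is known to be Turing complete, which is what makes it a striking example of complex behaviour from a simple system.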

Complexity in data

In information theory, algorithmic information theory is concerned with the complexity of strings of data.

Complex strings are harder to compress. Intuition suggests that this may depend on the codec used to compress a string (a codec could in theory be created in any arbitrary language, including one in which the very short command "X" causes the computer to output a very complicated string like "18995316"). However, any two Turing-complete languages can be implemented in each other, meaning that the lengths of two encodings in different languages will differ by at most the length of the "translation" program - which becomes negligible for sufficiently large data strings.

Note that these algorithmic measures of complexity tend to assign high values to random noise. However, those studying complex systems would not consider randomness to be complexity.
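Both points can be illustrated with an ordinary compressor standing in for algorithmic complexity (zlib is an arbitrary practical choice here, not Kolmogorov complexity itself): a highly patterned string compresses far below its length, while random bytes are essentially incompressible - the compressor assigns them "high complexity":

```python
# Compressed size as a rough practical proxy for the algorithmic
# complexity of a string of data.
import os
import zlib

regular = b"ab" * 5000        # highly patterned, 10000 bytes
random_ = os.urandom(10000)   # random noise, 10000 bytes

print(len(zlib.compress(regular)))  # tiny: the pattern is easy to describe
print(len(zlib.compress(random_)))  # close to 10000: no pattern to exploit
```

The compressed size of the random string is near its original length, even though a student of complex systems would call it noise rather than something complex.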

Information entropy is also sometimes used in information theory as indicative of complexity.
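A short sketch of what that means in practice: the Shannon entropy of a string's empirical character distribution measures, in bits per symbol, how "mixed" the data is (the example strings are arbitrary):

```python
# Shannon entropy, in bits per symbol, of the empirical character
# distribution of a string: H = sum over symbols of p * log2(1/p).
import math
from collections import Counter

def entropy(s):
    counts = Counter(s)
    n = len(s)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(entropy("aaaaaaaa"))  # 0.0 bits: one symbol, no uncertainty
print(entropy("aabb"))      # 1.0 bit: two equally likely symbols
print(entropy("abcdefgh"))  # 3.0 bits: eight equally likely symbols
```

Like the compression measure, entropy is maximal for uniformly random data, so it shares the caveat that high entropy need not mean "complex" in the complex-systems sense.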

Information entropy and thermodynamic entropy are related through statistical mechanics, an area whose implications are still emerging. The connection is not well determined: assessments of it range from a trivial formal resemblance to a relationship of potentially great value. Entropy, for instance, appears as a common variable in both the information entropy equation S = -k log(p) and the Gibbs free energy equation G = U - TS, but eliminating it between the two makes the information content disappear, and only by working through the equations again does it reappear, often in a somewhat different form. It has also been suggested, much more speculatively, that computer programs slowly become less reliable as their size increases, because the meaning carried by individual bits and bytes of the program is lost at the extremes of the system's thermodynamics.

Complexity of problems

Computational complexity theory is the study of the complexity of problems - that is, the difficulty of solving them. Problems can be classified by complexity class according to the time it takes for an algorithm to solve them as a function of the problem size. For example, the travelling salesman problem can be solved in time O(n^2 2^n) (where n is the size of the network to visit).
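To make the scaling concrete, here is the naive brute-force approach, which examines all (n - 1)! tours and therefore becomes infeasible very quickly as n grows (the 4-city distance matrix is invented for illustration; the dynamic-programming bound quoted above is a separate, faster algorithm):

```python
# Brute-force travelling salesman: try every tour that starts and ends
# at city 0 and keep the shortest. (n-1)! tours are examined, so the
# running time explodes factorially with the number of cities.
from itertools import permutations

# symmetric distance matrix for 4 hypothetical cities
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]

def shortest_tour(dist):
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):  # fix city 0 as the start
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

print(shortest_tour(D))  # → (18, (0, 1, 3, 2, 0))
```

At n = 20 this approach would already need more than 10^17 tours, which is why classifying such problems by their complexity matters.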

Specific meanings

In several scientific fields, "complexity" has a specific meaning:

Quotes about complexity

  • "The complexity of a document is proportional to the number of fingers that you need to read it." - DeMarco's Law, a paraphrase of Tom DeMarco. For example, "The complexity of a computer program is proportional to the number of fingers you need to read it."
  • "The essence of tyranny is the denial of complexity." - Jacob Burckhardt, Swiss historian.
  • "Some days I will say yes, and then odd days it seems things say yes to me. And stranger still, there are those times when I become a yes." (And they are moments of the Calm) -Kevin Hart, quoted by Mark Taylor in 'The Moment of Complexity'

References

  • Roger Lewin. Complexity: Life at the Edge of Chaos. Macmillan, 1992.