Monday, March 28, 2011

Some thoughts about complexity

Information and order are inversely correlated, and this has to do with how the terms are defined. The more ordered something is, the less information you need to describe it; the more disordered something is, the more information you need to describe it. This runs somewhat counter to the everyday, casual use of the words “information” and “order”—at least for me—so I use the example of a ransacked library to keep the concept straight in my mind. A ransacked library is highly disordered; all the books have been thrown off the shelves and lie in mountainous heaps on the floor. A concise Dewey Decimal System call number is no longer sufficient for locating a book, especially if the book has been ripped into many pieces. Instead, the book and each of its scattered pages must be individually specified by precise directions: e.g., the twelfth book from the bottom in the heap located twenty-six inches immediately south of the corner of the east end of the shelf formerly containing D-E children's fiction.
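
One crude way to make this concrete is to let a compressor stand in for description length: an ordered sequence squeezes down to almost nothing, while a shuffled copy of the same bytes takes far more to describe. Here is a rough Python sketch, with zlib playing the role of the describer and the particular pattern chosen arbitrarily:

    import random
    import zlib

    # An "ordered library": a short pattern repeated many times.
    ordered = b"ABCDEFGHIJ" * 1000

    # A "ransacked library": the same bytes, shuffled into disorder.
    pile = list(ordered)
    random.shuffle(pile)
    disordered = bytes(pile)

    # Compressed size is a crude stand-in for how much information
    # is needed to describe each arrangement.
    print(len(zlib.compress(ordered)))     # small: order is cheap to describe
    print(len(zlib.compress(disordered)))  # much larger: the pile takes far more to describe

The shuffled pile still compresses somewhat, since only ten distinct symbols appear in it, but the gap is the point: the ordered shelf needs only a short rule, while the heap needs something much closer to an item-by-item listing.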

If information and order are something of opposites, what then is the relationship between information and complexity? Complexity is closely tied to order; it appears to arise in systems possessing well-ordered heterogeneity. So are information and complexity something of opposites, too? Could complexity be defined in part as a lack of information? That certainly runs counter to the casual use of the terms!

The complex things we readily observe in the world appear to be hallmarks of embodied information, not islands somehow freer of information than the simpler things of the world. Imagine the space shuttle and the immense web of technical specifications and knowledge needed to design, build, maintain, and operate it. How could complexity ever be opposed to information?

This exposes a problem with the casual notion of complexity: we tend to think of bigger things as being more complex, on average, than smaller things. If we're aiming to isolate the essence of complexity itself, this is no good; we're letting additional characteristics—e.g., bigness—cloud whatever sense we can make of complexity. In the case of the space shuttle, it seems complex because it is big. (It also seems complex because it's high-tech, but that's a whole other point.) As far as embodied information goes, I imagine the shuttle is a good deal easier to describe in full than nearly any other billions-of-dollars set of things you care to name. I write that with some measure of certainty because the shuttle is fully described—or very nearly so—through endless stacks of engineering specs.

Can complexity be normalized to capture its essence better? That is, if two things, A and B, are equally complex and A is twice as big as B, then we would say that A possesses half the normalized complexity of B. But how should we measure size? In grams? In dollars? In seconds of existence? It isn't clear, but mass seems like a good place to start.
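
To pin down what I mean, here's a toy sketch in Python, with mass standing in for size and the numbers chosen arbitrarily as placeholders:

    def normalized_complexity(complexity, mass_kg):
        # Complexity per unit mass, in whatever units "complexity" ends up taking.
        return complexity / mass_kg

    # A and B are equally complex, but A is twice as big.
    complexity_A = complexity_B = 1000.0
    mass_B = 10.0
    mass_A = 2 * mass_B

    print(normalized_complexity(complexity_A, mass_A))  # 50.0
    print(normalized_complexity(complexity_B, mass_B))  # 100.0, twice A's value

Dividing by mass is only one choice among many, but it at least makes "half the normalized complexity" an arithmetic statement rather than a hand-wave.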

A few Google searches for the number of parts in a space shuttle didn't turn up anything—maybe that's not public information—but a 747 contains on the order of a million separate parts, so I estimate that the shuttle has on the order of ten or a hundred million parts. Contrast that with the nervous system of a cockroach, which has on the order of a million neurons. But a cockroach's nervous system is far, far smaller than the shuttle. Its “complexity density,” as determined by its parts-to-mass ratio, is therefore greater. To some extent this makes sense in that we can design and understand shuttles but as yet lack the ability to design and understand cockroach brains, though we may be getting close. Perhaps complexity density explains why a big but less-complexity-dense entity like the shuttle yields itself to brute-force engineering more readily than do organic, well-ordered entities such as cockroaches.
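
To put rough numbers on it: the part counts below come straight from the estimates above, while the masses are nothing more than my own order-of-magnitude guesses, so take the absolute values loosely.

    # Ballpark figures only; part counts from the estimates above,
    # masses are rough order-of-magnitude guesses.
    shuttle_parts = 10**7             # ten million parts (low end of the estimate)
    shuttle_mass_kg = 10**5           # very roughly, the orbiter alone

    roach_neurons = 10**6             # about a million neurons
    roach_nervous_system_kg = 10**-5  # order of ten milligrams, a guess

    def complexity_density(parts, mass_kg):
        # Parts per kilogram as a crude stand-in for complexity per unit mass.
        return parts / mass_kg

    print(complexity_density(shuttle_parts, shuttle_mass_kg))          # ~1e2 parts/kg
    print(complexity_density(roach_neurons, roach_nervous_system_kg))  # ~1e11 parts/kg

Even if my mass guesses are off by a couple of orders of magnitude, the ratio still lands overwhelmingly in the cockroach's favor, which is the point of the comparison.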
