What Exactly Is Complexity?
Take apart a mechanical watch. Lay out every gear, spring, and jewel on a table. Study each piece carefully, understand its shape and function, then reassemble the whole thing. It will work exactly as before. You’ve understood the watch completely by understanding its parts. Now try the same experiment with a living cell. Disassemble a cell into its component molecules — the lipids, proteins, nucleic acids, sugars — and lay them out. Study each one in isolation. You will learn a great deal about chemistry, but you will learn almost nothing about what makes a cell alive. Reassembly is not an option. Something about the cell exists only when the parts are together, interacting, in a particular dynamic arrangement.
The watch is complicated. The cell is complex. We use these words interchangeably in everyday speech, but they point to a distinction that matters enormously — and that the sciences have struggled to make precise for decades.
Complicated vs. Complex
A complicated system has many parts, but the whole is essentially the sum of those parts. It is predictable from its specification. Given the blueprint, we can deduce behavior. A jet engine is complicated — thousands of components, extraordinary engineering — but an engineer with the right documentation can predict exactly how it will perform. The key feature: we can take it apart, study each piece in isolation, and the knowledge transfers back to the whole. The parts explain the system.
A complex system is different in kind, not just degree. Its wholes have properties that none of its parts possess. A neuron does not think. A water molecule is not wet. A single trader is not a market price. These emergent properties — thought, fluidity, pricing — exist only at the level of the collective, arising from interactions among the parts. Studying the parts in isolation, no matter how carefully, does not give us the whole. In other words, a complex system is one that cannot be fully understood by decomposition. Something essential is lost when we pull it apart.
The Measurement Problem
If complexity is real and important, we ought to be able to measure it. But this has proven surprisingly difficult. Several candidates have been proposed over the decades, and each captures part of the picture while missing the rest.
Kolmogorov complexity, named after the Soviet mathematician Andrey Kolmogorov, measures the length of the shortest computer program that can produce a complete description of the system. A simple pattern (like “repeat 01 a thousand times”) has low Kolmogorov complexity because a short program can generate it. A random string has high Kolmogorov complexity because the shortest program is basically the string itself — there are no shortcuts. The problem is that this measure treats randomness as maximally complex, which doesn’t match our intuitions. A random string feels chaotic, not complex. And the measure doesn’t capture emergence at all.
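Kolmogorov complexity is uncomputable in general, but any real compressor gives an upper bound: if a string compresses well, a short description of it exists. A minimal sketch, using Python's zlib purely as a stand-in for "shortest program":

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity: the length of the
    zlib-compressed representation. Shorter means more regular."""
    return len(zlib.compress(data, level=9))

simple = b"01" * 1000  # "repeat 01 a thousand times"

random.seed(0)
noise = bytes(random.getrandbits(8) for _ in range(2000))  # no shortcuts

print(description_length(simple))  # tiny: a short program generates it
print(description_length(noise))   # roughly the raw length: incompressible
```

The repetitive string collapses to a few dozen bytes while the random one barely shrinks at all, which is exactly the asymmetry described above — and also why the measure ranks randomness as maximally complex.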
Entropy, from information theory, measures uncertainty or disorder. High entropy means high randomness. But complexity isn’t randomness. A truly random system has no structure, no patterns, no interesting behavior. A crystal has very low entropy — highly ordered, highly predictable — but we wouldn’t call it complex either. Complexity seems to live in a middle regime: neither perfectly ordered nor perfectly random. Structured, but not in a simple way.
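The two extremes are easy to compute. A short sketch: the Shannon entropy of a perfectly ordered two-symbol "crystal" is exactly 1 bit per symbol, while uniform noise over eight symbols sits near the 3-bit maximum — and neither extreme is where complexity lives.

```python
import math
import random
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy, in bits per symbol, of the string's character distribution."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

crystal = "ab" * 500  # perfectly ordered: two symbols, fully predictable

random.seed(1)
noise = "".join(random.choice("abcdefgh") for _ in range(1000))  # pure disorder

print(shannon_entropy(crystal))  # exactly 1.0 bit
print(shannon_entropy(noise))    # near the 3.0-bit maximum for 8 symbols
```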
Murray Gell-Mann, the Nobel Prize-winning physicist who co-proposed the quark model, offered a more refined concept he called effective complexity: the length of a concise description of a system’s regularities (as opposed to its random features). This captures the “neither trivial nor random” intuition nicely. A crystal has low effective complexity because its regularities are simple to describe. A random string has low effective complexity because it has no regularities. A living cell has high effective complexity because it has an enormous number of intricate, non-trivial regularities. This feels closer to what we mean by “complex.”
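Effective complexity itself is not directly computable, but its "low at both extremes" shape can be illustrated with a simple statistical proxy: normalized entropy times disequilibrium (distance from the uniform distribution), in the spirit of the López-Ruiz–Mancini–Calbet measure. This is a toy stand-in, not Gell-Mann's measure:

```python
import math
import random
from collections import Counter

ALPHABET = "abcdefghijklmnop"  # 16 symbols

def statistical_complexity(s: str) -> float:
    """Toy proxy: normalized entropy H times disequilibrium D (squared
    distance from the uniform distribution). Zero for perfect order
    (H = 0), near zero for pure randomness (D ~ 0), and positive only
    in the structured middle regime."""
    counts = Counter(s)
    n, k = len(s), len(ALPHABET)
    probs = [counts.get(ch, 0) / n for ch in ALPHABET]
    h = -sum(p * math.log2(p) for p in probs if p > 0) / math.log2(k)
    d = sum((p - 1 / k) ** 2 for p in probs)
    return h * d

random.seed(2)
crystal = "a" * 4000                                           # perfect order
noise = "".join(random.choice(ALPHABET) for _ in range(4000))  # pure randomness
# A skewed, Zipf-like symbol distribution stands in for "structured" text:
weights = [1 / (i + 1) for i in range(16)]
structured = "".join(random.choices(ALPHABET, weights=weights, k=4000))

print(statistical_complexity(crystal))     # 0.0
print(statistical_complexity(noise))       # ~0
print(statistical_complexity(structured))  # clearly larger than both
```

The crystal scores zero, the noise scores nearly zero, and only the structured string scores high — the same ordering Gell-Mann's concept predicts.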
Thermodynamic depth, another proposal, measures something different again: the amount of work, history, or processing required to produce the system. A crystal can form in minutes. A living cell required billions of years of evolution. Thermodynamic depth connects complexity to process rather than structure — it tells us not just what the system looks like now, but how much it “cost” to produce.
A Profile, Not a Number
Each of these measures captures something real, but none captures everything. This suggests that complexity may not reduce to a single number at all. It may be better understood as a profile — a collection of properties that together characterize what makes a system genuinely complex, as opposed to merely complicated or merely random.
Four dimensions stand out as essential components of this profile.
Non-decomposability. A genuinely complex system cannot be fully understood by studying its parts in isolation. Emergent properties are present — properties that belong to the whole and vanish when we pull the system apart. This is the most fundamental marker, and it is what distinguishes complex from complicated.
Non-trivial structure. The system is neither perfectly ordered nor perfectly random. It occupies the middle regime — rich with patterns, but patterns that resist simple description. A genome is not a crystal (simple order) and not noise (no order). It is something in between: structured in deep, layered, context-dependent ways.
Computational depth. The system required a non-trivial amount of processing to produce. Living systems carry the imprint of billions of years of evolution. A brain carries the imprint of decades of learning and development. This dimension captures the idea that complexity is not just a snapshot but a history — the product of a long, rich process.
Correlation structure. The components of the system are not independent. What one part does is meaningfully connected to what other parts do. This connects directly to emergence: when correlations between components cross a threshold, new collective properties appear. The depth and richness of this correlation structure — how tightly integrated the parts are, over how many scales — is a powerful indicator of genuine complexity.
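The last dimension is easy to make concrete. A sketch, assuming the components can be modeled as time series: independent parts show near-zero correlation, while parts driven by a shared signal move together.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(3)
shared = [random.gauss(0, 1) for _ in range(2000)]  # common driving signal

# Independent components: each one's behavior says nothing about the others.
a = [random.gauss(0, 1) for _ in range(2000)]
b = [random.gauss(0, 1) for _ in range(2000)]

# Coupled components: each mixes the shared signal with private noise.
c = [s + 0.5 * random.gauss(0, 1) for s in shared]
d = [s + 0.5 * random.gauss(0, 1) for s in shared]

print(round(pearson(a, b), 2))  # near 0: no correlation structure
print(round(pearson(c, d), 2))  # strongly positive: the parts co-vary
```

The coupled pair is correlated at roughly 0.8 here because each component is mostly shared signal; tightening or loosening the coupling moves the system along exactly the integration axis described above.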
The Distinction That Matters
Genuinely complex systems, then, are those that have emergent properties, possess non-trivial correlation structure, resist simple compression, and required non-trivial processes to produce. Merely complicated systems have many parts but are decomposable and lack emergence. A Boeing 747 is enormously complicated. A rainforest is genuinely complex. The 747 can be understood from its blueprints. The rainforest cannot.
This distinction matters because the tools that work for complicated systems — reductionism, specification, top-down design — often fail catastrophically when applied to complex ones. We can design a jet engine from first principles. We cannot design an ecosystem, a culture, or a mind. Complex systems must be grown, cultivated, or allowed to emerge. Understanding why requires understanding what complexity actually is — not just a vague sense that “it’s really hard,” but a principled account of what makes these systems fundamentally different in kind.
The profile approach may not give us a single satisfying number, but it gives us something more useful: a checklist for recognizing genuine complexity when we encounter it, and a warning that the usual tools of analysis may not apply.
How This Was Decoded
Pattern recognition across information theory (Kolmogorov complexity, entropy), physics (Gell-Mann’s effective complexity, thermodynamic depth), and systems theory (emergence, correlation thresholds). The inference that no single scalar measure suffices came from observing that each proposed measure captures a different genuine aspect of complexity while missing others. The profile approach resolves this by treating complexity as multi-dimensional rather than forcing it into a single number. Cross-verified against emergence theory (correlation structure), information theory (compressibility), and evolutionary biology (computational depth as historical process).