AI Agents, Mathematics, and Making Sense of Chaos
Explore a new framework for predicting emergent phenomena in complex systems. Learn how concepts like causal, informational, and computational closure offer new insights into emergence, from ant colonies to financial markets, and find practical applications for your work and life.
Stay tuned at the end of this article for a practical guide on how to apply the ideas from this paper to your life and work.
Scientists and philosophers have long grappled with the concept of emergence in complex systems. Emergent phenomena are those fascinating instances where the whole becomes greater than the sum of its parts, giving rise to unexpected behaviors and properties.
We’re familiar with examples of emergence such as the collective intelligence of ant colonies or consciousness arising from billions of neurons. Yet even though we've been able to observe and describe emergent behaviors, predicting them has remained largely out of reach, leaving us more reactive than proactive when dealing with complexity.
Recent research has given us some new tools. By bridging concepts from information theory, computational mechanics, and complex systems science, researchers have developed a theoretical framework that, for the first time, provides a formal foundation for understanding and potentially predicting emergent phenomena.
The researchers introduce and interrelate three key concepts: causal, informational, and computational closure. Each describes a way in which a system's behavior at one level can be fully explained or predicted without needing to know all the details of what's happening at lower levels.
Let’s make these abstract concepts more tangible. Imagine a flock of birds. Causal closure would mean that we can predict the flock's future movement just by looking at its current shape and direction, without needing to know the exact position of each bird. Informational closure suggests that knowing the detailed movements of individual birds wouldn't give us any better predictions about the flock's behavior than we can already make from watching the flock as a whole. Computational closure implies that we can describe the rules governing the flock's behavior without referencing individual birds at all.
To analyze these concepts, the team used mathematical tools called ε-machines and υ-machines. Think of these as different ways of mapping the same territory, one from ground level and one from a bird's-eye view. These "maps" help us understand how processes work at different scales in a complex system.
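To make the ε-machine idea a bit more concrete, here is a toy sketch in Python. It is my own illustration, not the paper's construction: the greedy merging, the threshold, and the simulated process are all simplifying assumptions. It groups fixed-length histories of a symbolic sequence into "causal states" whenever they predict roughly the same distribution over the next symbol.

```python
from collections import defaultdict
import random

def causal_state_sketch(sequence, history_len=2, tol=0.05):
    """Rough, ε-machine-flavored grouping: fixed-length histories are
    merged when they predict (approximately) the same probability of
    seeing a '1' next. Each group plays the role of a causal state."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(history_len, len(sequence)):
        history = sequence[i - history_len:i]
        counts[history][sequence[i]] += 1

    # Estimated probability of a '1' following each observed history.
    p_one = {h: c['1'] / (c['0'] + c['1']) for h, c in counts.items()}

    # Greedily merge histories with similar predictive distributions.
    states = []
    for history, p in sorted(p_one.items()):
        for state in states:
            if abs(state['p'] - p) < tol:
                state['histories'].append(history)
                break
        else:
            states.append({'p': p, 'histories': [history]})
    return states

# Toy data: the "Golden Mean" process (a 1 is never followed by another 1),
# a standard example in computational mechanics.
random.seed(0)
symbols, last = [], '0'
for _ in range(100_000):
    last = '0' if last == '1' else random.choice('01')
    symbols.append(last)

for state in causal_state_sketch(''.join(symbols)):
    print(state['histories'], round(state['p'], 2))
# Histories ending in '0' collapse into one state (chance of a 1 next ~ 0.5);
# histories ending in '1' form another (chance of a 1 next = 0.0).
```

In this toy process, every history ending in 0 behaves identically for prediction purposes, so they collapse into a single state. That collapse is the ground-level-to-bird's-eye move the machines formalize.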
The researchers also demonstrated that emergent processes form a hierarchical structure. This is like realizing that in a city there are patterns of behavior at the level of individuals, families, neighborhoods, and the city as a whole. Each level builds on the last but is not entirely predictable from the one below.
By connecting their ideas to the concept of "lumpability" in Markov chains, the team built a bridge between their new theory and existing mathematical tools. Markov chains are mathematical systems that model step-by-step processes where the probability of each step depends only on the state attained in the previous step. "Lumpability" in this context refers to the ability to simplify these chains by grouping states together without losing their essential predictive properties. This connection is like discovering that a new, exotic alloy has properties that can be described using well-established principles of metallurgy. It allows insights from the new theory to be interpreted and applied within the framework of familiar, widely-used mathematical models, potentially accelerating the adoption and practical application of these novel concepts across various scientific disciplines.
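To ground the lumpability idea, here is a minimal sketch of the standard strong-lumpability check from Markov chain theory. The transition matrix and partition are hypothetical, and this is my illustration rather than code from the paper: a chain is lumpable with respect to a partition when every state in a block has the same total probability of moving into each block.

```python
import numpy as np

def is_lumpable(P, partition, tol=1e-9):
    """Strong lumpability test (Kemeny & Snell): within each block of the
    partition, every state must have the same total probability of jumping
    into each block. If so, the grouped (coarse-grained) process is still
    Markov and keeps the chain's essential predictive structure."""
    for block in partition:
        for target in partition:
            # Block-to-block transition probability, per individual state.
            mass = [P[s, target].sum() for s in block]
            if max(mass) - min(mass) > tol:
                return False
    return True

# Hypothetical 4-state chain whose states lump into blocks {0, 1} and {2, 3}.
P = np.array([
    [0.1, 0.3, 0.2, 0.4],
    [0.2, 0.2, 0.5, 0.1],
    [0.3, 0.3, 0.2, 0.2],
    [0.5, 0.1, 0.3, 0.1],
])
partition = [np.array([0, 1]), np.array([2, 3])]
print(is_lumpable(P, partition))  # True: the 2-block summary loses no predictive power
```

When the test passes, the two-block summary predicts its own future as well as the full four-state description does, which is exactly the kind of "grouping without losing essential predictive properties" described above.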
Perhaps one of their most intuitive contributions is the idea of viewing emergent processes as "software-like" phenomena running on the "hardware" of lower-level systems. This analogy helps us understand how similar patterns can emerge in seemingly different systems. For example, think of a traffic jam. The emergent behavior of traffic flow (the "software") can be similar whether the "hardware" is cars on a highway, people in a crowded coffee shop, or data packets in a computer network. In each case, we see patterns of congestion, flow, and gridlock emerging from the interactions of individual units, despite the underlying systems being vastly different at the microscopic level. This perspective helps explain why insights from one field, like traffic engineering, can sometimes be applied to seemingly unrelated areas, such as managing data flow in telecommunications networks.
Finally, the researchers demonstrated the broad applicability of their framework by applying it to various systems. They showed how these ideas can be used to understand phenomena as diverse as the behavior of ant colonies, the dynamics of financial markets, and the emergence of consciousness from neural activity.
What I find fascinating about this work is that the researchers, themselves from a wide range of fields, created a mathematical and conceptual toolkit for identifying, characterizing, and potentially predicting emergent phenomena in complex systems. It promises to ground the often nebulous concept of emergence in a more formal, quantitative theory that we can apply to many more situations.
A Practical Guide to Applying Emergence Theory to Complex Problems
Here's how you can apply these concepts to tackle complex problems: for any system you care about, look for the level of description at which its behavior becomes predictable on its own terms, then test whether drilling down to finer detail actually improves your forecasts.
By approaching problems this way, informed by the concepts of causal, informational, and computational closure, you can develop a better understanding of complex systems in your field. This can lead to more effective analysis, prediction, and management of emergent phenomena, whether you're dealing with organizational behavior, ecosystem dynamics, or technological systems.
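As one hedged, do-it-yourself starting point (my own sketch, not a method from the paper), the snippet below compares how well the next macro state is predicted from the current macro state alone versus from the exact micro state. A small gap is evidence that the macro level of your system is approximately informationally closed.

```python
from collections import Counter, defaultdict

def closure_gap(macro, micro):
    """Crude empirical check of informational closure. `macro` and `micro`
    are equal-length sequences of observations taken at the same time steps.
    Returns the largest discrepancy between P(next macro | current macro)
    and P(next macro | current micro)."""
    next_given_macro = defaultdict(Counter)
    next_given_micro = defaultdict(Counter)
    macro_of_micro = {}
    for t in range(len(macro) - 1):
        next_given_macro[macro[t]][macro[t + 1]] += 1
        next_given_micro[micro[t]][macro[t + 1]] += 1
        macro_of_micro[micro[t]] = macro[t]

    gap = 0.0
    for m, micro_counts in next_given_micro.items():
        macro_counts = next_given_macro[macro_of_micro[m]]
        micro_total = sum(micro_counts.values())
        macro_total = sum(macro_counts.values())
        for outcome in set(micro_counts) | set(macro_counts):
            p_micro = micro_counts[outcome] / micro_total
            p_macro = macro_counts[outcome] / macro_total
            gap = max(gap, abs(p_micro - p_macro))
    return gap

# Hypothetical usage: traffic density ("low"/"high") as the macro state and
# exact car counts as the micro state. (Toy data, far too short to be
# meaningful; it only shows the call.)
macro = ["low", "low", "high", "high", "low", "high", "low", "low"]
micro = [3, 4, 17, 19, 5, 18, 2, 4]
print(round(closure_gap(macro, micro), 2))
```

In practice you would want far more data and a proper statistical test, but the shape of the question stays the same: does finer detail actually buy you better predictions?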