Announcing the Artificiality Summit 2025! Don't miss the super early bird special at the end of the email...
Complex change recognizes that change is non-linear, emergent, and deeply interconnected with the system in which it occurs. This is even more important as we adapt to the complexity of generative AI.
Across the Artificiality community, organizations are grappling with a new dynamic characterized by complex change. Change management itself is changing as it adapts to generative AI. Rather than a linear, sequential process, with each step building on the previous one to drive successful change, change is becoming much more loopy as it adjusts to the emergent behavior of our new human-machine systems.
In this context, we need a new approach to change management—one that recognizes the inherent complexity of the systems we are part of and the role we play in shaping their evolution.
Treating organizations as if they were machines, with predictable parts and linear cause-and-effect relationships, has worked for a long time. But in reality, organizations are complex adaptive systems, with countless interrelated elements that are constantly evolving and influencing one another. By attempting to control change from the top down, leaders fail to account for the emergent properties of these systems, and risk creating unintended consequences that can derail plans.
Change in complex systems is non-linear, emergent, and deeply interconnected with the system in which it occurs, and this is even more so in the age of artificial intelligence and machine learning. These technologies are fundamentally altering the way we work and interact with one another. Predictive models that once seemed reliable become obsolete as the systems they are meant to describe evolve in unexpected ways.
Generative AI is creating entirely new categories of content and behavior, blurring the lines between human and machine agency. As these systems become more autonomous and interconnected, the potential for unintended consequences grows exponentially.
Most change programs contain an implicit assumption: that we can predict and control the future from an external vantage point. Complexity science tells us that this is not how the world works. Even if change advocates follow all the right steps, such as building awareness, fostering desire, educating people, and rewarding and reinforcing required skills and behavior, it still isn't enough, because human systems do not march towards a predetermined end.
Consider the shift towards more data-driven approaches. We excel at guiding organizations through this transformation and making data-driven decision-making a key part of their strategy. We're actually pretty awesome at it. Yet even when people have access to high-quality data, we still see groups struggle with the analytics process and with making timely, "good enough" decisions.
Why?
The answer is that when we try to predict the future from past data, we are essentially looking through a pane of glass, assuming the world will keep behaving as it has before, and we fall into the trap of believing we can accurately forecast it. The moment we act on our predictions, we're no longer just observers: we're part of the mix, and our actions change the game. It's like our moves end up bending the rules we based them on. By acting, we outdate our own predictions, especially as we're the ones sparking the change.
So, if we don't fully automate the system, turning everything into data, people will inevitably bring in more complexity, introducing new layers and elements that can't be easily quantified. This means we end up placing ourselves right back into the heart of decision-making, even when it's supposed to be data-driven.
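To see this reflexivity in miniature, here is a toy simulation, entirely illustrative and with invented numbers (not a model of any real organization): a naive forecaster projects the recent average, the organization acts on that projection, and the action itself perturbs the system, so the forecast error never settles.

```python
import random

# Toy model of reflexive prediction: acting on a forecast changes
# the system the forecast was built to describe. All numbers are
# invented for illustration.

demand = 100.0
history = [demand]

for week in range(12):
    # Naive forecast: assume next week looks like the recent average.
    recent = history[-4:]
    forecast = sum(recent) / len(recent)

    # Act on the forecast: the larger the expected demand, the harder
    # we intervene (marketing, hiring, automation). The intervention
    # perturbs the very system the forecast assumed was stable.
    intervention = 0.1 * forecast
    demand = forecast + intervention * random.uniform(-1.0, 1.5)
    history.append(demand)

    print(f"week {week + 1:2d}: forecast {forecast:6.1f}, "
          f"actual {demand:6.1f}, error {abs(demand - forecast):5.1f}")
```

The pattern in the output is the point: the forecast is never wrong because the data was bad, but because the act of responding to the forecast keeps rewriting the process that generates the data.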
Complexity science tells us there's another layer here too: the decision system. Decision-making is like traffic: the key to a consistent flow of decisions is that everyone is moving at about the same rate. If one group moves fast but relies on another part of the organization that doesn't, the whole decision-making system becomes congested.
This phenomenon is particularly noticeable within development teams. The ease of coding with the assistance of generative AI has sometimes shifted bottlenecks in the development process to unexpected areas. The rapid pace of one team's progress can lead to unanticipated constraints elsewhere.
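As a sketch of this congestion dynamic, consider a toy two-stage pipeline with invented rates (illustrative only, not drawn from any real team): if generative AI triples the rate at which code arrives but review capacity stays fixed, the backlog in front of review grows week after week. The bottleneck hasn't disappeared; it has moved.

```python
# Toy two-stage pipeline: "coding" feeds "review".
# Rates are invented for illustration: items of work per week.

def backlog_after(weeks: int, coding_rate: float, review_rate: float) -> float:
    """Work waiting for review after `weeks`, starting from an empty queue."""
    queue = 0.0
    for _ in range(weeks):
        queue += coding_rate              # new work arrives from coding
        queue -= min(queue, review_rate)  # review drains what it can
    return queue

# Before generative AI: coding and review roughly keep pace.
print(backlog_after(12, coding_rate=10, review_rate=10))  # 0.0

# After: coding triples, review capacity is unchanged. The backlog
# grows linearly; the constraint has shifted, not vanished.
print(backlog_after(12, coding_rate=30, review_rate=10))  # 240.0
```

Speeding up one stage in isolation only relocates the queue, which is why the whole decision system, not any single team's velocity, sets the pace of change.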
Overall, we see a new pattern emerging. Before generative AI, software imposed rules and order; it was the ultimate top-down controller. No matter how entrepreneurial salespeople want to be, a change program premised on Salesforce enforces certain rules and policies. Generative AI, with its variability and creative output, introduces more agency, and hence human judgment, into the system.
AI adoption is not just about data, automation, and task substitution; it's about how we blend human ingenuity with machine knowledge. People's expectations about how change should unfold are running straight into the reality of how complex the coupling of human intuition and machine intelligence really is. We can't escape it: this complexity is a core part of our current interaction with technology. Regardless of how advanced our algorithms become or the size of our data pools, there's a built-in limitation: machines can't fully grasp the relationship between the system and those of us making the decisions.
As AI becomes more capable of acting independently (agentic AI) and responding to human actions, the dynamic will shift again. When machines gain autonomy and a better understanding of human behavior, outcomes are shaped by both human decisions and machine actions. We are no longer simply interacting with machines: we are part of a complex, dynamic network where each action taken by an AI agent ripples through the system, influencing human behavior, and vice versa.
Complex change involves a new mindset. We are not outside the system, looking in. We are an integral part of it, and so are the data, algorithms, and outputs of AI. Change isn't planned or even designed; it emerges from the interactions between countless interconnected agents. If change emerges from complex interactions, then managing it effectively requires a deep grasp of the system's nature: its features, movements, connections, layers, dimensions, and critical junctures. With this knowledge, we then focus on adaptation: crafting motivators, feedback mechanisms, linkages, and path dependencies that guide the process.
Complex change draws on a wide range of disciplines, from complexity science and systems thinking to psychology and organizational behavior. In a complex system, no single agent has complete information or authority. Success depends on collective intelligence and collective change, where the collective includes machines.
Ironically, in the age of AI, with its promise of predictability, humans will be the ones who have to let go of certainty. Our ability to make decisions when the world is paradoxical, unpredictable, and ambiguous remains our unique comparative advantage. Leveraging this ability within a sophisticated network that encompasses both humans and machines has become the crucial factor in driving change.
The Artificiality Weekend Briefing: About AI, Not Written by AI