1. Novelty Is Usually a Disguise
Most technology ideas arrive in the world disguised as novelty.
They are announced as rupture. As invention. As the sudden appearance of something no one could have imagined until the market finally made it possible to say out loud.
That is rarely how the deepest shifts actually happen.
The more consequential pattern is quieter. A problem appears early, in primitive form. It is partially solved, then broken again under scale. It reappears in new technical clothing. It is misunderstood as tooling, then rediscovered as architecture, then rediscovered again as governance, then again as human limitation. Decades pass. The surface changes. The underlying failure remains.
I have spent much of my professional life in the shadow of one of those failures.
2. The Mismatch Beneath Modern Software
The failure is not a lack of software. It is not a lack of intelligence. It is not even, in the end, a lack of automation. It is the persistent mismatch between how serious human work actually unfolds and how most systems are designed to contain it.
Real work does not happen in tidy, isolated transactions. It stretches across time. It accumulates context unevenly. It leaves traces in conversations, half-finished notes, strategic intentions, unresolved decisions, documents, revisions, and memory. It is recursive. It is interrupt-driven. It is often ambiguous until quite late in the process. And yet most software continues to behave as though work arrives in cleanly bounded units, ready to be filed, routed, and processed inside static tools designed in advance.
That mismatch has never seemed small to me.
I wrote my first line of code in 1981. Since then I have lived across much of the terrain where complex systems meet real institutional pressure: engineering, architecture, platform design, large-scale organizational technology, data-intensive environments, executive operating contexts, and transformation work inside large enterprises. Across all of it, one lesson kept resurfacing: the hardest problems were almost never purely technical. They were problems of fit. Problems of structure. Problems of continuity. Problems of what happens when human intent meets systems that are too rigid to adapt and too loosely governed to trust.
3. Notes, Freedom, and Entropy
One of my earliest encounters with that tension came in 1988, when I met Ray Ozzie and became one of the early enterprise adopters of Lotus Notes. At ARCO International Oil and Gas, we were among the first organizations to roll it out globally. Notes felt like a revelation. It reduced friction. It empowered users. It made it possible for people closer to the work to shape what the work required.
That mattered.
It also revealed something else. The same flexibility that made the system powerful made it fragile when left unconstrained. Capability without discipline has a way of becoming sprawl. Freedom without structure has a way of becoming entropy. The lesson was not that empowerment was wrong. It was that empowerment, by itself, was incomplete.
That theme would return again and again.
4. Complexity at Scale
In 2000, I had the opportunity to create Microsoft’s global account sales business. We managed the company’s relationships with its most complex global customers. The details matter less than the exposure it created. We operated at the edge of complexity: multinational deployments, heterogeneous environments, institutional inertia, competing priorities, enormous technical surfaces, and very little tolerance for simplistic thinking. Work at that scale teaches you to respect the difference between systems that look coherent in a presentation and systems that remain coherent under stress.
In 2003, I became a passionate evangelist for Microsoft’s Dynamic Systems Initiative and its related System Definition Model. I spoke at conferences, briefed customers, and became a true believer in the power of model-based systems engineering. That period sharpened something I had only partially understood before. Complexity does not yield to aspiration. It yields, if at all, to disciplined representations, explicit models, and clear relationships between intention and execution. If you want systems to adapt without collapsing, governance cannot be an afterthought. It has to be structural.
5. The First Glimpse of a Different Future
A year later, I took my team to visit one of CODELCO’s copper mines in Chile to observe a serious effort to remotely automate a complex industrial process. Soon after, I was invited to present at the company’s annual technology and innovation summit. What stayed with me from that period was not simply the industrial context, but the line of sight it opened. It became easier to imagine environments in which leadership intent, operational conditions, and adaptive execution could be linked through something more disciplined than ad hoc control and more responsive than static planning.
I did not yet know what that would eventually require. But I could feel the shape of the missing piece.
6. Modularity, Governance, and the Enterprise
In 2011, I joined Warner Music Group as EVP and CTO with a mandate to help replatform the company. We built a large cloud-native engineering organization and implemented a data-enabled services architecture capable of operating at enormous scale. That period deepened another conviction: modularity, data flow, execution, and governance cannot be treated as separate conversations for very long. If they are, the system fragments, and the fragmentation eventually shows up in speed, trust, and institutional adaptability.
That thinking later found public expression in a paper I wrote in 2013 called The Composable Enterprise. The term traveled much farther than I expected. It was taken up by major firms, used in product positioning, and absorbed into the vocabulary of enterprise technology. But what mattered to me was not the spread of the phrase. It was the recognition that a large number of organizations were now confronting the same underlying reality: software had to become more modular, more adaptive, and more governable if it was going to remain useful in a world defined by constant change.
And still, even then, something was missing.
7. The Executive Reality
In 2016, I moved more deeply into consulting and strategy work, advising CEOs, C-suites, and boards across a range of transformation contexts. That vantage point stripped away one of the most persistent illusions in technology: the belief that better information naturally leads to better action. It does not. Between information and action lies continuity. Between insight and execution lies follow-through. Between a decision and its consequences lies an entire unstable terrain of memory, interpretation, interruption, handoff, revision, and drift.
The people doing the most consequential work were often the people living with the most fragmentation.
That was not a tooling inconvenience. It was an operating condition.
8. When the Pieces Began to Align
Then came the recent wave of AI.
Like many people with a long history in systems, I approached it with a mix of fascination and caution. The fascination was obvious. The caution came from pattern recognition. New capabilities often arrive draped in exaggerated claims. Interfaces improve. Demos become smoother. But the underlying architecture of work remains strangely untouched.
For a while, much of what I saw felt like exactly that: acceleration without resolution.
Then, eventually, something shifted.
Not because the hype became more persuasive, but because the pieces began to align in a way that made a deeper answer imaginable. Longstanding tensions that had lived separately in my mind for decades no longer felt unrelated. The old tradeoffs between flexibility and control, adaptation and governance, user proximity and architectural discipline, no longer looked permanent. They looked contingent. Solvable, perhaps not perfectly, but structurally.
That realization did not arrive as a product idea. It arrived as a release of pressure.
9. From Fragments to Synthesis
For most of my career, I had seen fragments of the same unresolved pattern from different angles. Systems that empowered but could not govern. Systems that governed but could not adapt. Systems that scaled but could not stay close to real work. Systems that increased local productivity while deepening systemic incoherence. The names changed. The tools changed. The conditions did not.
What has changed now is not the problem’s existence. It is the possibility of approaching it differently.
The pieces have been accumulating for years:
the democratization of software creation,
the hard lessons of governance failure,
the discipline of model-driven architecture,
the need for modular systems,
the lived reality of cognitive fragmentation in high-stakes work,
the arrival of new machine capabilities that are finally flexible enough to be architecturally interesting.
For a long time, they remained pieces.
Now, for the first time, they feel close to synthesis.
10. The Threshold
I have spent much of my life watching software force human beings to adapt to systems that could not adapt back. I have watched flexibility decay into chaos and control decay into rigidity. I have watched high-value work dissolve into fragments because the surrounding systems were too static, too shallow, or too structurally indifferent to the nature of the work itself.
I do not think that condition is permanent.
For the first time, I believe a different class of system is possible — one that does not merely accelerate tasks, but begins to address the deeper mismatch between serious work and the software that contains it.
That is the threshold I have been moving toward for forty years.
I think we are finally close enough to see it clearly.
