Last week I attended the Almaden Institute, a meeting held annually at IBM's Almaden Research Center in Silicon Valley. The theme of this year's Institute was Navigating Complexity: Doing More with Less, and the program featured a number of talks and panels on a variety of themes relating to complex systems.
I have long been fascinated by complex systems, perhaps because the field constitutes the most natural bridge between physics - the discipline I got my advanced degrees in at the University of Chicago in the 1960s - and computer science - the discipline I switched to when I joined IBM Research in 1970 and have been working in ever since. In the last few years, as advances in IT, and the Internet in particular, enable us to address increasingly complex problems in fields as diverse as medicine and global supply chains, the study of complex systems is drawing a lot more attention. I'd like to comment on a couple of the talks given at the Institute.
Georgia Tech professor William Rouse, who is Executive Director of the Tennenbaum Institute, presented the final report of a study he led on Complex Engineered, Organizational and Natural Systems, sponsored by NSF's Engineering Directorate. The study included a workshop, which took place in September 2006 and in which I participated, followed by a number of reviews with groups from academia and industry across the US. The objectives of the study were "to understand the underlying issues that cause us to perceive a system to be complex, and formulate a set of fundamental research questions whose pursuit would be appropriate for NSF to sponsor, given its charter to support fundamental research."
In order to help formulate a research agenda on complex systems, the workshop first focused on three different kinds of important problems. The aim was not to attempt to design solutions to these tough problems during a two-day meeting, but rather to prompt discussions on the characteristics of complexity exhibited by each of them and the research needed to address them.
The specific problems were: Infrastructure and Transportation, which looked at proactive monitoring, discovery and prevention of catastrophic behavior in critical infrastructure systems; Health Care Delivery, which explored how best to transform healthcare to value-based competition; and Bacteria Level Design, which examined the feasibility of modifying the control mechanisms of bacteria so that they can be used to deliver a drug to a specific area in an organism.
Let me briefly summarize some of the key findings. While complex natural systems - e.g., biological, ecological and weather systems - continue to be of great interest, there was broad agreement that the study of human and social behaviors in systems of all sorts is increasingly important to our understanding of the nature of complexity. There was also general agreement that the most interesting characteristics of complex systems are those that are unpredictable and unintended, along with the adaptive behaviors systems exhibit in response to such unplanned changes. There is a whole set of important properties - robustness, resiliency, flexibility, agility, adaptability and the ability of the system to evolve - that require serious research.
The notion of architectures was pervasive throughout the discussions. Architectures are needed to provide a common foundation for understanding, designing, analyzing and operating complex systems. This is not surprising, as common foundations and constructs, such as frameworks, representations and models, have long been used by system architects and engineers in a variety of disciplines. The overall recommendations for a complex systems research agenda included a set of overarching research questions around architectures: What architectures underlie the physical, behavioral and social phenomena of interest? How can one choose the proper architectures to achieve desired systems characteristics? How can architectures enable resilient, adaptive, agile and evolvable systems? Can architectures be analytically and empirically evaluated before and after development, deployment and systems operation? Are there fundamental limits of information, knowledge, model formulation, observability, controllability, scalability and so on?
Caltech professor John Doyle, whose areas of study include complexity in engineering and biology, gave a very interesting talk on "The Architecture of Complexity". He had also participated in the September NSF workshop.
Professor Doyle's talk covered a wide range of subjects, but one in particular caught my attention. He introduced the concept of Robust yet Fragile (RYF) as a kind of unifying theory of complex systems. "A crucial insight is that both evolution and natural selection or engineering design must produce high robustness to uncertain environments and components in order for systems to persist. Yet this allows and even facilitates severe fragility to novel perturbations, particularly those that exploit the very mechanisms providing robustness, and this Robust yet Fragile (RYF) feature must be exploited explicitly in any theory that hopes to scale to large systems."
In other words, robustness can be viewed as the attempt by humans or nature to organize and bring order to complexity. Yet the very same mechanisms that provide robustness in a system will allow for, or even facilitate, the emergence of new fragilities. As the system then attempts to deal with these fragilities, it can in principle enter a robustness-fragility spiral that challenges its own survival.
Professor Doyle gave examples of how RYF complexity is all around us in biological, societal and technological systems. For example, diabetes, obesity, cancer and auto-immune diseases are side-effects of control mechanisms that have evolved to make biological systems robust and adaptive.
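A toy illustration may help make the RYF idea concrete. This sketch is my own, not from Professor Doyle's talk: consider a hub-and-spoke network, where routing everything through a central hub makes connectivity robust to the random failure of any individual spoke, yet the hub itself - the very mechanism providing the robustness - becomes a single point of catastrophic fragility.

```python
def hub_and_spoke(n):
    """Build a star network: node 0 is the hub, nodes 1..n are spokes."""
    adj = {0: set(range(1, n + 1))}
    for i in range(1, n + 1):
        adj[i] = {0}
    return adj

def reachable_fraction(adj, failed):
    """Fraction of surviving node pairs that can still reach each other."""
    nodes = [n for n in adj if n not in failed]
    if len(nodes) < 2:
        return 0.0
    seen, comp_sizes = set(), []
    for start in nodes:                      # find connected components
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(v for v in adj[u] if v not in failed)
        seen |= comp
        comp_sizes.append(len(comp))
    total_pairs = len(nodes) * (len(nodes) - 1) / 2
    connected_pairs = sum(c * (c - 1) / 2 for c in comp_sizes)
    return connected_pairs / total_pairs

net = hub_and_spoke(20)
# Robust: losing any one spoke leaves everyone else fully connected.
print(reachable_fraction(net, failed={5}))   # → 1.0
# Fragile: losing the hub disconnects the entire system at once.
print(reachable_fraction(net, failed={0}))   # → 0.0
```

The same structural bet that absorbs routine component failures concentrates risk in the organizing mechanism itself, which is the RYF pattern in miniature.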
In his own words: "RYF complexity is not confined to biology. The complexity of technology is exploding around us, but in ways that remain largely hidden. Modern institutions and technologies facilitate robustness and accelerate evolution, but enable catastrophes on a scale unimaginable without them (from network and market crashes to war, epidemics, and climate change). Understanding RYF means understanding architecture - the most universal, high-level, persistent elements of organization - and protocols. Protocols define how diverse modules interact, and architecture defines how sets of protocols are organized."
He believes that the inherent tradeoffs between robustness and fragility represent one of the most fundamental challenges in organizing complexity: we want the increasingly complex systems we design to be as robust as possible, able to keep going, adapt and evolve when subjected to unanticipated events - including those they themselves have helped precipitate.
As everything around us becomes increasingly complex - from our global economies, industries and enterprises, to our technological, scientific and medical achievements - it is critical that we better understand the properties of the complex systems that are part of our lives. Only then can we hope to cope with - if not manage - the fragilities that will naturally accompany complexity.