Complexity.
A Guided Tour.
MELANIE MITCHELL.
PREFACE.
REDUCTIONISM is the most natural thing in the world to grasp. It's simply the belief that "a whole can be understood completely if you understand its parts, and the nature of their 'sum.' " No one in her left brain could reject reductionism.
-Douglas Hofstadter, Gödel, Escher, Bach: an Eternal Golden Braid.
REDUCTIONISM HAS BEEN THE DOMINANT approach to science since the 1600s. René Descartes, one of reductionism's earliest proponents, described his own scientific method thus: "to divide all the difficulties under examination into as many parts as possible, and as many as were required to solve them in the best way" and "to conduct my thoughts in a given order, beginning with the simplest and most easily understood objects, and gradually ascending, as it were step by step, to the knowledge of the most complex."1 From the time of Descartes, Newton, and other founders of the modern scientific method until the beginning of the twentieth century, a chief goal of science was a reductionist explanation of all phenomena in terms of fundamental physics. Many late nineteenth-century scientists agreed with the well-known words of physicist Albert Michelson, who proclaimed in 1894 that "it seems probable that most of the grand underlying principles have been firmly established and that further advances are to be sought chiefly in the rigorous application of these principles to all phenomena which come under our notice."
Of course within the next thirty years, physics would be revolutionized by the discoveries of relativity and quantum mechanics. But twentieth-century science was also marked by the demise of the reductionist dream. In spite of its great successes explaining the very large and very small, fundamental physics, and more generally, scientific reductionism, have been notably mute in explaining the complex phenomena closest to our human-scale concerns.
Many phenomena have stymied the reductionist program: the seemingly irreducible unpredictability of weather and climate; the intricacies and adaptive nature of living organisms and the diseases that threaten them; the economic, political, and cultural behavior of societies; the growth and effects of modern technology and communications networks; and the nature of intelligence and the prospect for creating it in computers. The antireductionist catch-phrase, "the whole is more than the sum of its parts," takes on increasing significance as new sciences such as chaos, systems biology, evolutionary economics, and network theory move beyond reductionism to explain how complex behavior can arise from large collections of simpler components.
By the mid-twentieth century, many scientists realized that such phenomena cannot be pigeonholed into any single discipline but require an interdisciplinary understanding based on scientific foundations that have not yet been invented. Several attempts at building those foundations include (among others) the fields of cybernetics, synergetics, systems science, and, more recently, the science of complex systems.
In 1984, a diverse interdisciplinary group of twenty-four prominent scientists and mathematicians met in the high desert of Santa Fe, New Mexico, to discuss these "emerging syntheses in science." Their goal was to plot out the founding of a new research institute that would "pursue research on a large number of highly complex and interactive systems which can be properly studied only in an interdisciplinary environment" and "promote a unity of knowledge and a recognition of shared responsibility that will stand in sharp contrast to the present growing polarization of intellectual cultures." Thus the Santa Fe Institute was created as a center for the study of complex systems.
In 1984 I had not yet heard the term complex systems, though these kinds of ideas were already in my head. I was a first-year graduate student in Computer Science at the University of Michigan, where I had come to study artificial intelligence; that is, how to make computers think like people. One of my motivations was, in fact, to understand how people think-how abstract reasoning, emotions, creativity, and even consciousness emerge from trillions of tiny brain cells and their electrical and chemical communications. Having been deeply enamored of physics and reductionist goals, I was going through my own antireductionist epiphany, realizing that not only did current-day physics have little, if anything, to say on the subject of intelligence but that even neuroscience, which actually focused on those brain cells, had very little understanding of how thinking arises from brain activity. It was becoming clear that the reductionist approach to cognition was misguided-we just couldn't understand it at the level of individual neurons, synapses, and the like.
Therefore, although I didn't yet know what to call it, the program of complex systems resonated strongly with me. I also felt that my own field of study, computer science, had something unique to offer. Influenced by the early pioneers of computation, I felt that computation as an idea goes much deeper than operating systems, programming languages, databases, and the like; the deep ideas of computation are intimately related to the deep ideas of life and intelligence. At Michigan I was lucky enough to be in a department in which "computation in natural systems" was as much a part of the core curriculum as software engineering or compiler design.
In 1989, at the beginning of my last year of graduate school, my Ph.D. advisor, Douglas Hofstadter, was invited to a conference in Los Alamos, New Mexico, on the subject of "emergent computation." He was too busy to attend, so he sent me instead. I was both thrilled and terrified to present work at such a high-profile meeting. It was at that meeting that I first encountered a large group of people obsessed with the same ideas that I had been pondering. I found that they not only had a name for this collection of ideas-complex systems-but that their institute in nearby Santa Fe was exactly the place I wanted to be. I was determined to find a way to get a job there.
Persistence, and being in the right place at the right time, eventually won me an invitation to visit the Santa Fe Institute for an entire summer. The summer stretched into a year, and that stretched into additional years. I eventually became one of the institute's resident faculty. People from many different countries and academic disciplines were there, all exploring different sides of the same question: How do we move beyond the traditional paradigm of reductionism toward a new understanding of seemingly irreducibly complex systems?
The idea for this book came about when I was invited to give the Ulam Memorial Lectures in Santa Fe-an annual set of lectures on complex systems for a general audience, given in honor of the great mathematician Stanislaw Ulam. The title of my lecture series was "The Past and Future of the Sciences of Complexity." It was very challenging to figure out how to introduce the audience of nonspecialists to the vast territory of complexity, to give them a feel for what is already known and for the daunting amount that remains to be learned. My role was like that of a tour guide in a large, culturally rich foreign country. Our schedule permitted only a short time to hear about the historical background, to visit some important sites, and to get a feel for the landscape and culture of the place, with translations provided from the native language when necessary.
This book is meant to be a much expanded version of those lectures-indeed, a written version of such a tour. It is about the questions that fascinate me and others in the complex systems community, past and present: How is it that those systems in nature we call complex and adaptive-brains, insect colonies, the immune system, cells, the global economy, biological evolution-produce such complex and adaptive behavior from underlying, simple rules? How can interdependent yet self-interested organisms come together to cooperate on solving problems that affect their survival as a whole? And are there any general principles or laws that apply to such phenomena? Can life, intelligence, and adaptation be seen as mechanistic and computational? If so, could we build truly intelligent and living machines? And if we could, would we want to?
I have learned that as the lines between disciplines begin to blur, the content of scientific discourse also gets fuzzier. People in the field of complex systems talk about many vague and imprecise notions such as spontaneous order, self-organization, and emergence (as well as "complexity" itself). A central purpose of this book is to provide a clearer picture of what these people are talking about and to ask whether such interdisciplinary notions and methods are likely to lead to useful science and to new ideas for addressing the most difficult problems faced by humans, such as the spread of disease, the unequal distribution of the world's natural and economic resources, the proliferation of weapons and conflicts, and the effects of our society on the environment and climate.
The chapters that follow give a guided tour, flavored with my own perspectives, of some of the core ideas of the sciences of complexity-where they came from and where they are going. As in any nascent, expanding, and vital area of science, people's opinions will differ (to put it mildly) about what the core ideas are, what their significance is, and what they will lead to. Thus my perspective may differ from that of my colleagues. An important part of this book will be spelling out some of those differences, and I'll do my best to provide glimpses of areas in which we are all in the dark or just beginning to see some light. These are the things that make science of this kind so stimulating, fun, and worthwhile both to practice and to read about. Above all else, I hope to communicate the deep enchantment of the ideas and debates and the incomparable excitement of pursuing them.
This book has five parts. In part I I give some background on the history and content of four subject areas that are fundamental to the study of complex systems: information, computation, dynamics and chaos, and evolution. In parts II–IV I describe how these four areas are being woven together in the science of complexity. I describe how life and evolution can be mimicked in computers, and conversely how the notion of computation itself is being imported to explain the behavior of natural systems. I explore the new science of networks and how it is discovering deep commonalities among systems as disparate as social communities, the Internet, epidemics, and metabolic systems in organisms. I describe several examples of how complexity can be measured in nature, how it is changing our view of living systems, and how this new view might inform the design of intelligent machines. I look at prospects of computer modeling of complex systems, as well as the perils of such models. Finally, in the last part I take on the larger question of the search for general principles in the sciences of complexity.
No background in math or science is needed to grasp what follows, though I will guide you gently and carefully through explorations in both. I hope to offer value to scientists and nonscientists alike. Although the discussion is not technical, I have tried in all cases to make it substantial. The notes give references to quotations, additional information on the discussion, and pointers to the scientific literature for those who want even more in-depth reading.
Have you been curious about the sciences of complexity? Would you like to come on such a guided tour? Let's begin.
ACKNOWLEDGMENTS.
I AM GRATEFUL TO THE SANTA FE INSTITUTE (SFI) for inviting me to direct the Complex Systems Summer School and to give the Ulam Memorial Lectures, both of which spurred me to write this book. I am also grateful to SFI for providing me with a most stimulating and productive scientific home for many years. The various scientists who are part of the SFI family have been inspiring and generous in sharing their ideas, and I thank them all, too numerous to list here. I also thank the SFI staff for the ever-friendly and essential support they have given me during my association with the institute.
Many thanks to the following people for answering questions, commenting on parts of the manuscript, and helping me think more clearly about the issues in this book: Bob Axelrod, Liz Bradley, Jim Brown, Jim Crutchfield, Doyne Farmer, Stephanie Forrest, Bob French, Douglas Hofstadter, John Holland, Greg Huber, Ralf Juengling, Garrett Kenyon, Tom Kepler, David Krakauer, Will Landecker, Manuel Marques-Pita, Dan McShea, John Miller, Jack Mitchell, Norma Mitchell, Cris Moore, David Moser, Mark Newman, Norman Packard, Lee Segel, Cosma Shalizi, Eric Smith, Kendall Springer, J. Clint Sprott, Mick Thomure, Andreas Wagner, and Chris Wood. Of course any errors in this book are my own responsibility.
Thanks are also due to Kirk Jensen and Peter Prescott, my editors at Oxford, for their constant encouragement and superhuman patience, and to Keith Faivre and Tisse Takagi at Oxford, for all their help. I am also grateful to Google Scholar, Google Books, Amazon.com, and the often maligned but tremendously useful Wikipedia.org for making scholarly research so much easier.
This book is dedicated to Douglas Hofstadter and John Holland, who have done so much to inspire and encourage me in my work and life. I am very lucky to have had the benefit of their guidance and friendship.
Finally, much gratitude to my family: my parents, Jack and Norma Mitchell, my brother, Jonathan Mitchell, and my husband, Kendall Springer, for all their love and support. And I am grateful for Jacob and Nicholas Springer; although their births delayed the writing of this book, they have brought extraordinary joy and delightful complexity into our lives.
PART I.
Background and History.
Science has explored the microcosmos and the macrocosmos; we have a good sense of the lay of the land. The great unexplored frontier is complexity.
-Heinz Pagels, The Dreams of Reason.
CHAPTER 1.
What Is Complexity?
Ideas thus made up of several simple ones put together, I call Complex; such as are Beauty, Gratitude, a Man, an Army, the Universe.
-John Locke, An Essay Concerning Human Understanding.
Brazil: The Amazon rain forest. Half a million army ants are on the march. No one is in charge of this army; it has no commander. Each individual ant is nearly blind and minimally intelligent, but the marching ants together create a coherent fan-shaped mass of movement that swarms over, kills, and efficiently devours all prey in its path. What cannot be devoured right away is carried with the swarm. After a day of raiding and destroying the edible life over a dense forest the size of a football field, the ants build their nighttime shelter-a chain-mail ball a yard across made up of the workers' linked bodies, sheltering the young larvae and mother queen at the center. When dawn arrives, the living ball melts away ant by ant as the colony members once again take their places for the day's march.
Nigel Franks, a biologist specializing in ant behavior, has written, "The solitary army ant is behaviorally one of the least sophisticated animals imaginable," and, "If 100 army ants are placed on a flat surface, they will walk around and around in never decreasing circles until they die of exhaustion." Yet put half a million of them together, and the group as a whole becomes what some have called a "superorganism" with "collective intelligence."
How does this come about? Although many things are known about ant colony behavior, scientists still do not fully understand all the mechanisms underlying a colony's collective intelligence. As Franks comments further, "I have studied E. burchelli [a common species of army ant] for many years, and for me the mysteries of its social organization still multiply faster than the rate at which its social structure can be explored."
The mysteries of army ants are a microcosm for the mysteries of many natural and social systems that we think of as "complex." No one knows exactly how any community of social organisms-ants, termites, humans-comes together to collectively build the elaborate structures that increase the survival probability of the community as a whole. Similarly mysterious is how the intricate machinery of the immune system fights disease; how a group of cells organizes itself to be an eye or a brain; how independent members of an economy, each working chiefly for its own gain, produce complex but structured global markets; or, most mysteriously, how the phenomena we call "intelligence" and "consciousness" emerge from nonintelligent, nonconscious material substrates.
Such questions are the topics of complex systems, an interdisciplinary field of research that seeks to explain how large numbers of relatively simple entities organize themselves, without the benefit of any central controller, into a collective whole that creates patterns, uses information, and, in some cases, evolves and learns. The word complex comes from the Latin root plectere: to weave, entwine. In complex systems, many simple parts are irreducibly entwined, and the field of complexity is itself an entwining of many different fields.
Complex systems researchers assert that different complex systems in nature, such as insect colonies, immune systems, brains, and economies, have much in common. Let's look more closely.
Insect Colonies.
Colonies of social insects provide some of the richest and most mysterious examples of complex systems in nature. An ant colony, for instance, can consist of hundreds to millions of individual ants, each one a rather simple creature that obeys its genetic imperatives to seek out food, respond in simple ways to the chemical signals of other ants in its colony, fight intruders, and so forth. However, as any casual observer of the outdoors can attest, the ants in a colony, each performing its own relatively simple actions, work together to build astoundingly complex structures that are clearly of great importance for the survival of the colony as a whole. Consider, for example, their use of soil, leaves, and twigs to construct huge nests of great strength and stability, with large networks of underground passages and dry, warm, brooding chambers whose temperatures are carefully controlled by decaying nest materials and the ants' own bodies. Consider also the long bridges certain species of ants build with their own bodies to allow emigration from one nest site to another via tree branches separated by great distances (to an ant, that is) (figure 1.1). Although much is now understood about ants and their social structures, scientists still can fully explain neither their individual nor group behavior: exactly how the individual actions of the ants produce large, complex structures, how the ants signal one another, and how the colony as a whole adapts to changing circumstances (e.g., changing weather or attacks on the colony). And how did biological evolution produce creatures with such an enormous contrast between their individual simplicity and their collective sophistication?
The Brain.
The cognitive scientist Douglas Hofstadter, in his book Gödel, Escher, Bach, makes an extended analogy between ant colonies and brains, both being complex systems in which relatively simple components with only limited communication among themselves collectively give rise to complicated and sophisticated system-wide ("global") behavior. In the brain, the simple components are cells called neurons. The brain is made up of many different types of cells in addition to neurons, but most brain scientists believe that the actions of neurons and the patterns of connections among groups of neurons are what cause perception, thought, feelings, consciousness, and the other important large-scale brain activities.
FIGURE 1.1. Ants build a bridge with their bodies to allow the colony to take the shortest path across a gap. (Photograph courtesy of Carl Rettenmeyer.)
Neurons are pictured in figure 1.2 (top). A neuron consists of three main parts: the cell body (soma), the branches that transmit the cell's input from other neurons (dendrites), and the single trunk transmitting the cell's output to other neurons (axon). Very roughly, a neuron can be either in an active state (firing) or an inactive state (not firing). A neuron fires when it receives enough signals from other neurons through its dendrites. Firing consists of sending an electric pulse through the axon, which is then converted into a chemical signal via chemicals called neurotransmitters. This chemical signal in turn activates other neurons through their dendrites. The firing frequency and the resulting chemical output signals of a neuron can vary over time according to both its input and how much it has been firing recently.
These actions recall those of ants in a colony: individuals (neurons or ants) perceive signals from other individuals, and a sufficient summed strength of these signals causes the individuals to act in certain ways that produce additional signals. The overall effects can be very complex. We saw that an explanation of ants and their social structures is still incomplete; similarly, scientists don't yet understand how the actions of individual neurons or dense networks of neurons give rise to the large-scale behavior of the brain (figure 1.2, bottom). They don't understand what the neuronal signals mean, how large numbers of neurons work together to produce global cognitive behavior, or how exactly they cause the brain to think thoughts and learn new things. And again, perhaps most puzzling is how such an elaborate signaling system with such powerful collective abilities ever arose through evolution.
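To make the threshold idea concrete, here is a minimal sketch in Python of a toy neuron-like unit that sums its weighted inputs and fires only when the total crosses a threshold. The particular inputs, weights, and threshold are made-up illustrative values chosen for this sketch; they are not measurements of real neurons or a model taken from this book.

# A toy "neuron": it fires only if the summed, weighted input from its
# neighbors reaches a threshold. All numbers are illustrative assumptions.
def fires(inputs, weights, threshold):
    total = sum(signal * weight for signal, weight in zip(inputs, weights))
    return total >= threshold

inputs = [1, 0, 1]           # 1 = the upstream neuron is firing, 0 = silent
weights = [0.6, 0.9, 0.5]    # how strongly each incoming connection counts
print(fires(inputs, weights, threshold=1.0))   # True: 0.6 + 0.5 = 1.1 >= 1.0

Real neurons are far richer than this caricature-their firing rates vary continuously with input and with recent activity, as noted above-but even this stripped-down version captures the "enough summed signals cause action" rule on which the ant analogy turns.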
The Immune System.
The immune system is another example of a system in which relatively simple components collectively give rise to very complex behavior involving signaling and control, and in which adaptation occurs over time. A photograph illustrating the immune system's complexity is given in figure 1.3.
FIGURE 1.2. Top: microscopic view of neurons, visible via staining. Bottom: a human brain. How does the behavior at one level give rise to that of the next level? (Neuron photograph from brainmaps.org [http://brainmaps.org/smi32-pic.jpg], licensed under Creative Commons [http://creativecommons.org/licenses/by/3.0/]. Brain photograph courtesy of Christian R. Linder.)
FIGURE 1.3. Immune system cells attacking a cancer cell. (Photograph by Susan Arnold, from National Cancer Institute Visuals Online [http://visualsonline.cancer.gov/details.cfm?imageid=2370].)
The immune system, like the brain, differs in sophistication in different animals, but the overall principles are the same across many species. The immune system consists of many different types of cells distributed over the entire body (in blood, bone marrow, lymph nodes, and other organs). This collection of cells works together in an effective and efficient way without any central control.
The star players of the immune system are white blood cells, otherwise known as lymphocytes. Each lymphocyte can recognize, via receptors on its cell body, molecules corresponding to certain possible invaders (e.g., bacteria). Some one trillion of these patrolling sentries circulate in the blood at a given time, each ready to sound the alarm if it is activated-that is, if its particular receptors encounter, by chance, a matching invader. When a lymphocyte is activated, it secretes large numbers of molecules-antibodies-that can identify similar invaders. These antibodies go out on a seek-and-destroy mission throughout the body. An activated lymphocyte also divides at an increased rate, creating daughter lymphocytes that will help hunt out invaders and secrete antibodies against them. It also creates daughter lymphocytes that will hang around and remember the particular invader that was seen, thus giving the body immunity to pathogens that have been previously encountered.
One class of lymphocytes, the B cells (the B indicates that they develop in the bone marrow), has a remarkable property: the better the match between a B cell and an invader, the more antibody-secreting daughter cells the B cell creates. The daughter cells each differ slightly from the mother cell in random ways via mutations, and these daughter cells go on to create their own daughter cells in direct proportion to how well they match the invader. The result is a kind of Darwinian natural selection process, in which the match between B cells and invaders gradually gets better and better, until the antibodies being produced are extremely efficient at seeking and destroying the culprit microorganisms.
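The mutate-and-select loop just described can be caricatured in a short Python sketch. This is a deliberately toy model invented for illustration-receptors and invaders as 20-bit strings, "match" as the number of agreeing bits, and arbitrary population, generation, and mutation numbers-not the immune system's actual chemistry, but it shows how reproduction in proportion to match quality steadily sharpens the population.

import random

# Toy model: a receptor or an invader is a string of 20 bits; the "match"
# is simply how many positions agree. All parameters are illustrative.
random.seed(0)
LENGTH = 20
invader = [random.randint(0, 1) for _ in range(LENGTH)]

def match(receptor):
    return sum(r == v for r, v in zip(receptor, invader))

def mutate(receptor, rate=0.05):
    # Each daughter differs slightly and randomly from its mother cell.
    return [1 - bit if random.random() < rate else bit for bit in receptor]

# Start with 100 B cells carrying random receptors.
cells = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(100)]

for generation in range(30):
    # Better-matching cells leave proportionally more (mutated) daughters.
    fitness = [match(cell) + 1 for cell in cells]
    parents = random.choices(cells, weights=fitness, k=len(cells))
    cells = [mutate(parent) for parent in parents]

print(max(match(cell) for cell in cells))   # typically at or near 20: a near-perfect match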
Many other types of cells participate in the orchestration of the immune response. T cells (which develop in the thymus) play a key role in regulating the response of B cells. Macrophages roam around looking for substances that have been tagged by antibodies, and they do the actual work of destroying the invaders. Other types of cells help effect longer-term immunity. Still other parts of the system guard against attacking the cells of one's own body.
Like that of the brain and ant colonies, the immune system's behavior arises from the independent actions of myriad simple players with no one actually in charge. The actions of the simple players-B cells, T cells, macrophages, and the like-can be viewed as a kind of chemical signal-processing network in which the recognition of an invader by one cell triggers a cascade of signals among cells that put into play the elaborate complex response. As yet many crucial aspects of this signal-processing system are not well understood. For example, it is still to be learned what, precisely, are the relevant signals, their specific functions, and how they work together to allow the system as a whole to "learn" what threats are present in the environment and to produce long-term immunity to those threats. We do not yet know precisely how the system avoids attacking the body; or what gives rise to flaws in the system, such as autoimmune diseases, in which the system does attack the body; or the detailed strategies of the human immunodeficiency virus (HIV), which is able to get by the defenses by attacking the immune system itself. Once again, a key question is how such an effective complex system arose in the first place in living creatures through biological evolution.
Economies.
Economies are complex systems in which the "simple, microscopic" components consist of people (or companies) buying and selling goods, and the collective behavior is the complex, hard-to-predict behavior of markets as a whole, such as changes in the price of housing in different areas of the country or fluctuations in stock prices (figure 1.4). Economies are thought by some economists to be adaptive on both the microscopic and macroscopic level. At the microscopic level, individuals, companies, and markets try to increase their profitability by learning about the behavior of other individuals and companies. This microscopic self-interest has historically been thought to push markets as a whole-on the macroscopic level-toward an equilibrium state in which the prices of goods are set so there is no way to change production or consumption patterns to make everyone better off. In terms of profitability or consumer satisfaction, if someone is made better off, someone else will be made worse off. The process by which markets obtain this equilibrium is called market efficiency. The eighteenth-century economist Adam Smith called this self-organizing behavior of markets the "invisible hand": it arises from the myriad microscopic actions of individual buyers and sellers.
Economists are interested in how markets become efficient, and conversely, what makes efficiency fail, as it does in real-world markets. More recently, economists involved in the field of complex systems have tried to explain market behavior in terms similar to those used previously in the descriptions of other complex systems: dynamic hard-to-predict patterns in global behavior, such as patterns of market bubbles and crashes; processing of signals and information, such as the decision-making processes of individual buyers and sellers, and the resulting "information processing" ability of the market as a whole to "calculate" efficient prices; and adaptation and learning, such as individual sellers adjusting their production to adapt to changes in buyers' needs, and the market as a whole adjusting global prices.
The World Wide Web.
The World Wide Web came on the world scene in the early 1990s and has experienced exponential growth ever since. Like the systems described above, the Web can be thought of as a self-organizing social system: individuals, with little or no central oversight, perform simple tasks: posting Web pages and linking to other Web pages. However, complex systems scientists have discovered that the network as a whole has many unexpected large-scale properties involving its overall structure, the way in which it grows, how information propagates over its links, and the coevolutionary relationships between the behavior of search engines and the Web's link structure, all of which lead to what could be called "adaptive" behavior for the system as a whole. The complex behavior emerging from simple rules in the World Wide Web is currently a hot area of study in complex systems. Figure 1.5 illustrates the structure of one collection of Web pages and their links. It seems that much of the Web looks very similar; the question is, why?
FIGURE 1.4. Individual actions on a trading floor give rise to the hard-to-predict large-scale behavior of financial markets. Top: New York Stock Exchange (photograph from Milstein Division of US History, Local History and Genealogy, The New York Public Library, Astor, Lenox, and Tilden Foundations, used by permission). Bottom: Dow Jones Industrial Average closing price, plotted monthly, 1970–2008.
FIGURE 1.5. Network structure of a section of the World Wide Web. (Reprinted with permission from M.E.J. Newman and M. Girvan, Physical Review E, 69, 026113, 2004. Copyright 2004 by the American Physical Society.)
Common Properties of Complex Systems.
When looked at in detail, these various systems are quite different, but viewed at an abstract level they have some intriguing properties in common:
Complex collective behavior: All the systems I described above consist of large networks of individual components (ants, B cells, neurons, stock-buyers, Website creators), each typically following relatively simple rules with no central control or leader. It is the collective actions of vast numbers of components that give rise to the complex, hard-to-predict, and changing patterns of behavior that fascinate us.
Signaling and information processing: All these systems produce and use information and signals from both their internal and external environments.
Adaptation: All these systems adapt-that is, change their behavior to improve their chances of survival or success-through learning or evolutionary processes.
Now I can propose a definition of the term complex system: a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution. (Sometimes a differentiation is made between complex adaptive systems, in which adaptation plays a large role, and nonadaptive complex systems, such as a hurricane or a turbulent rushing river. In this book, as most of the systems I discuss are adaptive, I do not make this distinction.) Systems in which organized behavior arises without an internal or external controller or leader are sometimes called self-organizing. Since simple rules produce complex behavior in hard-to-predict ways, the macroscopic behavior of such systems is sometimes called emergent. Here is an alternative definition of a complex system: a system that exhibits nontrivial emergent and self-organizing behaviors. The central question of the sciences of complexity is how this emergent self-organized behavior comes about. In this book I try to make sense of these hard-to-pin-down notions in different contexts.
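As one small, concrete illustration of "simple rules produce complex behavior in hard-to-predict ways"-an example of my own choosing here, not part of the definitions above-the Python sketch below updates a row of on/off cells using nothing but a fixed eight-entry lookup table over each cell and its two neighbors (an elementary cellular automaton, rule 30 in Wolfram's numbering). Starting from a single active cell, it prints an intricate, irregular triangle of activity.

RULE = 30   # an 8-bit lookup table: the new state for each of the 8 neighborhoods

def step(cells):
    n = len(cells)
    # Each cell's next state depends only on itself and its two neighbors.
    return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 31
row[15] = 1                    # one active cell in the middle
for _ in range(16):
    print("".join("#" if cell else "." for cell in row))
    row = step(row)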
How Can Complexity Be Measured?