The term “emergence” comes
from the Latin verb emergo, which means to arise, to rise up, to come up or to come forth. The term was coined by G. H. Lewes in Problems of Life and Mind (1875), who drew the distinction between emergent and resultant effects.
Effects are resultant if
they can be calculated by the mere addition or subtraction of causes operating
together, as with the weight of an object, when one can calculate its weight
merely by adding the weights of the parts that make it up. Effects are emergent
if they cannot be thus calculated, because they are qualitatively novel
compared to the causes from which they emerge. For Lewes, examples of such
emergent effects are mental properties that emerge from neural processes yet
are not properties of the parts of the neural processes from which they
emerge. In Lewes’ work, three essential features of emergence are laid out: first, that emergentism is a theory about the structure of the natural world and, consequently, has ramifications concerning the unity of science; second, that emergence is a relation between the properties of an entity and the properties of its parts; and third, that the question of emergence is related to the question of the possibility of reduction. These three features will structure this article’s discussion of
emergence.
1.
The British Emergentists
The group of emergentists that Brian McLaughlin (1992) has dubbed the “British emergentists” were the first to make emergence the core of a comprehensive philosophical position, in a tradition extending from the middle of the nineteenth century into the early decades of the twentieth. A central question at that time was whether life, mind and chemical bonding could be given a physical explanation and, by extension, whether special sciences such as psychology and biology were reducible to more “basic” sciences and, eventually, to physics. Views were divided between the
reductionist mechanists and the anti-reductionist vitalists. The mechanists
claimed that the properties of an organism are resultant properties that can be
fully explained, actually or in principle, in terms of the properties and
relations of its parts. The vitalists claimed that organic matter differs
fundamentally from inorganic matter and that what accounts for the properties
of living organisms is not the arrangement of their constitutive physical and
chemical parts, but some sort of entelechy or spirit. In this debate the
emergentists proposed a middle way in which, against the mechanists, the whole
is more than just the sum and arrangement of its parts yet, against the
vitalists, without anything being added to it “from the outside”—that is, there
is no need to posit any mysterious intervening entelechy to explain irreducible
emergent properties.
Though the views of the British emergentists differ in their details, we can generally say that they were monists regarding objects or substances, inasmuch as they held that the world is made of fundamentally one kind of thing, matter. However, they also held that at different levels of organization and complexity matter exhibits properties that are novel relative to the lower levels of organization from which they emerged, and this makes the emergentist view one of property dualism (or pluralism). It
should also be noted that the British emergentists identified their view as a
naturalist position firstly because whether something is emergent or not is to
be established or rejected by empirical evidence alone, and secondly because no
extra-natural powers, entelechies, souls and so forth are used in emergentist
explanations. The main texts of this tradition of the so-called “British
emergentists” are J. S. Mill’s System of Logic, Samuel Alexander’s Space, Time
and Deity, C. Lloyd Morgan’s Emergent Evolution and C. D. Broad’s The Mind and
its Place in Nature. Beyond these emergentists, traditional brands of
emergentism can be found in the work of R. W. Sellars (1922), A. Lovejoy
(1927), Roger Sperry (1980, 1991), Karl Popper and John Eccles (1977) and
Michael Polanyi (1968).
a.
J. S. Mill
Though he did not use the
term ‘emergence,’ it was Mill’s System of Logic (1843) that marked the
beginning of British emergentism.
Mill distinguished between
two modes of what he called “the conjoint action of causes,” the mechanical and
the chemical. In the mechanical mode the effect of a group of causes is nothing
more than the sum of the effects that each individual cause would have were it
acting alone. Mill calls the principle according to which the whole effect is
the sum of the effects of its parts the “principle of composition of causes”
and illustrates it by reference to the vector sum of forces. The effects thus
produced in the mechanical mode are called “homopathic effects” and they are
subject to causal “homopathic laws.” Mill contrasts the mechanical mode with
the chemical mode in which the principle of composition of causes does not
hold. In the chemical mode causal effects are not additive but, instead, they
are “heteropathic,” which means that the conjoint effect of different causes is different from the sum of the effects the causes would have in isolation. The
paradigmatic examples of such effects were, for Mill, the products of chemical
reactions which have different properties and effects than those of the
individual reactants. Take, for example, a typical displacement reaction:
Zn + 2HCl → ZnCl₂ + H₂.
In such a reaction zinc
reacts with hydrogen chloride and replaces the hydrogen in the latter to
produce effects that are more than just the sum of the parts that came together
at the beginning of the reaction. The newly formed zinc chloride has properties
that neither zinc nor hydrogen chloride possess separately.
Mill’s heteropathic effects
are the equivalent of Lewes’ emergent effects, whereas homopathic effects are
the equivalent of Lewes’ resultants. Heteropathic effects are subject,
according to Mill, to causal “heteropathic” laws which, though novel relative to the
laws of the levels from which they emerged, do not counteract them. Such laws
are found in the special sciences such as chemistry, biology and psychology.
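To make Mill’s contrast concrete, consider a worked illustration (the example and the figures are mine, not Mill’s): in the mechanical mode the joint effect is fixed by a composition function, in this case vector addition, whereas in the chemical mode no such function is available.

\[
\vec{F}_{\mathrm{net}} = \vec{F}_1 + \vec{F}_2, \qquad \|\vec{F}_{\mathrm{net}}\| = \sqrt{3^2 + 4^2}\ \mathrm{N} = 5\ \mathrm{N}
\]

for two perpendicular forces of 3 N and 4 N acting on the same body. By contrast, no analogous arithmetical operation on the properties of zinc and hydrogen chloride yields the properties of zinc chloride, and that is precisely what makes the latter a heteropathic, or emergent, effect.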
b.
Samuel Alexander
In Space, Time and Deity
(1920), Samuel Alexander built a complex metaphysical system that has been
subject to a number of different interpretations. As we shall see, Alexander in
effect talks of different levels of explanation as opposed to the more robust
ontological emergence we find in the works of the other British emergentists.
According to Alexander, all
processes are physico-chemical processes but as their complexity increases they
give rise to emergent qualities that are distinctive of the new complex
configurations. These are subject to special laws that are treated by autonomous
special sciences that give higher-order explanations of the behavior of complex
configurations. One kind of such emergent qualities is mental qualities (others
are biological and chemical qualities). Since for Alexander all processes are
physico-chemical processes, mental processes are identical to neural processes.
However, Alexander claims that mental qualities are distinctive of higher-order
configurations. Furthermore, Alexander claims, mental qualities are not
epiphenomenal. A neural process that lost its mental qualities would not be the
same process because it is in virtue of its mental qualities that the
“nervous”—neural—process has the character and effects that it has. So though
emergent qualities are co-instantiated in one instance in a physico-chemical
process, they are distinct from that process due to their novel causal powers.
Alexander also holds that
emergent qualities and their behavior cannot be deduced even by a Laplacean
calculator from knowledge of the qualities and laws of the
lower—physiological—order. To be precise, though a Laplacean calculator could predict all physical processes (and hence all mental processes, since mental processes are physical processes), he would not be able to predict the emergent qualities of those events, because their configuration, though physico-chemical in its entirety, exhibits behavior of a different kind from that with which the physico-chemical sciences are concerned, and this behavior is, in turn, captured by emergent laws. Hence the emergence of such qualities should be taken as a brute empirical fact that can be given no explanation and should be accepted with “natural piety”. However, it should be noted here that Alexander
leaves open the possibility that, if chemical properties were to be reduced without residue to physical processes, then they would not be emergent, and he
adds that the same holds for mental properties.
c.
C. Lloyd Morgan
In Emergent Evolution (1923)
(and subsequently in Life, Spirit and Mind [1926] and The Emergence of Novelty
[1933]) the biologist C. Lloyd Morgan introduced the notion of emergence into the theory of evolution and maintained that in the course of
evolution new properties and behaviors emerge (like life, mind and reflective
thought) that cannot be predicted from the already existing entities they
emerged from. Taking off from Mill and Lewes, Morgan cites as the paradigmatic
case of an emergent phenomenon the products of chemical reactions that are
novel and unpredictable. These novel properties, moreover, are not merely
epiphenomenal but bring about “a new kind of relatedness”—new lawful
connections—that affects the “manner of go” of lower-level events in a way that
would not occur had they been absent. Thus emergent properties are causally autonomous
and have downward causal powers.
d.
C. D. Broad
The last major work in the
British emergentist tradition and, arguably, the historical foundation of
contemporary discussions of emergence in philosophy, was C. D. Broad’s The Mind and Its Place in Nature (1925).
Broad identified three possible answers to the question of how the properties of a complex system are related to the properties of its parts: the “component theory” of the vitalists, the reductive answer of the mechanists, and the emergentist view that the behavior of the whole cannot in principle be deduced from knowledge of the parts and their arrangement. From this latter view—Broad’s own—it follows that, contrary to the mechanist’s view of the world as homogeneous throughout, reality is structured in aggregates of different orders. Different orders in this sense exhibit different
organizational complexity and the kinds that make up each order are made up of
the kinds to be found in lower orders. This lack of unity is, in turn,
reflected in the sciences, where there is a hierarchy with physics at the lowest order, ascending through chemistry, biology and psychology—the subject matter
of each being properties of different orders that are irreducible to properties
of the lower orders. According to Broad these different orders are subject to
different kinds of laws: trans-ordinal laws that connect properties of adjacent
orders and intra-ordinal laws that hold between properties within the same
order. Trans-ordinal laws, Broad writes, cannot be deduced from the intra-ordinal laws of the lower order, even together with principles that connect the vocabularies of the two orders between which they hold; trans-ordinal laws are irreducible to intra-ordinal laws and,
as such, are fundamental emergent laws—they are metaphysical brute facts.
Broad considered the question of whether a trans-ordinal law is emergent to be an empirical question. Though he considered the behavior of all chemical compounds irreducible and thus emergent, he admitted, like Alexander, that if it were one day reduced to the physical characteristics of a chemical compound’s components it would not then count as emergent. However, unlike Alexander, he did not consider the same
possible concerning the phenomenal experiences that “pure”—secondary—qualities
of objects cause in us. Broad calls trans-ordinal laws that hold between
physical properties and secondary qualities “trans-physical laws”. Though he is
willing to grant that it could turn out that we mistakenly consider some
trans-ordinal laws to be emergent purely on the basis of our incomplete
knowledge, trans-physical laws are necessarily emergent—we could never have
formed the concept of blue, no matter how much knowledge we had of colors,
unless we had experienced it. Broad puts
forward an a priori argument to this effect that can be seen as a precursor of
the knowledge argument against physicalism. These qualities, he says, could not
have been predicted even by a “mathematical archangel” who knows everything
there is to know about the structure and working of the physical world and can
perform any mathematical calculation—they are in principle irreducible, only
inductively predictable and hence emergent.
In this we see that Broad’s
emergentism concerning the phenomenal experience of secondary qualities is not
epistemological (as is sometimes suggested by his writings) but is a
consequence of an ontological distinction of properties. That is, the
impossibility of prediction which he cites as a criterion of emergence is a
consequence of the metaphysical structure of the world; the “mathematical
archangel” could not have predicted emergent properties not because of
complexity or because of limits to what can be expressed by lower-level
concepts, but because emergent facts and laws are brute facts or else are laws
that are in principle not reductively explainable.
2.
Later Emergentism
Beginning in the late 1920s, advances in science such as the explanation of chemical bonding by quantum mechanics and the development of molecular biology put an end to claims of emergence in chemistry and biology, and thus marked the end of the emergentist heyday and the beginning of an era of reductionist enthusiasm. However, beginning with Putnam’s arguments for multiple realizability in the 1960s, Davidson’s anomalous monism of the psychophysical, and Fodor’s argument for the autonomy of the special sciences, the identity theory and reductionism were dealt a severe blow. Today, within a predominantly anti-reductionist monist climate,
emergentism has reappeared in complex systems theory, cognitive science and the
philosophy of mind.
a.
Kinds of Emergence
Because emergent properties
are novel properties, there are different conceptions of what counts as
emergent depending on how novelty is understood, and this is reflected in the
different ways the concept of emergence is used in the philosophy of mind and
in the natural and cognitive sciences. To capture this difference, David
Chalmers (2006) drew the distinction between weak and strong emergence. A
different distinction has been drawn by O’Connor and Wong (2002) between
epistemological and ontological emergence, but this can be incorporated into
the distinction between weak and strong emergence because ultimately both
differentiate between an epistemological emergence couched in terms of higher
and lower-level explanations or descriptions and a robust ontological
difference between emergent and non-emergent phenomena. Beyond this, accounts
of emergence differ in whether novelty is understood as occurring over time or
whether it is a phenomenon restricted to a particular time. This difference is
meant to be captured in the distinction between synchronic and diachronic
emergence.
i.
Strong and Weak Emergence
1. Strong Emergence: Novelty
as Irreducibility and Downward Causation
The metaphysically
interesting aspect of emergence is the question of what it takes for there to
be genuinely distinct things. In other words, the question is whether a
plausible metaphysical distinction can be made between things that are “nothing
over and above” what constitutes them and those things that are “something over
and above” their constituent parts. The notion of strong emergence that is
predominant in philosophy is meant to capture this ontological distinction that
was part of the initial motivation of the British emergentists and which is
lacking in discussions of weak emergence.
Though a phenomenon is often
said to be strongly emergent because it is not deducible from knowledge of the
lower-level domain from which it emerged—as was the case for C. D. Broad—what distinguishes the thesis of strong emergence from a thesis only about our epistemological predicament is that this in-principle non-deducibility is a consequence of an ontological distinction. The question, then, is: what sort of novelty must a property exhibit in order to be strongly emergent?
Even reductive physicalists
can agree that a property can be novel to a whole even though it is nothing
more than the sum of the related properties of the parts of the whole. For
instance, a whole weighs as much as the sum of the weights of its parts, yet
the weight of the whole is not something that its parts share. In this sense
resultant systemic properties, like weight, are novel but not in the sense
required for them to be strongly emergent. Also, numerical novelty, the fact
that a property is instantiated for the first time, is not enough to make it
strongly emergent for, again, that would make many resultant properties
emergent, like the first time a specific shape or mass is instantiated in
nature.
For this reason the
criterion often cited as essential for the ontological autonomy of strong
emergents (along with in principle irreducibility or non-deducibility) is
causal novelty. That is, the basic tenet
of strong emergentism is that at a certain level of physical complexity novel
properties appear that are not shared by the parts of the object they emerge
from, that are ontologically irreducible to the more fundamental matter from
which they emerge and that contribute causally to the world. That is, emergent
properties have new downward causal powers that are irreducible to the causal
powers of the properties of their subvenient or subjacent (to be more
etymologically correct) base. Ontological emergentism is therefore typically
committed not only to novel fundamental properties but also to fundamental
emergent laws as was the case with the British emergentists who, with the
exception of Alexander, were all committed to downward causation—that is,
causation from macroscopic levels to microscopic levels. (It should be noted
also that this ontological autonomy of emergents implies the existence of
irreducible special sciences.) Thus Timothy O’Connor (1994) defines strong
emergent properties as properties that supervene on properties of the parts of
a complex object, that are not shared by any of the object’s parts, that are distinct
from any structural property of the complex, and that have downward causal
influence on the behavior of the complex’s parts.
However, though downward
causal powers are commonly cited along with irreducibility as a criterion for
strong emergence, there is no consensus regarding what is known as “Alexander’s
dictum” (that is, that for something to be real it must have causal powers) and
hence not everyone agrees that strong emergentism requires downward causation.
For example, David Chalmers (2006), who is neutral on the question of
epiphenomenalism, does not take downward causation to be an essential feature
of emergentism. Rather, Chalmers defines a high-level phenomenon as strongly
emergent when it is systematically determined by low-level facts but
nevertheless truths concerning that phenomenon are in principle not deducible
from truths in the lower-level domain. The question is posed by Chalmers in
terms of conceptual entailment failure. That is, emergent phenomena are
nomologically but not logically supervenient on lower-level facts and therefore
novel fundamental laws are needed to connect properties of the two domains.
A different approach is
offered by Tim Crane (2001, 2010) who bases his account of strong emergence on
the distinction between two kinds of reduction: (1) ontological reduction,
which identifies entities in one domain with those in another, more fundamental
one, and (2) explanatory reduction: that is, a relation that holds between
theories aimed at understanding phenomena of one level of reality in terms of a
“lower” level. In other words, one theory, T2, is explanatorily reduced to
another, T1, when theory T1 sheds light on the phenomena treated in T2; that is, when T1 shows from within itself why T2 is true. Crane argues that the
difference between strong emergentism and non-reductive physicalism lies in
their respective attitude to reduction: though both non-reductive physicalism
and emergentism deny ontological reduction, non-reductive physicalism requires
explanatory reduction (at least in principle) whereas the distinguishing
feature of emergentism is that it denies explanatory reduction and is committed
to an explanatory gap. Crane argues that if you have supervenience with
in-principle irreducibility and downward causation then you have dependence
without explanatory reduction and, hence, strong emergence.
2.
Weak Emergence: Novelty as Unpredictability
Weak emergence is the kind
of emergence that is common in the early twenty-first century primarily (though
not exclusively) in cognitive science, complex system theory and, generally,
scientific discussions of emergence in which the notions of complexity,
functional organization, self-organization and non-linearity are central. The
core of this position is that a property is emergent if it is a systemic
property of a system—a property of a system that none of its smaller parts
share—and it is unpredictable or unexpected given the properties and the laws governing
the lower-level, more fundamental, domain from which it emerged. Since weak
emergence is defined in terms of unpredictability or unexpectedness, it is an
epistemological rather than a metaphysical notion. Commonly cited examples of
such weak emergent phenomena range from emergent patterns in cellular automata
and systemic properties of connectionist networks to phase transitions, termite
organization, traffic jams, the flocking patterns of birds, and so on.
Weak emergence is compatible
with reduction since a phenomenon may be unpredictable yet also reducible. For
instance, processes composed of many parts may fall under strict deterministic laws yet be unpredictable due to the unforeseeable consequences of minute differences in initial conditions. And, as Chalmers (2006) argues, weak emergence is also
compatible with deducibility of the emergent phenomenon from its base, as, for instance, in cellular automata, in which higher-level patterns, though they may be unexpected, are in principle deducible given the initial state of the base entities and the basic rules governing the lower level.
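To see how this deducibility-by-simulation works in practice, here is a minimal sketch (the code and its particulars are my illustration, not Chalmers’ own example) of Conway’s Game of Life in Python. The travelling “glider” is a higher-level pattern that is nowhere mentioned in the update rule, yet every fact about it can be derived simply by iterating that rule over the initial micro-state.

```python
# A minimal Game of Life: the "glider" pattern is unexpected from the local
# update rule alone, yet it is deducible in principle simply by iterating
# that rule over the initial state.
from collections import Counter

def step(live_cells):
    """One Life update: a set of (x, y) live cells -> the next generation."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step if it has exactly 3 live neighbours,
    # or if it is currently alive and has exactly 2.
    return {c for c, n in neighbour_counts.items()
            if n == 3 or (n == 2 and c in live_cells)}

# Micro-level initial state: a single glider.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}

# "Running the micro-dynamics": after four rule applications the glider has
# reproduced itself one cell down and to the right -- a macro-level fact we
# obtain only by iterating the rule.
for _ in range(4):
    cells = step(cells)
print(sorted(cells))
```

Nothing in the rule refers to gliders; the pattern and its trajectory are recovered only by running the micro-dynamics, which is the sense of in-principle deducibility at issue here.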
Mario Bunge’s “rational
emergentism” (1977) is a form of weak emergence according to which emergent
properties are identified with systemic properties that none of the parts of
the system share and that are reducible to the parts of the system and their
organization. Bunge identifies his view as an emergentism of sorts because he
claims that, unlike reductionist mechanism, it appreciates the novelty of
systemic properties. In addition, he thinks of novelty as having a reductive
explanation. He calls this “rational” emergence.
William Wimsatt (2000) also
defends an account according to which emergence is compatible with reduction.
Wimsatt defines emergence negatively as the failure of aggregativity;
aggregativity is the state in which “the whole is nothing more than the sum of its parts,” that is, in which systemic properties result from the component parts of a system alone rather than from their organization. Contrasting
emergence to aggregativity, Wimsatt defines a systemic property as emergent
relative to the properties of the parts of a system if the property is
dependent on their mode of organization (and is also context-sensitive) rather
than solely on the system’s composition. He argues that, in fact, it is
aggregativity which is very rare in nature, while emergence is a common
phenomenon (even if to different degrees).
Robert Batterman (2002), who
focuses on emergence in physics, also believes that emergent phenomena are
common in our everyday experience of the physical world. According to
Batterman, what is at the heart of the question of emergence is not downward
causation or the distinctness of emergent properties, but rather
inter-theoretic reduction and, specifically, the limits of the explanatory
power of reducing theories. Thus, a property is emergent, according to this
view, if it is a property of a complex system at limit values that cannot be
derived from lower-level, more fundamental theories. As examples of emergent
phenomena Batterman cites phase transitions and transitions of magnetic
materials from ferromagnetic states to paramagnetic states, phenomena in which
novel behavior is exhibited that cannot be reductively explained by the more
fundamental theories of statistical mechanics. However, Batterman wants to
distinguish explanation from reduction and so claims that though emergent
phenomena are irreducible they are not unexplainable per se because they can
have non-reductive explanations.
More recently Mark Bedau
(1997, 2007, 2008) has argued that the characteristic of weak emergence is
that, though macro-phenomena of complex systems are in principle ontologically
and causally reducible to micro-phenomena, their reductive explanation is
intractably complex, save by derivation through simulation of the system’s
microdynamics and external conditions. In other words, though macro-phenomena
are explainable in principle in terms of micro-phenomena, these explanations
are incompressible, in the sense that they can only be had by “crawling the
micro-causal web”—by aggregating and iterating all local micro-interactions
over time. Bedau argues that this is the only kind of real emergence and
champions what he calls the “radical view” of emergence according to which
emergence is a common phenomenon that applies to all novel macro-properties of
systems. (He contrasts this to what he calls the “sparse view” which he
characterizes as the view that emergence is a rare phenomenon found only in
“exotic” phenomena such as consciousness that are beyond the scope of normal
science.) However, though this is a weak kind of emergence, in that it denies any strong form of downward causation and involves reducibility of the macro to the micro (even if only in principle), Bedau denies that weak emergence is merely epistemological, or merely “in the mind”: explanations of weakly emergent phenomena are incompressible because they reflect the incompressible nature of the micro-causal structure of reality, which is an objective feature of complex systems.
Andy Clark (1997, 2001) also
holds a weak emergentist view according to which emergent phenomena need not be
restricted to unpredictable or unexplainable phenomena but are, instead,
systemic phenomena of complex dynamical systems that are the products of
collective activity. Clark distinguishes four kinds of emergence. First,
emergence as collective self-organization (a system becomes more organized due
solely to the collective effects of the local interaction of its parts, such as flocking patterns of birds, or due to the
collective effects of its parts and the environment, such as termite nest
building). Second, emergence as unprogrammed functionality, that is, emergent
behavior that arises from repeated interaction of an agent with the environment,
such as wall-following behavior in “veer and bounce” robots (Clark, 1997).
Third, emergence as interactive complexity, in which effects, patterns or capacities of a system emerge from the complex, cyclic interaction of its components. An example is Bénard and Couette convection cells, which result from a repetitive cycle of movement caused by differences in density within a fluid body: the denser, colder fluid forces the warmer fluid to rise; the risen fluid loses heat, descends, and in turn forces the now-warmed fluid below to rise again, and so on. And fourth, emergence as uncompressible unfolding (phenomena that cannot be
predicted without simulation). All of these formulations of emergence are
compatible with reducibility or in principle predictability and are thus forms
of weak emergence. For Clark, emergence picks out the “distinctive way” in
which factors conspire to bring about a property, event or pattern and it is
“linked to the notion of what variables figure in a good explanation of the
behavior of a system.” Thus, Clark’s notion of emergence in complex systems
theory is explanatory in that it focuses on explanations in terms of collective
variables, that is, variables that focus on higher-level features of complex
dynamical systems that do not track properties of the components of the system
but, instead, reflect the result of the interaction of multiple agents or their
interaction with their environment.
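As a concrete illustration of the first of Clark’s kinds, emergence as collective self-organization, here is a minimal “boids”-style flocking sketch (the rules, weights and the heading_alignment measure are my own illustrative assumptions, not Clark’s model). Each simulated bird reacts only to its near neighbours, yet the flock as a whole becomes ordered, and the natural way to describe that order is through a collective variable that belongs to no individual bird.

```python
# A bare-bones flocking model: local cohesion, alignment and separation rules.
import math
import random

N, RADIUS, STEPS = 30, 10.0, 200

pos = [[random.uniform(0, 40), random.uniform(0, 40)] for _ in range(N)]
vel = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(N)]

def heading_alignment():
    """Collective variable: length of the mean unit heading (1.0 = fully aligned)."""
    norms = [math.hypot(v[0], v[1]) or 1.0 for v in vel]  # guard against zero speed
    hx = sum(v[0] / n for v, n in zip(vel, norms)) / N
    hy = sum(v[1] / n for v, n in zip(vel, norms)) / N
    return math.hypot(hx, hy)

for _ in range(STEPS):
    new_vel = []
    for i in range(N):
        # Local information only: neighbours within the perception radius.
        nbrs = [j for j in range(N)
                if j != i and math.dist(pos[i], pos[j]) < RADIUS]
        vx, vy = vel[i]
        if nbrs:
            cx = sum(pos[j][0] for j in nbrs) / len(nbrs)   # local centre
            cy = sum(pos[j][1] for j in nbrs) / len(nbrs)
            ax = sum(vel[j][0] for j in nbrs) / len(nbrs)   # local mean velocity
            ay = sum(vel[j][1] for j in nbrs) / len(nbrs)
            vx += 0.01 * (cx - pos[i][0]) + 0.05 * (ax - vx)  # cohesion + alignment
            vy += 0.01 * (cy - pos[i][1]) + 0.05 * (ay - vy)
            for j in nbrs:                                    # separation
                if math.dist(pos[i], pos[j]) < 2.0:
                    vx += 0.05 * (pos[i][0] - pos[j][0])
                    vy += 0.05 * (pos[i][1] - pos[j][1])
        new_vel.append([vx, vy])
    vel = new_vel
    for i in range(N):
        pos[i][0] += vel[i][0]
        pos[i][1] += vel[i][1]

print("heading alignment of the flock: %.2f" % heading_alignment())
```

The printed alignment score is the kind of collective variable Clark has in mind: it tracks a higher-level feature of the whole flock rather than any property of a single component, and it figures naturally in a good explanation of the flock’s behavior.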
Proponents of weak emergence
do not support the strong notion of downward causation that is found in strong
emergentist views but, instead, favor one in which higher-level causal powers
of a whole can be explained by rules of interaction of its parts, such as
feedback loops. Though this kind of view of emergence is predominant in the
sciences, it is not exclusive to them. A form of weak emergence within
philosophy that denies strong downward causation can be found in John Searle
(1992). Searle allows for the existence of “causally emergent system features”
such as liquidity, transparency and consciousness that are systemic features of
a system that cannot be deduced or predicted from knowledge of causal
interactions of lower levels. However, according to Searle, whatever causal
effects such features exhibit can be explained by the causal relations of the system’s
parts, for example, in the case of consciousness, by the behavior and
interaction of neurons.
If we make use, for more
precision, of the distinction between ontological and explanatory reduction we
can see that if we understand strongly emergent phenomena as both ontologically
and explanatorily irreducible, as Crane (2010) does, then they are also weakly
emergent. However, if strongly emergent phenomena are only ontologically
irreducible they may still be, in principle, predictable. For example, even if
you deny the identity of heat with mean kinetic energy (perhaps because of multiple realizability), a Laplacean demon could still predict a gas’s heat from
the mean kinetic energy of its molecules with the use of “bridge laws” that
link the two vocabularies. These bridge laws can be considered to be part of
what Crane calls an explanatory reduction. So in such cases, strong emergence
does not entail weak emergence. Also it should be noted that weak emergence
does not entail strong emergence. A phenomenon can be unpredictable yet also
ontologically reducible: perhaps, for instance, because systemic properties are
subject to indeterministic laws. So a case of weak emergence need not
necessarily be a case of strong emergence.
ii.
Synchronic and Diachronic Emergence
Another distinction that is
made concerning how novelty is understood is the distinction between synchronic
and diachronic novelty. The former is novelty exhibited in the properties of a
system vis-à-vis the properties of its constituent parts at a particular time;
the latter is temporal novelty in the sense that a property or state is novel
if it is instantiated for the first time. This distinction leads to a corresponding distinction between synchronic and diachronic emergence.
In synchronic emergence,
articulated by C. D. Broad and predominant in the philosophy of mind, the
higher-level, emergent phenomena are simultaneously present with the
lower-level phenomena from which they emerge. Usually this form of emergence is
stated in terms of supervenience of mental phenomena on subvenient/subjacent
neural structures, and so mental states or properties co-exist with states or
properties at the neural level. Strong ontological emergence is thus usually
understood to be synchronic, “vertical”, emergence. In contrast, diachronic emergence is “horizontal” emergence that unfolds through time, in which the structure from which the novel property emerges exists prior to the emergent. This is
typical of the weakly emergent states appealed to in discussions of complex
systems, evolution, cosmology, artificial life, and so forth. It can be found
in Searle (1992), since he views the relation of the emergent to its base as causal, thus, at least on non-synchronic accounts of causation, excluding synchronic emergence.
Because diachronic emergence
is emergence over time, novelty is understood in terms of unpredictability of
states or properties of a system from past states of that system. And because
weak emergence is typically defined in terms of unpredictability it is also
usually identified with cases of diachronic emergence. In contrast, in
synchronic emergence, which refers to the state of a system at a particular
time, novelty revolves around the idea of irreducibility and thus synchronic
emergence is usually identified with strong emergence. However, there are
formulations of non-supervenience-based strong emergence that are causal and
diachronic, such as O’Connor and Wong’s (2005). Note that synchronic emergence
could be the result of diachronic emergence but is not entailed by it since,
presumably, if God were to create the world exactly as it is in this moment,
synchronically emergent phenomena would exist without their being diachronically emergent.
b.
Emergence and Supervenience
The British emergentists,
and this is especially clear in the writing of C. D. Broad, thought that a
necessary feature of emergentism is a relation of the kind we would today call
supervenience. Supervenience is a relation of covariation between two sets of
properties, subjacent/underlying properties and supervenient properties.
Roughly, we say that a set of properties A supervenes on a set of properties B
if and only if two things that differ with respect to A-properties will also
differ with respect to B-properties. Today, because of the failure of attempted reductions, especially in the case of the mental to the physical, and because the relation of supervenience per se doesn’t entail anything about the specific nature of the properties it relates (for example, whether they are distinct or not), it has been seen as a prima facie good candidate for a key feature of the relation between emergents and their subjacent base: one that can account for both the distinctness and the dependence of emergents while also adding the restriction of synchronicity. Jaegwon Kim (1999), James van Cleve (1990),
Timothy O’Connor (1994), Brian McLaughlin (1997), David Chalmers (2006) and Paul
Noordhof (2010) all take nomological strong supervenience to be a necessary
feature of emergentism. (For present purposes, following Kim we can define
strong supervenience thus: A-properties strongly supervene on B-properties if
and only if for any possible worlds w1 and w2 and any individuals x in w1 and y
in w2, if x in w1 is B-indiscernible from y in w2, then x in w1 is
A-indiscernible from y in w2. Nomological supervenience restricts the range of
possible worlds to those that conform to the natural laws).
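Stated symbolically (this is only a restatement of Kim’s definition as just given, with the nomological restriction expressed as a clause on the worlds quantified over; the notation is mine):

\[
A \text{ strongly supervenes on } B \iff \forall w_1, w_2\ \forall x \in w_1\ \forall y \in w_2\ \Big[\big(\forall F \in B:\ F(x, w_1) \leftrightarrow F(y, w_2)\big) \rightarrow \big(\forall G \in A:\ G(x, w_1) \leftrightarrow G(y, w_2)\big)\Big],
\]

with nomological supervenience obtained by letting \(w_1\) and \(w_2\) range only over worlds in which the actual natural laws hold.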
However, not everyone agrees
that the relation of strong supervenience is necessary for strong emergence.
Some, like Crane (2001), argue that supervenience is not sufficient for emergence, and other proponents of strong emergence have questioned whether supervenience is even a necessary condition for emergence. For example, O’Connor (2000, 2003,
O’Connor & Wong 2005) now supports a form of dynamical emergence which is
causal and non-synchronic. A state of an entity is emergent, in this view, if
it instantiates non-structural properties as a causal result of that object’s
achieving a complex configuration. O’Connor’s view includes a strong notion of
downward causation (and the denial of causal closure–roughly, the principle
that all physical effects are entirely determined by, or have their chances
entirely determined by, prior physical events) and the possibility that an
emergent state can generate another emergent state.
Paul Humphreys (1996, 1997)
has also offered an alternative account to supervenience-based emergence
according to which emergence of properties is the diachronic result of fusion
of lower-level properties, a phenomenon that Humphreys claims is common in the
physical realm. That is, properties of the base are fused (thereby ceasing to
exist) and give rise to new emergent properties with novel causal powers which
are not made up of the old property instances—and, in this sense, the only real
phenomenon is the emergent phenomenon. Humphreys offers as a paradigmatic
example of such emergence quantum entanglement, in which a system can be in a
definite state while its individual parts are not and in which the state of the
system determines the states of its parts and not the other way around. It must
be noted that Humphreys claims ignorance about whether this is what happens in
the case of mental properties. Different formulations of
non-supervenience-based emergence can be found in Silberstein and McGeever
(1999), who have also argued for ontological emergence in quantum mechanics and,
by extension, as a real feature of the natural world, as well as in Bickhard
and Campbell’s (2000) “process model” of ontological emergence.
3.
Objections to Emergentism
a.
The Supervenience Argument
The most commonly cited
objection to strong emergence, initially formulated by Pepper (1926) and
championed today by Jaegwon Kim (1999, 2005), concerns the novel (and downward)
causal powers of emergent properties.
Kim’s formulation is based
on three basic physicalist assumptions: (1) the principle of causal closure
which Kim defines as the principle that if a physical event has a cause at t,
then it has a physical cause at t, (2) the principle of causal exclusion
according to which if an event e has a sufficient cause c at t, no event at t
distinct from c can be the cause of e (unless this is a genuine case of causal
over-determination), and (3) supervenience. Kim defines mind/body supervenience
as follows: mental properties strongly supervene on physical/biological
properties, that is, if any system s instantiates a mental property M at t,
there necessarily exists a physical property P such that s instantiates P at t,
and necessarily anything instantiating P at any time instantiates M at that time.
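One compact way to set out Kim’s three assumptions (the symbolic rendering is mine; M ranges over mental and P over physical properties) is:

\begin{align*}
\textbf{Closure:}\quad & \text{if a physical event has a cause at } t, \text{ it has a physical cause at } t.\\
\textbf{Exclusion:}\quad & \text{if } e \text{ has a sufficient cause } c \text{ at } t, \text{ no event at } t \text{ distinct from } c \text{ causes } e \text{ (barring overdetermination)}.\\
\textbf{Supervenience:}\quad & s \text{ has } M \text{ at } t \rightarrow \exists P \big[\, s \text{ has } P \text{ at } t \ \wedge\ \Box\, \forall x\, \forall t'\, ( x \text{ has } P \text{ at } t' \rightarrow x \text{ has } M \text{ at } t')\,\big].
\end{align*}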
The gist of the problem is
the following. In order for emergent mental properties to have causal powers
(and thus to exist, according to what Kim has coined “Alexander’s dictum”)
there must be some form of mental causation. However, if this is the case, the
principle of causal closure is violated and emergence is in danger of becoming
an incoherent position. If mental (and therefore downward) causation is denied
and thus causal closure retained, emergent properties become merely
epiphenomenal and in this case their existence is threatened.
More specifically, the
argument is as follows. According to mind-body supervenience, every time a
mental property M is instantiated it supervenes on a physical property P. Now suppose that M appears to cause another mental property M¹; the question then arises whether the cause of M¹ is indeed M or whether it is M¹’s subvenient/subjacent
base P¹ (since according to supervenience M¹ is instantiated by a physical
property P¹). Given causal exclusion, it cannot be both, and so, given the
supervenience relation, it seems that M¹ occurs because P¹ occurred. Therefore,
Kim argues, it seems that M actually causes M¹ by causing the subjacent P¹ and
that mental to mental (same level) causation presupposes mental to physical
(downward) causation. [Another, more direct, way to put this problem is whether
the effect of M is really M¹ or M¹’s subjacent base P¹. I chose an alternative formulation in order to make the problem clearer to the non-expert reader.] However, Kim continues, given causal closure, P¹ must have a
sufficient physical cause P. But given exclusion again, P¹ cannot have two
sufficient causes, M and P, and so P is the real cause of P¹ because, if M were the real cause, then causal closure would be violated again. Therefore, given
supervenience, causal closure and causal exclusion, mental properties are
merely epiphenomenal. The tension here for the emergentist, the objection goes,
is in the double requirement of supervenience and downward causation in that,
on the one hand, we have upward determination and the principle of causal
closure of the physical domain, and, on the other hand, we have causally
efficacious emergent phenomena. In other words, Kim claims that what seem to be
cases of emergent causation are just epiphenomena because ultimately the only
way to instantiate an emergent property is to instantiate its base. So, saying that higher-level properties are causally efficacious renders any form of non-reductive physicalism, under which Kim includes emergentism, at best implausible and at worst incoherent.
Note that this is an
objection leveled against cases of strong emergence because in cases of weak
emergence that do not make any claims of ontological novelty the causal
inheritance principle is preserved—the emergents’ causal powers are inherited
from the powers of their constitutive parts. For example, a flocking pattern of
birds may affect the movement of the individual birds in it but that is nothing
more than the effect of the aggregate of all the birds that make it up. Also,
this argument applies to cases of supervenience-based emergence which retain
base properties intact along with emergent properties, but accounts of
emergence that are non-synchronic sidestep the problem of downward causation.
So, Kim’s objection does not get off the ground as a retort to O’Connor’s
dynamical emergence, Bickhard and Campbell’s process model, Silberstein and
McGeever’s quantum mechanical emergence or Humphreys’ fusion emergence.
In the cases where this
objection applies, there have been different responses. Philosophers who want to retain causal
closure while also retaining emergent properties have tried to give modified
accounts of strong emergence that deny either downward causation or the
requirement that emergent properties have novel causal powers. For example,
Shoemaker (2001) believes that what must be denied is not the principle of
causal closure but, instead, that emergent properties have novel causal powers
(the appearance of which he elsewhere attributes to “micro-latent” powers of
lower-level entities). This approach, however, is problematic, since it seems
to be a requirement for robust strong emergence that emergent properties are
not merely epiphenomenal. Another approach has recently been proposed by
Cynthia and Graham Macdonald (2010) who attempt to preserve causal closure and
to show that it is compatible with emergence by building a metaphysics in which
events can co-instantiate in a single instance mental and physical properties
thus allowing for mental properties to have causal effects (a view that Peter
Wyss (2010) has correctly pointed out is in some respects reminiscent of Samuel
Alexander’s). In this schema, the Macdonalds argue, property instances do not
belong to different levels (though properties do) and so the problem of
downward causation is resolved because, in effect, there is no downward
causation in the sense assumed by Kim’s argument (and causal efficacy for
emergent and mental properties is preserved, they argue, since if a property
has causally efficacious instances that means that the property itself has
causal powers). However, this view will also seem unsatisfactory to the strong
emergentist who wants to retain a robust notion of emergent properties and
downward causation.
Other philosophers who want
to retain strong emergence have opted for rejecting causal closure
instead. Such a line has been taken by Crane (2001), Hendry (2010) and Lowe (2000), though Lowe subsequently offered an account of strong emergence compatible with causal closure (Lowe, 2003).
b.
Do Cases of Genuine (Strong) Emergence Exist?
Kim’s supervenience argument
is meant to question the very possibility of strongly emergent properties.
However, even if strong emergence is possible, there is the further question of
whether there are any actual cases of strong emergence in the world.
Brian McLaughlin (1992), who
grants that the emergence of novel configurational forces is compatible with
the laws of physics and that theories of emergence are coherent and consistent,
has argued that there is “not a scintilla of evidence” that there are any real
cases of strong emergence to be found in the world. This is a commonly cited objection to emergence, readily espoused by reductive physicalists committed to the purely physical nature of all the phenomena that have at different times been called emergent; it is also raised by Mark Bedau, who claims that though weak emergence is very common, we have no evidence for cases of strong emergence.
Hempel and Oppenheim (1948)
have argued that the unpredictability of emergent phenomena is
theory-relative—that is, something is emergent only given the knowledge
available at a given time—and does not reflect an ontological distinction. And
Ernest Nagel (1960), agreeing that emergence is theory-relative, argued that it
is a doctrine concerning “logical facts about formal relations between
statements rather than any experimental or even ‘metaphysical’ facts about some
allegedly ‘inherent’ traits of properties of objects.” According to these
views, theoretical advance and accumulation of new knowledge will lead to the
re-classification of what are today considered to be emergent phenomena, as
happened with the case of life and chemical bonding of the British
emergentists. However, though these objections can be construed as viable objections to some forms of weak emergence, they fail to affect strong emergence (which was their target) because it is concerned with in-principle unpredictability as a result of irreducibility.
Though this skepticism is
shared by a few, some philosophers believe that though strong emergence may be
rare, it does exist. Bickhard and Campbell (2000), Silberstein and McGeever (1999) and Humphreys (1997) claim that ontological emergence can be found (at least) in quantum mechanics—an interesting proposal, and a somewhat ironic one given that it was advances in quantum physics in the early twentieth century that were supposed to have struck the death blow to the British emergentist tradition.
Predominantly, however, the usual candidates for strongly emergent properties
are mental properties (phenomenal and/or intentional) that continue to resist
any kind of reduction. Chalmers (2006)—because of the explanatory gap—considers
consciousness to be the only possible intrinsically strongly emergent
phenomenon in nature, while O’Connor (2000) has argued that our experience of free will, which is, in effect, macroscopic control of behavior, seems to be irreducible and hence strongly suggests that human agency may be strongly emergent. (Stephan (2010) also sees free will as a candidate for a strongly
emergent property.)
Another line of response is
taken by E. J. Lowe (2000), according to whom emergent mental causes could be in
principle out of reach of the physiologist, and so it should not come as a
surprise that physical science has not discovered them. Lowe argues that, even
if we grant that every physical event has a sufficient immediate physical
cause, it is plausible that a mental event could have caused the physical event
to have that physical cause. That is not to say that the mental event caused
the physical event that caused the physical effect; rather, the mental event
linked the two physical events so the effect was jointly caused by a mental and
a physical event. Such a case, Lowe argues, would be indistinguishable from the
point of view of physiological science from a case in which causal closure
held.
Following this line of
thought it can be argued that though we do not have actual empirical proof that
emergent properties exist, the right attitude to hold is to be open to the
possibility of their existence. That is, given that there is no available physiological account of how mental states can cause physical states (or of how they can be identical), while we have everyday evidence that they do, as well as a plausible mental—psychological or folk-psychological—explanation of it, we have independent grounds to believe that emergent properties could possibly exist.