Introduction

Quantum Mechanics (2002)
Neuroscience and the Person (1999)
Evolutionary and Molecular Biology (1998)
Chaos and Complexity (1996)
Quantum Cosmology and the Laws of Nature (1993)

The CTNS-Vatican Observatory book series is the fruit of a multi-year collaborative research project between the two institutions. The project and the accompanying book series focus on the theological concept of divine action in relation to contemporary scientific theories. Featuring an international team of scholars including physicists, biologists, neuroscientists, philosophers, and theologians, this series includes six co-published volumes: five on divine action and an inaugural volume, Physics, Philosophy, and Theology: A Common Quest for Understanding (1988).

As Robert John Russell, the General Editor of the Series, writes in his introduction to Quantum Cosmology and the Laws of Nature:

“The overarching goal of these conferences is twofold: to contribute to constructive theological research as it engages current research in the natural sciences and to investigate the philosophical and theological elements in ongoing theoretical research in the natural sciences...A major issue in the debate over theology and science regards the role science ought to play. Too often science tends to set the agenda for the theological discussion with little if any initiative taken by theology. From the beginning it was the clear intention of the steering committee that our research should expand beyond this format to insure a ‘two-way interaction’ between the scientific and theological research programs. To achieve this goal...we looked for an overarching theological topic to thematize all these conferences...The topic of God’s action in the world was quickly singled out as a promising candidate, since it seems to permeate the discussions of theology and science in both philosophical and systematic contexts and it allows for a variety of particular issues to be pursued.”

Chaos and Complexity

This collection of fifteen research papers explores the implications of chaos and complexity in physical, chemical, and biological systems for philosophical and theological issues regarding God's action in the world. After an introduction to chaos and complexity, these essays respond to a series of questions: Do these topics in the natural sciences lend support to a philosophy of nature based on metaphysical determinism or indeterminism? In what ways do they shed light on the problem of general and special providence, and in particular on a non-interventionist understanding of objectively special divine action? Are there other areas of science which illuminate these questions more adequately than do chaos and complexity?

Crutchfield, James P., J. Doyne Farmer, Norman H. Packard, and Robert S. Shaw. “Chaos.”

This previously published paper by James P. Crutchfield, J. Doyne Farmer, Norman H. Packard, and Robert S. Shaw is reprinted here to give a broad introduction and background to the science of chaos and complexity.

Until recently scientists assumed that natural phenomena such as the weather or the roll of the dice were in principle predictable given sufficient information about them. Now we know that this is impossible. “Simple deterministic systems with only a few elements can generate random behavior.” Though the future is determined by the past, small uncertainties are amplified so radically by chaotic systems that, in practice, their behavior rapidly becomes unpredictable. Still there is “order in chaos,” since elegant geometrical forms underlie and generate chaotic behavior. The result is “a new paradigm in scientific modeling” which both limits predictability in a fundamental way and yet extends the domain in which nature can be at least partially predictable.

The article acknowledges the challenge to Laplacian determinism posed by quantum mechanics for subatomic phenomena like radioactive decay, but stresses that large-scale chaotic behavior, which focuses instead on macroscopic phenomena like the trajectory of a baseball or the flow of water, “has nothing to do with quantum mechanics.” In fact many chaotic systems display both predictable and unpredictable behavior, like fluid motion which can be laminar or turbulent, even though they are governed by the same equations of motion. As early as 1903, Henri Poincaré suggested that the explanation lay in the exponential amplification of small perturbations.
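Poincaré's point about the exponential amplification of small perturbations is easy to demonstrate numerically. The following sketch (an editorial illustration, not part of the Crutchfield et al. article) iterates the logistic map, a standard textbook example of deterministic chaos, from two initial conditions differing by one part in ten billion:

```python
# Illustrative sketch: sensitive dependence on initial conditions in the
# logistic map x -> r*x*(1-x), with r = 4.0 (a fully chaotic regime).
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole orbit."""
    orbit = [x0]
    for _ in range(steps):
        x0 = r * x0 * (1.0 - x0)
        orbit.append(x0)
    return orbit

a = logistic_orbit(0.2)            # reference orbit
b = logistic_orbit(0.2 + 1e-10)    # perturbed by one part in ten billion
gaps = [abs(x - y) for x, y in zip(a, b)]
# The gap roughly doubles at each step until it saturates at the size of
# the interval [0, 1]; the two orbits soon bear no resemblance to each other.
```

Every step here is strictly deterministic, yet within a few dozen iterations the microscopic difference in starting points has grown to macroscopic size, which is precisely the practical limit on prediction the article describes.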

Chaos is an example of a broad class of phenomena called dynamical systems. Such systems can be described in terms of their state, including all relevant information about them at a particular time, and an equation, or dynamic, that governs the evolution of the state in time. The motion of the state, in turn, can be represented by a moving point following an orbit in what is called state space. The orbits of non-chaotic systems are simple curves in state space. For example, the orbit of a simple pendulum in state space is a spiral ending at a point when the pendulum comes to rest. A pendulum clock describes a cyclic, or periodic, orbit, as does the human heart. Other systems move on the surface of a torus in state space. Each of these structures characterizing the long-term behavior of the system in state space - the point, the cycle, the torus - is called an attractor since the system, if nudged, will tend to return to this structure as it continues to move in time. Such systems are said to be predictable.
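The notion of a point attractor can also be made concrete numerically. This sketch (an editorial illustration, not drawn from the article; the damping constant, step size, and integration scheme are arbitrary choices) integrates a damped pendulum and shows its state-space orbit spiraling into the rest point (θ, ω) = (0, 0) regardless of where nearby it starts:

```python
from math import sin

# Illustrative sketch: a damped pendulum's orbit in state space (theta, omega)
# spirals into the point attractor (0, 0), i.e., the pendulum at rest.
def damped_pendulum(theta=1.0, omega=0.0, damping=0.5,
                    g_over_l=9.8, dt=0.01, steps=3000):
    """Semi-implicit Euler integration; returns the final (theta, omega)."""
    for _ in range(steps):
        omega += (-g_over_l * sin(theta) - damping * omega) * dt
        theta += omega * dt
    return theta, omega

final = damped_pendulum()           # released 1 radian from vertical
nudged = damped_pendulum(theta=0.8)  # a nearby starting state
# Both orbits decay toward (0, 0): nudging the system does not change
# the structure it settles onto, which is what makes (0, 0) an attractor.
```

The cyclic attractor of a pendulum clock or the torus of a quasi-periodic system could be exhibited the same way; what they share is that perturbed orbits return to the same long-term structure.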

In 1963, Edward Lorenz of MIT discovered a chaotic system in meteorology which showed exponential spreading of its previously nearby orbits in state space. The spreading effect is due to the fact that the surface on which its orbits lie is folded in state space. Such a surface is called a strange attractor, and it has proven, in fact, to be a fractal. The shape of a strange attractor resembles dough as it is mixed, stretched, and folded by a baker. With this discovery we see that “random behavior comes from more than just the amplification of errors and the loss of the ability to predict; it is due to the complex orbits generated by stretching and folding.”
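The exponential spreading of nearby orbits that Lorenz observed can be sketched directly from his equations. This example is an editorial illustration, not from the paper; the initial conditions, step size, and simple forward-Euler scheme are illustrative choices:

```python
# Illustrative sketch: two orbits of the Lorenz system that begin
# 1e-8 apart in state space separate by many orders of magnitude,
# yet both remain confined to the bounded strange attractor.
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 20.0)
b = (1.0 + 1e-8, 1.0, 20.0)   # perturbed copy of the same state
max_sep = 0.0
for _ in range(4000):          # 20 time units at dt = 0.005
    a, b = lorenz_step(a), lorenz_step(b)
    sep = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    max_sep = max(max_sep, sep)
# max_sep grows far beyond the initial 1e-8 separation: the stretching
# of the attractor drives the orbits apart, while its folding keeps
# both within a bounded region of state space.
```

This is the baker's-dough picture in miniature: stretching separates neighboring points exponentially fast, and folding keeps the whole orbit on a bounded, fractal structure.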

The essay closes with some profound questions about scientific method. If predictability is limited in chaotic systems, how can the theory describing them be verified? Clearly this will involve “relying on statistical and geometric properties rather than on detailed prediction.” What about the assumption of reductionism in simple physical systems? Chaotic systems display a level of behavioral complexity which frequently cannot be deduced from a knowledge of the behavior of their parts. Finally, the amplification of small fluctuations may be one way in which nature gains “access to novelty” and may be related to our experience of consciousness and free will.

Drees, Willem B. “Gaps for God?”

Willem B. Drees argues that theories of chaotic and complex systems have made it clearer than ever before that a naturalistic explanation of the world is possible, even in light of the lack of predictability of these systems. These theories have effectively closed certain gaps in our understanding of nature. He is therefore critical of John Polkinghorne’s suggestion that the unpredictability of natural processes provides a potential locus for divine action. Polkinghorne suggests that God brings about an input of information into the world without an input of energy. Drees claims that this is inconsistent with quantum physics and thermodynamics. In addition, Polkinghorne seems to interpret the unpredictability of chaotic systems as a sign of intrinsic openness, but this ignores the real meaning of deterministic chaos. Moreover, discarding the theory of deterministic chaos would be inconsistent with the very critical realism that Polkinghorne promotes.

However, denying any such gaps within natural processes need not foreclose all options for a religious view of reality. In fact Drees claims that science raises religious questions about nature as a whole and about the most fundamental structures of reality. To make his case, he distinguishes between two conceptions of explanation in contemporary philosophy of science. Ontic views of explanation consider an event explained if it is understood as a possible consequence of a causal mechanism. Epistemic views of explanation consider phenomena and laws explained if they are seen as part of a wider framework. Hence if one adopts an epistemic view of explanation the framework itself still requires an explanation. Along these lines, Arthur Peacocke and others have argued for divine action on the whole of natural reality: God could cause specific events in nature via “top-down” or “whole-part” causation. Drees, however, rejects the attempt to extrapolate from the context of nature as environment to the concept of God as the world’s environment.

Given the various problems with attempts to envisage God’s action in the world, Drees prefers to understand the world as God’s action. Whatever strength explanations have, there always remain limit questions about reality and about understanding which allow us to develop a religious interpretation of “secular naturalism.”

Edwards, Denis. “The Discovery of Chaos and the Retrieval of the Trinity.”

Denis Edwards begins by pointing to a major shift in science: the old worldview is giving way to a new paradigm of an open and self-organizing universe. Similarly, in systematic theology the old concept of God as the individual Subject is giving way to a relational, dynamic, trinitarian concept of God.

The first part of Edwards’ paper explores the general concept of divine action from the perspective of what many are calling a “retrieved” trinitarian theology. In the West, trinitarian theology as inherited from Augustine and Aquinas emphasized an individual and psychological model of the Trinity rather than a communitarian one. It focused on divine unity rather than three persons, and on divine being rather than divine love. The newer trinitarian theology builds instead on the writings of Richard of St. Victor and Bonaventure. Edwards outlines a theology of divine action which understands the Trinity as a communion of mutual relationships which are dynamic, ecstatic, and fecund. He argues that the universe is God’s trinitarian self-expression, that there are “proper” roles for the trinitarian persons in creation, and that divine interaction with creation is characterized by the vulnerability and liberating power of love.

The second part of the paper asks what this trinitarian theology of divine action has to say about particular divine actions, such as the incarnation, the Holy Spirit, and divine providence. Edwards explores these questions by assessing the views of John Polkinghorne and Arthur Peacocke. He finds both significant agreements as well as some disagreements between them, particularly over the issue of whether the unpredictability of chaotic systems points towards an ontological indeterminism in nature.

Edwards’ reflections can be summarized in the form of six statements: (1) The trinitarian God works in and through the processes of the universe, through laws and boundary conditions, through regularities and chance, through chaotic systems and the capacity for self-organization. (2) This trinitarian God allows for, respects, and is responsive to, the freedom of human persons and the contingency of natural processes, but is not necessarily to be denied a knowledge of future contingent events. (3) We must take into account not only the divine action of continuous creation, but also particular or special divine acts. (4) If God is acting creatively and responsively at all times and also in particular ways, then this seems to demand action at the level of the whole system as well as at the everyday level of events, and at the quantum level. (5) Particular divine acts are always experienced as mediated through created realities. (6) The unpredictability, openness, and flexibility discovered by contemporary science are significant for talk of particular divine action because they provide the basis for a worldview in which divine action and scientific explanation are understood as mutually compatible, but it is not possible or appropriate to attempt to identify the “causal joint” between divine action and created causality.

Ellis, George F. R. “Ordinary and Extraordinary Divine Action: The Nexus of Interaction.”

In “Ordinary and Extraordinary Divine Action,” George Ellis intends to elaborate the conclusions reached by Tracy, Murphy, and others concerning the role of quantum indeterminacy in a contemporary understanding of divine action. He claims that some account of special divine action is necessary if the Christian tradition is to make sense. However, there are two important constraints to be reckoned with. One is that an ideal account of divine action must not conflict with a scientific understanding of nature; the other is that some explanation must be given of why a God capable of special action would not exercise that ability regularly to oppose evil and ameliorate suffering.

Ellis’ analysis focuses on the nature of bottom-up and top-down causation in hierarchical systems. It is predicated upon the assumption that chaotic dynamics does not provide the required openness in physical systems. Furthermore, his analysis of top-down causation convinces him that this concept alone does not provide an adequate account of divine action. He distinguishes between generic top-down causation, in which boundary conditions produce a global effect upon all the entities in a system, and specific top-down causation, which involves local interactions with elements of the lower-level system. Special divine actions would seem to entail the latter. However, specific top-down causation seems to require, in turn, that there be an intrinsic openness or indeterminacy at the very lowest level of the hierarchy of complexity. Thus, a study of the possibilities for divine action via top-down causation leads inevitably to a consideration of divine action at the quantum level.

Ellis takes God’s action to be largely through the ordinary created processes. God initiates the laws of physics, establishes the initial conditions for the universe, and sustains the universe and its processes, which in turn result in the emergence of higher levels of order, including, finally, free human beings. Special divine action focuses on providing to human beings intimations of God’s will for their social lives. Thus, the problem of the mode of divine action is largely a question of how God might communicate directly with those who are open to revelation. Ellis speculates that quantum events in the brain (directed by God) might be amplified to produce revelatory thoughts, images, and emotions. If it is supposed that God has adequate reason to restrict divine action to a combination of ordinary action (in and through natural processes) and revelation (such as the Resurrection of Christ) then the problem of evil does not take on the same dimensions as it does when it is assumed that God might freely intervene in any sort of process at any time.

Finally, Ellis addresses the question of support for his view. He claims that while individual moves made in the paper (such as the focus on divine action at the quantum level) may not appear to be justified, the combined constraints imposed by the need to make sense of the Christian tradition and by science actually limit the possible acceptable positions quite severely; thus, the view herein presented is, in Ellis’ opinion, highly credible relative to the broad range of data.

Gilkey, Langdon. “The God of Nature.”

Langdon Gilkey’s paper considers two questions: whether nature’s processes suggest the existence of a God, and if so, what sort of God. However, he emphasizes that the traces of God that may be found in nature are not the main source of religious belief; for Christians, God is encountered primarily in history.

Science, including chaos theory, provides a picture of reality that combines both order and novelty; the ascending order can well be described as an order of increasing value. Reflection on this scientific picture of reality, along with the wider data of human experience (history) leads to ontology or metaphysics - the effort to understand the structure of being qua being. This level of reflection is crucial for both the scientist and the theologian. For the scientist it provides the rational grounding for science itself. Metaphysics is crucial for the theologian, since “proofs” of any sort for the existence of God are always conditional upon a particular metaphysical structuring of experience.

Consequently, the aspects of nature suggested by the sciences must be represented in ontological categories. Beyond and through the abstractions of the scientific understanding of nature, nature’s reality has manifested itself as power, as spontaneity or life, as order, and as implying a redemptive principle, a strange dialectic of sacrifice, purgation, redemption, and rebirth. In nature each of these appears as vulnerable and ambiguous as well as creative. Each of these characteristics, therefore, raises a “limit question,” and thus represents a trace of God. For example, what is the deeper, more permanent power that makes possible the transitory appearance of power in nature? ‘God’ is the name for that ever-continuing source of power. To know God truly is to know God’s presence in the power, life, order, and redemptive unity of nature; to know nature truly is to know its mystery, depth, and ultimate value - to know it as an image of the sacred.

Happel, Stephen. “Divine Providence and Instrumentality: Metaphors of Time in Self-Organizing Systems and Divine Action.”

According to Stephen Happel, Christian theology in the thought of Thomas Aquinas has a coherent understanding of the interaction between God and creation. By developing a clear theory of transcendence and of universal instrumentality, Aquinas was able to articulate the basic ways in which inanimate, animate, and human secondary causes cooperate or conflict with the divine act of love for the universe (i.e., providence). These terms can be transposed into an historical ontology and a language of mutual mediation such that all levels of reality have their relative autonomy. Contemporary science, with its analysis of self-organizing systems, provides an understanding of the regularities and contingencies of inanimate and animate created realities. Its language permits us to understand how an open, flexible universe can provide the conditions for cooperation with one another and with divine action without conflict or violence to the integrity of creation.

Happel’s analysis is basically optimistic. It is born of a religious conviction that though the cosmos (whether human or non-human) is flawed and finite, its internal logic is not vitiated, malicious, or deceptive. Images, the body, and the non-verbal are no more (and no less) prone to sin than reason. Within the temporal being of “nature,” self-organizing, living, self-conscious beings can engage with their environments in a cooperative way. Ultimately, Happel argues that self-conscious creatures may learn that cooperating with the ultimate environment, an unfathomable Other, will not do violence to their own complex teleonomies.

The Christian claim, however, goes further. It maintains that this mysterious enveloping environment is involved in a mutual self-mediation with creation. When one is in love, one mediates oneself in and through an other who is discovering, planning, negotiating his or her personal identity in and through oneself. That is mutual self-mediation. Christians claim that they are not merely projecting themselves abstractly into an alien environment to mediate themselves, but that the Other has chosen out of love to mediate the divine subjectivity in and through natural self-organization (because God is ultimately a community of mutual self-mediation). The story of the Christ could have been quite different than it was. Jesus could have mediated himself in some other fashion, but he did not. He chose to offer his life for others in self-sacrificing generosity. In this action, he operated as though neither the natural nor the human environment nor God were an enemy. In loving creation, entrusting his own life to others, even in death, faith claims that there is here a divine love. This is what Happel has called elsewhere the “double dative of presence.” We are present to the divine who in that same movement is present to us. What we discover in this fragile and stumbling process of mediating ourselves and our world is an antecedent lover and friend.

Heller, Michael. “Chaos, Probability, and the Comprehensibility of the World.”

“The eternal mystery of the world is its comprehensibility.” This is, of course, Albert Einstein’s famous claim, and it serves as the point of departure for Michael Heller’s paper. According to Heller, this mystery is present in our prescientific cognition, but it reveals itself in full light only when one contemplates what Eugene Wigner called “the unreasonable effectiveness of mathematics in the natural sciences.” It is not a priori self-evident that the world should be “algorithmically compressible,” that is, that many of its phenomena should be captured by a few mathematical formulae.

There have been attempts to neutralize this wonder by reducing all regularities in the universe to the blind game of chance and probability. Heller briefly reviews two such attempts: the so-called chaotic gauge program and André Linde’s chaotic inflation in cosmology. If complete anarchy is the only law of nature (“laws from no laws”), then the fundamental rationality of the world is lost. The problem is important from a theological point of view. At the most fundamental level, God’s action in the world consists in giving the world its existence and giving it in such a way that everything that exists participates in its rationality, that is, is subject to the mathematically expressible laws of nature. If the ideology of the “pure game of chance and probability” turns out to be correct, then God’s action seems to be in jeopardy.

Heller responds by arguing that such attempts to neutralize the “mystery of comprehensibility” lead us even deeper into the problem. Probability calculus is as much a mathematical theory as any other, and even if chance and probability lie at the core of everything, the important philosophical and theological problem remains of why the world is probabilistically comprehensible. The probabilistic compressibility of the world is a special instance of its mathematical compressibility. Heller clarifies this point by reminding us that there are two kinds of elements (in the Greek sense of this word) in the universe - the cosmic elements, such as integrability, analyticity, calculability, predictability; and the chaotic elements, such as probability, randomness, unpredictability, and various stochastic properties. The chaotic elements are in fact as “mathematical” as the cosmic ones. If the cosmic elements provoke the question of why the world is mathematical, the same is true of the chaotic elements. In this view, cosmos and chaos are not antagonistic forces but rather two components of the same Logos immanent in the structure of the universe.

Küppers, Bernd-Olaf. “Understanding Complexity.”

According to the paper by Bernd-Olaf Küppers reprinted here, the reductionistic research program “is based on the central working hypothesis that all biological phenomena can be explained totally within the framework of physics and chemistry.” It assumes that there is no essential difference between non-living and living matter; life arises as a “quasi-continuous” transition requiring no additional epistemic principles other than those of physics and chemistry. Restrictions in our current understanding are merely the result of the complexity of the problem and its computability. Epistemic reductionism leads to ontological reductionism in which “life is nothing but a complex interplay of a large number of atoms and molecules.” Even consciousness must ultimately be reducible to physical laws.

To counter this program, some biologists and philosophers of science appeal to “emergence” and “downward causation,” claiming that genuinely novel properties and processes arise in highly complex phenomena. According to this view, physics is a necessary part of the explanation but it cannot provide a sufficient explanation on its own. Küppers summarizes the claims of emergence and downward causation, respectively, as follows: “(1) The whole is more than the sum of its parts. (2) The whole determines the behavior of its parts.”

Since these concepts seem “vague and mysterious” to scientists in physics and biology, Küppers focuses here on a general problem concerning the transition from the non-living to the living: Can we adequately characterize the emergence of life in terms of the concept of complexity? Küppers thinks not, since non-living systems may themselves be extraordinarily complex. In addition, one may find evidence of emergence even within a field, such as within physics, and not just between fields.

In a similar way, those supporting intratheoretical reduction (e.g., reductionism within physics) frequently appeal to “bridge laws,” while defenders of emergence deny their availability and their fruitfulness. Arguments such as these also apply to the question of downward causation. In Küppers’ opinion, both emergence and downward causation are to be found within physics. Since no “non-physical principle” is involved, apparently, in the transition to life, Küppers concludes that “both (emergence and downward causation) must be thought of as characteristics of self-organizing matter that appear at all levels when matter unfolds its complexity by organizing itself.” Still, there are examples of biological systems, such as the DNA macromolecule, which are immensely more complex than complex physical systems. Do they point to a limitation in physical method or in the reductionistic research program, or will physics undergo a paradigm shift as it seeks to encompass these phenomena within its domain?

To understand these questions better, Küppers begins by distinguishing between laws and initial or boundary conditions in physical theory. His central claim is that “the complexity of a system or a phenomenon lies in the complexity . . . of its boundary conditions.” Following the analysis of Michael Polanyi, Küppers argues that in a human construction, such as a complex machine, the design, or boundary conditions, governs the physical processes but cannot be deduced from them. In this way a machine, by its structure and operation, is an emergent system, a whole which is “neither additive nor subtractive,” whose properties cannot be reduced to those of its components, and whose boundary conditions represent a form of downward causation. A similar case can be made for a living organism.

Now the question becomes, what determines the boundary conditions? For a machine, the answer is a blueprint. For the living organism, however, the “blueprint” lies in the organism’s genome which, in contrast to the machine, is an inherent part of the living system. Küppers then distinguishes complex from simple systems in terms of both their sensitivity to small changes in their boundary conditions and the uniqueness of these conditions, given all possible physically equivalent conditions.

The concept of boundary conditions thus becomes the key to understanding the paradigm shift that is occurring within physics regarding the problem of complex phenomena. This shift is not of the Kuhnian type, with its revolutionary change in the fundamental laws and the theoretical framework of a field. Instead it is an “internal shift of emphasis” within the given explanatory structure of the paradigm. As Küppers sees it, the shift of emphasis within the reductionistic research program consists in the move to regard the boundary conditions of complex phenomena as that which needs explanation. He calls this shift of emphasis the “paradigm of self-organization.” It entails a sequence of explanations, in which boundary conditions at one level (such as the boundary conditions of the DNA molecule) are explained by those of another level (such as the random molecular structures), which themselves need explanation. In effect, the nested structures found in living matter are reflected by the nested structures of the paradigm of self-organization.

Finally, Küppers points out that biological self-organization is only possible in the context of non-equilibrium physics. Still, though the existence of specific boundary conditions can be understood within the framework of physics, their detailed physical structure cannot be deduced from physics. “The fine structure of biological boundary conditions reflects the historical uniqueness of the underlying evolutionary process” and these, by definition, transcend the powers of natural law to describe.

Moltmann, Jürgen. “Reflections on Chaos and God’s Interaction with the World from a Trinitarian Perspective.”

In his paper, Jürgen Moltmann first describes five models of the God-world relation: (1) According to the Thomistic model, God is the causa prima of the world. God also acts through the causae secundae which serve as God’s instruments. (2) The interaction model postulates a degree of reciprocal influence between God and the world. This model can include the Thomistic model, but not vice versa. (3) The whole-part model, taken from biological systems theory, emphasizes that the whole is more than and different from its parts. In complex and chaotic systems this difference shows up in the form of top-down causality. The whole-part model is more inclusive than the previous models and sheds light on God’s indirect effect upon the world as a whole. (4) The model of life processes emphasizes the open character of biological systems. The present state of a living system is constituted by its fixed past and its open future or, more generally, by what can be called tradition and innovation. Here the world process is open to God as its transcendental future. (5) Finally, Moltmann considers two central theological models: creation and incarnation. Here God creates by a process of self-limitation (or tzimtzum). The limitation on God’s omnipresence creates a habitation for the world; the limitation on God’s omniscience provides the world with an open future. God’s self-limitation allows God to be present within the world without destroying it. Moltmann believes this model is the most inclusive of the five.

Moltmann next offers three comments on how these models function in current theological discussions about chaotic, complex and evolutionary systems.

(1) He is critical of the interaction model, seeing it as a theistic model in which God is the absolute Subject who may intervene at will in nature. In the modern period it was replaced by two even more problematic models: deism and pantheism. In their place Moltmann commends to us a trinitarian model in which “God the Father creates through the Logos/Wisdom in the power of the Holy Spirit. . . . God not only transcends the world but is also immanent in the world.” According to this model God acts upon the world through God’s presence in and perichoresis with all things.

(2) Next Moltmann discusses eschatology, the new creation of all things. For Moltmann the future is not a state of completion but a process of continuing openness, in which all finite creatures will participate in God’s unending and open eternity even as God participates in their temporality. The openness of chaotic, complex, and evolutionary systems is suggestive of this vision, and seems inconsistent with a future conceived of as completed. “The future of the world can only be imagined as the openness of all finite life systems to the abundance of eternal life. In this way they can participate in the inexhaustible sources of life and in the divine creative ground of being.”

(3) Finally Moltmann asks whether the universe as a whole should be thought of as an open system. The growth of possibilities for such systems, their undetermined character, and their dependence on an influx of energy suggest that the universe itself might be open to energy. “In this case the world would be a ‘system open to God’ and God a ‘Being open to the world.’”

Murphy, Nancey. “Divine Action in the Natural Order: Buridan’s Ass and Schrödinger’s Cat.”

Nancey Murphy directs our attention away from chaos and complexity to the arena of quantum physics. In her paper, Murphy argues that the problem of divine action will be solved by nothing less than a revised metaphysical theory of the nature of matter and of natural causes. Her proposal is that we view the causal powers of created entities as inherently incomplete. No event occurs without divine participation but, apart from creation ex nihilo, God never acts except by means of cooperation with created agents. Her paper attempts to show how this account can be reconciled with contemporary science, focusing on divine action at the quantum level.

First Murphy proposes criteria, derived from both theology and science, which any satisfactory theory of divine action must meet. She claims that it must allow for objectively special divine acts, yet not undercut our scientific picture of the law-like regularity of many natural processes. Then she surveys changes in metaphysical views of matter and causation from ancient to modern philosophy. The historical survey is intended to put in question current metaphysical assumptions about the nature of matter and of natural causes, as a prelude to considering the consequences of recent developments in science for these metaphysical issues.

Murphy’s proposal is that any adequate account of divine action must include a “bottom-up” approach: if God is to be active in all events, then God must be involved in the most basic of natural events. Current science suggests that this most basic level is quantum phenomena. It is a bonus for theology that we find a measure of indeterminacy at this level, since it allows for an account of divine action wherein God has no need to overrule natural tendencies or processes. This cooperation rather than coercion is in keeping with God’s pattern of respecting the integrity of other higher-level creatures, especially human creatures.

Consequences of this proposal are spelled out regarding the character of natural laws and regarding God’s action at the macroscopic level. One of these consequences is that the “laws of nature” must be descriptive, rather than prescriptive; they represent our human perceptions of the regularity of God’s action. In the end, she replies to some of the objections that have been raised against theories of divine action based on quantum indeterminacy and explains how the essay’s proposal meets the criteria of adequacy set out in the beginning.

Peacocke, Arthur. “Chance and Law in Irreversible Thermodynamics, Theoretical Biology, and Theology.”

Arthur Peacocke’s topic in this reprint is the general relationship between chance and law in thermodynamics and biology, and its implications for belief in God as creator.

Chance may be the means for actualizing the possibilities of the world, but it need not be seen as a metaphysical principle opposed to order or undercutting the meaning of life. Chance actually has two quite distinct meanings: it can refer simply to our ignorance of the processes which underlie an event, or it can refer to unpredictable intersections of previously unrelated causal chains. Recently, the interplay between chance and law has come to be seen as crucial to the origin and development of life, particularly through the work of Jacques Monod in molecular biology and Ilya Prigogine in irreversible thermodynamics and theoretical biology. As Monod emphasizes, evolution depends on chance, in the sense of two independent causal chains, operating in living organisms: one is at the genetic level, including changes in the nucleotide bases of DNA; the other is at the level of the organism, including interactions between the organism expressing these changes and its environment. Chance also arises here in the sense that we cannot now (nor may we ever be able to) specify the mechanisms underlying genetic mutations.

Though agreeing with him this far, Peacocke challenges both Monod’s generalization of the role of chance from the context of evolution to the whole of human culture, and his subsequent conclusion that life is meaningless. Instead, Peacocke sees chance as the means by which all possibilities for the organization of matter are explored in nature.

Peacocke then turns to irreversible thermodynamics and theoretical biology. Thermodynamics is “the science of the possible” which prescribes how nature can behave. Classical thermodynamics, with its focus on systems in equilibrium, centers on the second law of increasing entropy in closed systems. Through the statistical thermodynamics of Boltzmann this came to be seen as increasing disorder or randomness in closed systems. How, then, do living organisms maintain themselves in a high state of organization and a low state of entropy, given the second law? The answer, as Peacocke points out, is that living systems are open to their environment. By exchanging energy and matter with it they can decrease in entropy as long as there is a net increase in the entropy of the environment.

But does thermodynamics help us to understand how more complex organisms come to be in the first place? The answer comes only with the extension of classical thermodynamics, first to linear, and then to non-linear, irreversible processes involved in what are called dissipative structures. According to Prigogine, if fluctuations in these non-linear, non-equilibrium structures are amplified, they can change the structures and result in new, more ordered states. The answer also includes the key role played by multiple and relatively stable strata in the hierarchy of biological complexity. These intermediate strata enhance the rate of evolution of more complex organisms from very simple ones, in effect directing evolution towards increased complexity. In essence, the evolution of chemical, pre-biological, and biological complexity is seen as probable, perhaps even inevitable, although the particular path taken in nature is unpredictable. Still, detailed kinetic and dynamic requirements, as well as thermodynamic ones, must be met for evolution to occur.

Peacocke then turns briefly to theological reflections. God is creator of the world through a timeless relation to it in two ways. God is totally other than the world, its transcendent ground of being. God is also immanent in the world, continuously creating all that is through its inbuilt evolutionary processes. These processes, revealed by the natural sciences, are in fact God’s action in the world, and eventually include the evolution of humanity. Thus, all-that-is is in God, but God is “more” than nature and humanity. The complex interplay of law and chance is itself “written into creation by the creator’s intention and purpose,” to emerge in time by the explorations of nature. Here Peacocke suggests the metaphor of God as a musical composer and nature as God’s composition, perhaps like a rich fugue.

But does this metaphor carry deistic overtones, as H. Montefiore claims? Not according to D. J. Bartholomew, who sees chance as conducive to the production of a world in which freedom can operate purposefully. Still, the best response to the charge of deism, as Peacocke emphasizes, is to see God’s action as immanent within natural processes. Moreover, as Rustum Roy points out, the interplay of chance and law in nature means that we should accept a similar interplay as characteristic of God’s creativity in human life and society, and we should be critical of belief in a God who “intervenes in the natural nexus for the good or ill of individuals and societies.” Peacocke concludes that just as it takes a stream to have eddies, it is the existence of the universe, flowing as it does towards overall increasing entropy, that is required if there are to be eddies of biological life.

Peacocke, Arthur. “God’s Interaction with the World: The Implications of Deterministic ‘Chaos’ and of Interconnected and Interdependent Complexity.”

According to Arthur Peacocke, the long-established aim in science of predicting the future macroscopic states of natural systems has recently come to be recognized as unattainable in practice for those systems capable of manifesting “deterministic chaos.” The possibility of prediction has also been closely associated with the conviction that there is a causal nexus which scientific procedures will unambiguously ascertain. In this paper, Peacocke surveys the applicability of these concepts with respect to relatively simple, dynamic, law-obeying systems; to statistical properties of assemblies; to Newtonian systems which are deterministic yet unpredictable; and to “chaotic” and “dissipative” systems. In doing so he also analyzes the limitations to predictability stemming from quantum theory.

Chaotic and dissipative systems prove to be unpredictable in practice, primarily because of the nature of our knowledge of the real numbers, and possibly (and more problematically) because of quantum uncertainties. The notion of causality still proves to be applicable to these systems in an unambiguous, even if only probabilistic, fashion. However, for many significant interconnected and interdependent complex systems the concept of causality scarcely seems applicable, since whole-part constraints operate, whereby the state of a system-as-a-whole influences what occurs among its constituents at the microscopic level. Peacocke acknowledges that in the past this phenomenon has also, perhaps somewhat misleadingly, been denoted by himself and others as “downward” or “top-down” causation, in particular in relation to evolution and to the brain-body relation.

Peacocke then considers how to conceive of God’s relation to the world in light of the modifications which the phenomena of deterministic chaos and dissipative systems, on the one hand, and of “whole-part constraints,” on the other, have induced in the scientific concepts of predictability and causality. Consideration of the former must take account of the possible, and as yet unclear, effects of quantum uncertainty on chaotic and dissipative systems. Peacocke concludes that, whatever is decided about those effects, the unpredictabilities for us of non-linear chaotic and dissipative systems do not, as such, help us articulate more coherently and intelligibly how God interacts with the world, however illuminating they are about the flexibilities built into natural processes. The discussion rests in part on the assumption that God logically cannot know the future, since it does not exist for God to know.

However, Peacocke argues that the notion of “whole-part constraints” in interconnected and interdependent systems does provide a new conceptual resource for modeling how God might be conceived of as interacting with and influencing events in the world. This is particularly true in conjunction with a prime exemplification of the whole-part constraint, the unitive relation of the human-brain-in-the-human-body; indeed, this model of personal agency is the biblical and traditional model for God’s action in the world. He evokes the notion of a flow of information as illuminating this “whole-part” interaction of God with the world, which could then be conceived of as a communication by God to that part of the world (namely, humanity) capable of discovering God’s meanings.

Polkinghorne, John. “The Metaphysics of Divine Action.”

In “The Metaphysics of Divine Action,” John Polkinghorne notes that any discussion of agency requires the adoption of a metaphysical view of the nature of reality. He claims that there is no “deductive” way of going “from epistemology to ontology,” but the strategy of critical realism is to maximize the connection. This leads most physicists, he claims, to interpret Heisenberg’s uncertainty principle as implying an actual indeterminacy in the physical world, rather than an ignorance of its detailed workings.

Polkinghorne is critical of physical reductionism, which makes unsubstantiated and implausible claims for the explanatory power of the idea of self-organizing systems. Moreover, it focuses strictly on the generation of large-scale structure rather than the temporal openness necessary to accommodate agency. A theological appeal to divine primary causality is too vague to yield an understanding of providential action. We need not be stymied by the problem of the “causal joint” that makes this possible. Top-down causality is a valuable idea, but it is not unproblematic and its plausibility depends upon exhibiting intrinsic gaps in the bottom-up description in order to afford it room for maneuver.

Polkinghorne believes that such gaps might originate from indeterminate quantum events. However, there are problems about amplifying their effects, and the idea also leads to an episodic account of divine agency. Polkinghorne prefers an approach based upon interpreting the unpredictabilities of chaotic dynamics (in accord with the strategy of critical realism) as indicating an ontological openness to the future whereby “active information” becomes a model for human and divine agency. He interprets sensitivity to small triggers as indicators of the vulnerability of chaotic systems to environmental factors, with the consequence that such systems have to be discussed holistically. It is not supposed, however, that such triggers are the local mechanism by which agency is exercised.

The resulting metaphysical conjecture Polkinghorne calls a complementary dual-aspect monism, in which mind and matter are opposite poles or phases of the single stuff of created reality. This scheme is antireductionist, stressing instead a contextualist approach in which the behavior of parts depends on the whole in which they participate. Polkinghorne then discusses some of the consequences of adopting this point of view, including the insight that divine agency has its own special characteristics and that God’s knowledge of the world of becoming will be truly temporal in character.

Stoeger, William R. “Describing God’s Action in the World in Light of Scientific Knowledge of Reality.”

The approach to divine action taken by William Stoeger is to accept with critical seriousness our present and projected knowledge of reality from the sciences, philosophy and other disciplines, including theology which has already developed in response to the sciences. By “critical seriousness,” Stoeger means that this knowledge, though critically assessed by the disciplines themselves, by philosophy and by the other human sciences, does indeed indicate something about the realities it talks about. Stoeger then integrates these results into a roughly sketched theory of God’s action. Implicit here is the methodological problem of how the languages of science and theology are to be integrated.

Next Stoeger employs a philosophical presupposition which he calls a “weakly critical realist” stance. Included are elements of Aristotelianism and Thomism, particularly the notions of primary and secondary causality. These seem to him more adequate to both the scientific and the theological data. They also lead to fewer difficulties in explicating the essential differences between God and God’s creation, the relationships between them, and the ideas of divine immanence and transcendence. Stoeger uses the term ‘law’ in the context of both physical processes and free human actions to mean any pattern, regularity, process, or relationship, and its description. ‘Law’ is thus used to describe or explain order. It does not necessarily imply determinism.

Stoeger concludes this section by saying that there are aspects of divine action which we are able to understand better by letting science and theology critically interact. There are other aspects which seem thoroughly resistant to our understanding, particularly the nexus between God and the secondary causes through which God acts, or between God and the direct effects of divine action, such as creatio ex nihilo. The analog of human agency is of some limited help here. However, the principal barrier is that we could only know the critical nexus if we ourselves were divine, or if God revealed such knowledge to us.

Turning to the problem of God’s action, Stoeger argues that if God acts through secondary causes it would seem to require the injection of information, and therefore energy, from outside the physical system. Though we cannot rule out such injections, they have never been observed and are unattractive from many points of view. Some scholars try to evade this problem by allowing God to influence events at the quantum level. Stoeger admits that this is a solution, and may in fact be the case, but he finds it unattractive. God’s working through secondary causes is almost always a function of God’s invitation, or response, to persons. To locate such divine action at the quantum level removes it from the level of the personal. It is also unclear whether its intended effects can surface at the level of the complex and the personal.

Stoeger provides no answer to this issue, but he believes that the framework he has established may move us in the right direction. Either there is some injection of information and energy at the level of personal relationships, or God works within what is already given to make the recipient more receptive to what is available. Stoeger prefers the latter, though something of the former may be involved as well. The difficulty with higher regularities subsuming those at the lower level is that we usually experience the lower-level laws as constraining what can be done at a higher level without themselves being supplanted. Nevertheless a great deal is left underdetermined by the lower-level constraints, within which agents, including God, can function.

Tracy, Thomas F. “Particular Providence and the God of the Gaps.”

Thomas Tracy’s paper takes up a persistent modern problem in relating scientific descriptions of the world as a natural order to theological claims about divine action. Do some traditional ways of speaking about divine action require “gaps” in the causal order and therefore incompleteness in scientific explanations? This appears to be the case, for example, if we claim that God acts in the world at particular times and places to affect the unfolding course of events. Must this kind of theological claim compete with scientific descriptions of the world, so that we cannot both explain an event scientifically and affirm it as a particular divine action?

Tracy considers three strategies which reply to these questions. The first avoids conflict between scientific and theological claims by insisting that, strictly speaking, God does not act in history but rather enacts history as a whole. Its paradigmatic modern development comes from Friedrich Schleiermacher, who holds that every event both stands in a relation of absolute dependence upon God’s immediate agency and is integrated into a complete system of natural causes. On this account, particular events can be singled out as acts of God only in the sense that they especially evoke in us a recognition of God’s universal activity, or play a distinctive role in advancing divine purposes “built into” the causal processes of nature. This eliminates any risk of conflict between science and theology, but it does so at the cost of imposing significant limits on the claims that can be made, for example, about the person and work of Christ, about the divine-human interaction, and about human freedom and the problem of evil. Tracy considers a contemporary and widely influential version of this strategy developed by Gordon Kaufman. Unfortunately, Kaufman’s proposal, Tracy argues, leaves us with a series of questions about how God can be understood to enact history without acting in history.

The second strategy affirms that God does act in the world to affect the course of events, but holds that this does not require any gaps in the causal structures of nature. There are at least two recent proposals that take this form. Brian Hebblethwaite contends that God acts in and through the causal powers of creatures, so that the whole network of created agencies is “pliable, or flexible, to the providential hand of God” without any gaps in the natural order. This leaves the crucial puzzle unsolved, however; for if God affects the course of events once they are underway, then an explanation of those events that appeals strictly to other finite causes must be incomplete. John Compton has suggested another way to pursue this second strategy. Just as we routinely describe certain movements of the human body both as a series of physical events and as intentional action, so we can describe events in the world both as part of a causally complete natural order and as acts of God. Compton’s proposal hinges on the claim that the languages we use in discussing physical events, on the one hand, and intentional actions, on the other, are not interdependent. But this claim, Tracy argues, cannot be sustained even within the terms of Compton’s own discussion. These two versions of the second strategy, then, are undone by internal inconsistencies.

The third strategy grants that theologically motivated talk of particular divine action carries with it a commitment to the causal incompleteness of the natural order, and then argues that this is at least consonant with contemporary physical theory. Two key issues must be addressed by any such proposal. First, a case must be made that the natural sciences now describe a world whose causal structure is “open” in certain respects. Second, it must be shown that this openness is relevant to the theological concern with divine action. Tracy argues that chaos theory, for all its power to demonstrate the limits of predictability, does not provide the needed openness, since it presupposes an unbroken causal determinism. More promising are interpretations of quantum mechanics that acknowledge the role in nature of indeterministic chance. With regard to the second question, Tracy contends that such chance (whether at the quantum level or elsewhere) will be theologically interesting if the determination of such events by God can make a macroscopic difference. If so, then God could affect the course of events without disrupting the structures of nature, since they will provide for both novelty and regularity in the world. 

Wildman, Wesley J. and Robert John Russell. “Chaos: A Mathematical Introduction with Philosophical Reflections.”

Wesley J. Wildman and Robert John Russell’s article surveys the mathematical details of a single equation, the logistic equation, which has become a hallmark of this field, at least within the circles of “theology and science.” The logistic equation displays many of the generic features of chaotic dynamical systems: the transition from regular to apparently random behavior, the presence of period-doubling bifurcation cascades, the influence of attractors, and the underlying characteristics of a fractal. They then raise philosophical questions based on the mathematical analysis and conclude with possible theological implications.

The logistic equation is a simple quadratic equation or “map,” x_{n+1} = k x_n (1 − x_n), which iteratively generates a sequence of states of the system represented by the variable x. The tuning constant k represents the influence of the environment on the system. One starts from an initial state x_0 and a specified value of the tuning constant k to generate x_1. Substituting x_1 back into the map generates x_2, and so on. Although incredibly simple at face value, the logistic map actually displays remarkably complex behavior, much of which is still the focus of active scientific research.
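The iteration just described can be sketched in a few lines of Python (an illustration of our own, not part of the article; the function name and the values k = 2.0 and x_0 = 0.2 are arbitrary choices). For this k the sequence converges to the fixed point 1 − 1/k:

```python
def logistic_iterates(k, x0, n):
    """Iterate the logistic map x_{n+1} = k * x_n * (1 - x_n) n times."""
    xs = [x0]
    for _ in range(n):
        xs.append(k * xs[-1] * (1 - xs[-1]))
    return xs

# For k = 2.0 the sequence settles on the fixed point 1 - 1/k = 0.5.
final = logistic_iterates(2.0, 0.2, 50)[-1]
print(final)  # 0.5 (to machine precision)
```

Any starting value strictly between 0 and 1 would settle on the same limit here; the interesting behavior begins only at larger k.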

The behavior of the iterated sequence produced by the logistic map can be divided into five regimes. The constant k determines which regime the sequence occupies as well as much of the behavior within that regime. In Regime I, the sequence converges to 0. In Regime II, the sequence converges on a single positive limit which depends on k. In Regime III, bifurcations set in, and the period of the limiting cycle increases in powers of two as k increases. Moreover, the initial conditions have a significant permanent effect on the system in the form of “phase shifts.” In Regime IV, chaos sets in. Here chaotic sequences are separated by densely packed bifurcation regions and there is maximal dependence on initial conditions. For most values of k, the sequences seem to fluctuate at random and the periodic points found in previous regimes appear to be absent. Nevertheless, for almost all values of k we actually find highly intricate bifurcation structures, and the sequences fall within broad bands, suggesting an underlying orderliness to the system. Finally, in Regime V, chaos is confined to a Cantor subset of the interval.
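The first bifurcation of Regime III is easy to verify numerically. The following Python sketch (again our own illustration; k = 3.2 and x_0 = 0.2 are arbitrary choices within that regime) runs past the transient and confirms that the sequence settles into a cycle of period two rather than a single limit:

```python
def iterate(k, x0, n):
    """Return the n-th iterate of the logistic map from x0."""
    x = x0
    for _ in range(n):
        x = k * x * (1 - x)
    return x

k = 3.2                    # Regime III: the first period-doubling has occurred
x = iterate(k, 0.2, 1000)  # run long enough to pass the transient
x1 = k * x * (1 - x)       # one further step
x2 = k * x1 * (1 - x1)     # two further steps

print(abs(x - x2) < 1e-9)  # True: the sequence repeats every second step...
print(abs(x - x1) > 0.1)   # True: ...but not every step, i.e. a period-2 cycle
```

Raising k toward roughly 3.57 doubles the period again and again, until the cascade gives way to the chaos of Regime IV.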

There is no universally accepted mathematical definition of chaos capturing all cases of interest.  Defining chaos simply as randomness proves too vague because this term acquires new and more precise shades of meaning in the mathematics of chaos theory.  Defining chaos in terms of sensitive dependence on initial conditions (the butterfly effect) results in the inclusion of many maps that otherwise display no chaotic behavior.  The definition adopted here requires a chaotic map to meet three conditions:  mixing (the effect of repeated stretching and folding), density of periodic points (a condition suggesting orderliness), and sensitive dependence.  Interestingly, in the case of the logistic map and many similar chaotic maps, mixing is the fundamental condition, as it entails the other two. 

The paper also addresses the question of the predictability of chaotic systems. On the one hand, a chaotic system such as the logistic map is predictable in principle, since the sequence of iterations is generated by a strict governing equation. On the other hand, chaotic systems are “eventually unpredictable” in practice, since most values of the initial conditions cannot be specified precisely, and even if they could, the information necessary to specify them cannot be stored physically. Yet these systems are also “temporarily predictable” in practice, since one can predict the amount of time that will elapse before mathematical calculations cease to match the state of the system. This leads to a definition of ‘chaotic randomness’ as a tertium quid between strict randomness (as in one common interpretation of quantum physics) and the complete absence of randomness.
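The contrast between temporary and eventual unpredictability can be seen by iterating the map from two nearly identical starting points. In the sketch below (our own illustration; k = 3.9 lies in the chaotic regime, and the starting values and thresholds are arbitrary choices), a discrepancy of 1e-10 in the initial condition is invisible for the first few steps but grows to order one within a few dozen iterations:

```python
def trajectory(k, x0, n):
    """Return the first n+1 iterates of the logistic map from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(k * xs[-1] * (1 - xs[-1]))
    return xs

k = 3.9                             # chaotic regime
a = trajectory(k, 0.2, 60)
b = trajectory(k, 0.2 + 1e-10, 60)  # perturb the tenth decimal place

# Temporarily predictable: after 5 steps the trajectories still agree closely.
print(abs(a[5] - b[5]) < 1e-6)      # True
# Eventually unpredictable: within a few dozen steps the discrepancy is large.
print(max(abs(u - v) for u, v in zip(a[40:], b[40:])) > 0.1)  # True
```

This is the practical face of the butterfly effect: the governing equation is strict, yet any finite-precision specification of x_0 exhausts its predictive power after a calculable number of steps.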

What implications does mathematical chaos have for a philosophy of nature? It is superficial to say that the mathematical determinism of chaotic equations requires metaphysical determinism in nature, because of complexities in the experimental testing of the mathematical models used in chaos theory. In particular, it may be very difficult to distinguish phenomenologically between chaos, sufficiently complicated periodicity, and strict randomness, even though these are entirely distinct mathematically. There are additional practical limitations to the testing of chaotic models of natural systems, including sensitivity to the effects of the environment (such as heat noise or long-range interactions), and the fact that the development of the physical system eventually outpaces even the fastest calculations.

Two philosophical conclusions are drawn from this. On the one hand, the causal whole-part relations between environment and system, the causal connectedness implied in the butterfly effect, and the fact that much of the apparent randomness of nature can now be brought under the umbrella of chaos are best seen as supporting evidence for the hypothesis of metaphysical determinism. On the other hand, there are profound epistemic and explanatory limitations on the testing of chaos theory due to the peculiar nature of chaotic randomness. In this sense, chaos theory places a fundamental and unexpected new limit on how well the hypothesis of metaphysical determinism can be supported.

On the basis of these philosophical conclusions, what relevance does chaos theory have for theology? On the one hand, it will be “bad news” to those who simply assume that nature is open to the free actions of God and people, and particularly bad news to those who mistakenly appeal to chaos theory to establish this.  On the other hand, chaos theory will be irrelevant to theologians operating with a supervening solution to the problem of divine action, such as Kant’s, that is able to affirm human freedom and divine action even in the presence of strict metaphysical determinism.  At still another level chaos theory is “good news” to the theological project and “bad news” for “polemical determinists.”  Due to the fundamental, new limitation in the testability of chaos theory, one can never fully exclude the possibility that classical physics as we now have it, including chaos theory, will be replaced by a better model of the world at the classical level which allows for divine causality in some way. This “opens a window of hope for speaking intelligibly about special, natural-law-conforming divine acts, and it is a window that seems to be impossible in principle to close.”

The article includes an extended bibliography of textbooks, key technical articles, experimental applications, useful introductions and surveys, and selected works on chaos theory and theology.

Evolutionary and Molecular Biology

This collection of twenty-two research papers is the result of the third of five international research conferences co-sponsored by the Vatican Observatory in Rome and the Center for Theology and the Natural Sciences.  The papers explore the creative interaction between evolutionary and molecular biology, philosophy, and theology.  The topics addressed in the papers include an extensive introduction to the scientific background; evolution and divine action particularly in terms of teleology; religious interpretations of biological themes especially as related to evolution; and the problem of evil from a biological and ethical perspective.

Ayala, Francisco J. “The Evolution of Life: An Overview.”

According to Francisco Ayala, the evolution of organisms, that is, their descent with modification from a common origin, is at the core of biology. Though evolution is universally accepted by biologists, its mechanisms are still actively investigated and debated by scientists. Darwin’s explanation was essentially correct but incomplete, awaiting the discoveries and power of genetics and molecular biology. Ayala then distinguishes between two questions: whether and how evolution happened.

Ayala briefly traces historical sources and then focuses on Darwin, who proposed natural selection to account for the adaptive organization of living creatures and their apparent purpose or design. Missing in Darwin’s work was a theory of inheritance that would account for the preservation of the variations on which selection could act. Mendelian genetics eventually provided the “missing link.” In addition, Weismann’s germ-plasm theory helped counter the Lamarckian alternative to Darwin and contributed to the neo-Darwinian theory that emerged out of the nineteenth century. Further progress came from Dobzhansky in the 1930s. In 1953, Watson and Crick discovered the structure of DNA. In 1968, Kimura’s work on “molecular clocks” made possible a reconstruction of the evolutionary history of life with its many branchings. Finally, the recent techniques of DNA cloning and sequencing have provided additional knowledge about evolution.

Next Ayala discusses three related issues: the fact of evolution, the details of evolutionary history in which lineages split, and the mechanisms by which evolution occurs. The first, that organisms are related by common descent with modification, is both fundamental to evolution and heavily supported by the evidence. The second and third are mixed, with some conclusions well established and others less so. Before delving into the details, Ayala briefly comments on the mix of responses to evolution from the religious communities. It can seem incompatible to those holding to a literal interpretation of Genesis, the immortality of the soul, or humans created in the image of God. To others, God is seen as operating through intermediate, natural causes, including those involved in evolution. Here Ayala cites Pope John Paul II’s recent comments on evolutionary biology. Ayala then turns to a detailed exposition of the evidence for evolution, drawing on paleontology, comparative anatomy, biogeography, embryology, biochemistry, molecular genetics, and other fields. He focuses on the question of speciation, including models such as adaptive radiation for how reproductive isolation arises. After giving a reconstruction of evolutionary history, Ayala concludes his essay by discussing gradual and punctuated evolution, DNA and protein evolution, the molecular clock of evolution, and human evolution.

Ayala, Francisco J. “Darwin’s Devolution: Design Without Designer.”

According to Francisco Ayala, Darwin’s achievement was to complete the Copernican revolution; biology could now be explained in terms of universal, immanent, natural laws without resorting explicitly to a Creator. The result was to bring biological organisms into the realm of science. Many theologians have seen no contradiction between Darwin and Christian faith, both at the time of Darwin’s writings and in the century since. Natural selection is creative: in a “sieve-like” way it retains rare but useful genes. But natural selection is not creative in the Christian sense of creatio ex nihilo. Instead it is like a painter mixing pigments on a canvas. It is a non-random process that promotes adaptation, that is, combinations useful to the organisms. By proceeding “stepwise,” it produces combinations of genes that otherwise would be highly improbable. It lacks foresight or a preconceived plan, being the consequence of differential reproduction. Thus, though it has the appearance of purposefulness, it does not anticipate the environment of the future. It accounts for the “design” of organisms, since adaptive variations increase relative survival and reproduction. Aquinas and Paley understood that purely random processes will not account for biological nature; but they could not recognize, as Darwin saw, that these processes could be “oriented” by the functional design they convey to organisms. In this sense they are not entirely random. Chance is an integral part of evolution, but its random character is counteracted by natural selection, which preserves what is useful and eliminates the harmful. Without mutation, evolution could not happen. Without natural selection, mutations would bring disorganization and extinction. Thanks to Darwin we can view the process of evolution as creative though not conscious. The biological world is the result of natural processes governed by natural laws, and this vision has forever changed how we perceive ourselves and our place in the universe.

Ayala next develops a complex conception of teleology. An object or behavior is teleological when it gives evidence of design or appears to be directed toward certain ends. Features of organisms, such as the wings of a bird, are teleological when they are adaptations which originate by natural selection and when they function to increase the reproductive success of their carriers. Inanimate objects and processes, such as a salt molecule or a mountain, are not teleological since they are not directed towards specific ends. Teleological explanations, in turn, account for the existence of teleological features. Ayala then distinguishes between those actions or objects which are purposeful and those which are not. The former exhibit artificial or external teleology. Those resulting from actions which are not purposeful exhibit natural or internal teleology. Bounded natural teleology, in turn, describes an end-state reached in spite of environmental fluctuations, whereas unbounded teleology refers to an end-state that is not specifically predetermined, but results from one of several available alternatives. The adaptations of organisms are teleological in this indeterminate sense. Finally, teleological explanations are fully compatible with efficient causal explanations, and in some cases both are required.

With this in mind Ayala argues that Darwin’s theory of evolution and his explanation of design are no more “anti-Christian” than are Newton’s laws of motion. Divine action should not be sought in terms of gaps in the scientific account of nature, although the origin of the universe will always remain outside the bounds of scientific explanation.

The essay concludes by acknowledging the success of science as a way of knowing, a major source of economic growth in the United States, a bringer of essential technologies, and a mode of accumulating knowledge that spans generations. Still, science is not the only way of knowing; we also have the arts, common sense, religion, and so on, all of which far predate science. Science is universal in scope but hopelessly incomplete. Much of what is left out, such as meaning and value, may be considered more important than what science includes.

Barbour, Ian G. “Five Models of God and Evolution.”

In the first part of his paper, Ian Barbour describes the evolution of Darwinism over the past century. Charles Darwin actually shared many of the mechanistic assumptions of Newtonian science. By the early twentieth century, population genetics focused on statistical changes in the gene pool and the “modern synthesis” took a gradualist view of evolution. The discovery of the structure of DNA in 1953 led to the central dogma of molecular biology: information flows from DNA to protein. Recent theories have explored selection at a variety of levels including gene, organism, kin, group, and species, as well as punctuated equilibrium. Other biologists have noted that mutation and selection are not the only sources of novelty. While these new theories can be seen as extensions of Darwinism, a few scientists, such as Stuart Kauffman, claim they are moving beyond Darwinism by invoking principles of self-organization and holism.

Barbour then outlines four philosophical issues which characterize the interpretation of evolution. Self-organization is the expression of built-in potentialities and constraints in complex hierarchically-organized systems. This may help to account for the directionality of evolutionary history without denying the role of law and chance. Indeterminacy is a pervasive characteristic of the biological world. Unpredictability sometimes only reflects human ignorance, but in the interpretation of quantum theory, indeterminacy is a feature of the microscopic world and its effects can be amplified by non-linear biological systems. He also argues for top-down causality in which higher-level events impose boundary conditions on lower levels without violating lower-level laws, and he places top-down causality within the broader framework of holism. He distinguishes between methodological, epistemological, and ontological reduction. Communication of information is another important concept in many fields of science, from the functioning of DNA to metabolic and immune systems and human language. In each case, a message is effective only in a context of interpretation and response.

According to Barbour, each of these has been used as a non-interventionist model of God’s relation to the world in recent writings. If God is the designer of a self-organizing process as Paul Davies suggests, it would imply that God respects the world’s integrity and human freedom. Theodicy is a more tractable problem if suffering and death are inescapable features of an evolutionary process for which God is not directly responsible. But do we end up with the absentee God of deism? The neo-Thomist view of God as primary cause working through secondary causes as defended by Bill Stoeger tries to escape this conclusion, but Barbour thinks it undermines human freedom. Alternatively, God as providential determiner of indeterminacies could actualize one of the potentialities present in a quantum probability distribution. Selection of one of the co-existing potentialities would communicate information without energy input, since the energy of the alternative outcomes is identical. Does God then control all quantum indeterminacies, or only some of them? Barbour comments on the way these options have been discussed by George Ellis, Nancey Murphy, Robert Russell, and Thomas Tracy. God as top-down cause might represent divine action on “the world as a whole,” as Arthur Peacocke maintains together with his “whole-part” models. But these are problematic according to Barbour since the universe does not have a spatial boundary, and the concept of “the-world-as-a-whole” is inconsistent with relativity theory. Grace Jantzen and Sallie McFague view the world as God’s body, but Barbour is concerned that this model breaks down when applied to the cosmos. God as communicator of information would act through the pattern of events in the world, in human thought, and in Christ’s life as God’s self-expression, but this model does not capture God’s intention in creating loving and responsible people.

Process theology offers a fifth model of God’s action in the world by providing a distinctive theme: the interiority of all integrated events viewed as moments of experience. Rudimentary forms of perception, memory, and response are present in lower organisms; sentience, purposiveness, and anticipation are found in vertebrates. But process authors maintain that consciousness occurs only at the highest levels of complex organisms. There is great diversity in the ways in which components are organized in complex systems, and therefore great differences in the types of experience that can occur.

The process model resembles but differs from each of the four models above. God as designer of self-organizing systems is a source of order, but the God of process thought is also a source of novelty. God acts in indeterminacies at the quantum level, but also within integrated entities at higher levels. God acts as top-down cause, not through the cosmic whole but within each integrated system which is part of a hierarchy of interconnected levels. Communication of information can occur through events at any level, not primarily through quantum events at the bottom or the cosmic whole at the top. God is persuasive, with power intermediate between the omnipotent God of classical theism and the absentee God of deism. God is present in the unfolding of every event, but God never exclusively determines the outcome. This is consistent with the theme of God’s self-limitation in contemporary theology and with the feminist advocacy of power as empowerment. Process theology has much in common with the biblical understanding of the Holy Spirit as God’s activity in the world. Barbour concludes by considering some objections to process thought concerning panexperientialism, God’s power, the charge of being a “gaps” approach, and the abstract character of philosophical categories in the context of theology.

Birch, Charles. “Neo-Darwinism, Self-organization, and Divine Action in Evolution.”

According to Charles Birch, most biologists now accept neo-Darwinism as the methodological basis of their understanding of biological evolution, supplemented by the concept of self-organization. 1) Research areas in neo-Darwinian evolutionary theory include differences in the object of selection, the problem of deleterious mutations, subtleties concerning the role of chance, the genetic assimilation of environmental effects, influences on natural selection by modification of the environment, and both neutral and punctuated theories of evolution. 2) Self-organization refers to the production of complex order without a centralizing agency and is usually invoked to explain the evolution of the pre-biotic world. It may also help to explain complex processes in developmental biology, including cell differentiation. Stuart Kauffman applies the mathematics of chaos theory to some aspects of biological evolution without appealing to natural selection.

Birch acknowledges that both neo-Darwinism and self-organization draw on strictly mechanistic models, but insists that this does not imply that biological entities are in all respects machines. On the one hand, a mechanistic analysis seems to provide all that we need for modern biology from a purely physical perspective. On the other hand, it has little, if anything, to say about the mental experience of biological creatures, namely their experience of freedom, choice, and in the human case at least, self-determination. Such analysis has even less to say about the possibility of divine action in the living world. Birch believes this problem stems from the fact that the organism is treated methodologically by neo-Darwinists as an object and not as a subject. Compounding the problem, the mechanistic methodology has led many Darwinists to argue for an underlying mechanistic metaphysics. As a result, the evolution of mind and consciousness, and the functions which they uniquely serve in nature, have remained an enigma for Darwinism.

In its stead Birch suggests a metaphysics for biological organisms which includes their mental as well as physical aspects. The proposal is drawn from the philosophy of Whitehead, in which all individual entities from protons to people are considered to be subjects. Biological evolution is not simply a matter of change in the external relations of objects, but also one of change in the internal relations of subjects. This includes a subject’s relation to its immediate past, analogous to memory, and its relation to its possible future, analogous to anticipation. There is an ever-present urge in life which can be called purpose. Process thought thus posits mentality or experience in some form as an aspect of nature down to the level of fundamental particles. Only at the higher levels of complexity is experience actually conscious. Birch cites David Chalmers, Galen Strawson, and Henry Stapp as non-process scholars who support the validity of experience as universal in nature. The key argument is that mentality did not emerge from the non-mental at some point in the evolutionary sequence.

He then contrasts process philosophy with two alternatives: emergence and reductionism. Emergence involves a category mistake (that the “mental” emerges from the “physical”) as well as a scientific problem (drawing the line between sentience and non-sentience). Reductionism, although it is fruitful and represents most scientific analysis, is inadequate because it cannot account for the fact that the whole has properties which the parts do not; moreover, the parts become qualitatively different by being parts of the whole.

In its place, Birch claims that the best answer to the whole-part problem and the strongest argument for rejecting reductionism is the doctrine of internal relations. This process approach is compatible with a lower-to-upper causality, and it has important implications for scientific research, too, offering support for the idea of top-down causation. Theologically, the potentiality of the universe is held in the mind of God. Divine potentiality becomes concrete reality in the universe by means of persuasive love. God interacts with individual entities in three ways. First, the future is open, and God persuasively confronts entities with creative and saving possibilities for their future. Next, the entities of the world are created by God and respond to God’s feelings for the world. Finally, God responds to the world with infinite passion, taking actual entities into the divine life.

Cela-Conde, Camilo J. and Gisele Marty. “Beyond Biological Evolution: Mind, Morals, and Culture.”

Camilo Cela-Conde and Gisele Marty focus on models of human evolution that account for the development of traits in individuals, including morphological traits such as large brains and functional traits such as speech. They also consider the development of collective traits in human populations, such as language, culture, and moral codes. These models raise important theological questions regarding divine action.

Most theological attempts to address human evolution treat the development of culture separately from, or in contrast to, the evolution of morphology, but these attempts run into serious problems since cultural evolution presupposes and builds on biological evolution. Interactionist models are thus needed. Such models must also address the question of when and how such traits as a complex brain and human language emerged during the past 2.5 million years. Some scientists posit an almost instantaneous and isolated emergence of language, but since language does not fossilize, the conjecture is hard to test empirically. Others argue for a long, gradual, and early development of language going back to Homo habilis, and relate it to the slow development of the brain, an idea which is easier to test. Another question is how to differentiate the evolution of our species from other hominids and even from other primates. The theological task, in turn, is how to relate such an understanding of human evolution, especially of language and cognition, to divine action, taking into account the elements of continuity as well as discontinuity between humans and other hominids as well as between hominids in general and other primates.

Darwin was the first to speak of both a biological mechanism for moral behavior and a distinctive “moral sense” which he attributed uniquely to humans. He explained the evident diversity of moral codes in terms of adaptation to varying environments. But how does human moral sense (or “moral altruism”) differ from the kind of biological altruism shared by so many species, and what are its genetic roots? This question leads the authors into a discussion of sociobiology as it has developed over the past two decades. In their view, it has focused on four key issues: the phenomena implied by human morality; the analogs to it at the animal level; the phylogenetic explanation of the emergence of these analogs; and the development of human morality within this framework. Though the debate has waned somewhat, Cela-Conde and Marty hope to show how it might now be reinvigorated.

Part of the challenge, in spite of the reductionism inherent in the debate, is the actual complexity of the phenomenon of human morality. A variety of approaches are being pursued. Some scientists point to the distinction between the capacity for, and the content of, moral thought. Others focus on group selection and kin selection models of altruism. Some argue for a strict separation between biological and moral altruism, while others stress their intimate connection. The authors note that, even if a strong connection is granted, reductionism can at least be partially avoided by appealing to the supervenience of moral language. Some sociobiologists have developed theories of reciprocal altruism, ultrasociality, and sociocultural fitness. Still the authors know of no model which includes all the elements required, from innate tendencies to empirical moral norms. Moreover, the complex cognitive processes implied in evaluating and making decisions suggest that the usual distinction between motive and criterion is inadequate.

Instead, in their model, Cela-Conde and Marty consider the motive to act, the personal ethical criterion, and the set of collective values and norms. Individuals accumulate and actualize these values during the apprenticeship process, giving to the collective complex an evolutionary, changing character. They propose a phylogenetic argument that places biological and moral altruism as two successive stages in human evolution. Biological altruism is closely associated with the genetic code and belongs to the area of motivation; moral altruism is related to the personal ethical domain or the values of the group. Neither taken alone is able to explain the whole of human moral conduct. The combined development of cognitive capacity and moral behavior is sometimes called “co-evolution.” They also draw on the cognitive sciences. Here, internal rewards to the individual may be available through religious rituals, acting before public crowds, integration into small communities, and so on. They conclude by exploring the idea of universal norms directing moral behavior as typified in the first stages of sociobiology, and the idea of universal tendencies to accept moral codes as found in later, more sophisticated sociological arguments. These results and their problems point, in turn, to the need for a more complete theory linking the biological substratum to moral conduct, the influence of social groups, and the role of emotions in maintaining moral behavior.

Cela-Conde, Camilo J. “The Hominid Evolutionary Journey: A Summary.”

It is Camilo Cela-Conde’s central claim that “no straight line can be drawn from our ancestors to the modern human species.” Instead the evidence reveals a much more complex picture of human evolution. A basic question is that of taxonomy: how are we to define a hominid? One way is by discovering an exclusive trait that might serve to distinguish hominids from other primates. Cela-Conde discusses but rejects such candidates as bipedalism, a large brain, an articulated language, a large coefficient of encephalization, the ability to create tools and thus culture, etc. He then takes a different approach, describing in some detail the variety of species that are considered as belonging to the hominid family. He begins with the appearance of early hominids some 4.4 million years ago and points out the many subtleties involved in attempting to classify them. He describes the complex issues surrounding the evolution of Homo erectus, Homo neanderthalensis, and finally Homo sapiens, citing arguments against a direct link between Neanderthals and morphologically modern humans. He concludes his essay with a careful discussion of morphological and genetic studies of the origin of human beings, including two opposite models: multiregional transition and mitochondrial Eve. Although he disagrees with the widespread idea that all humankind shares one ancestral grandmother, he does support the theory of the “out-of-Africa” spread of modern humans.

Chela-Flores, Julian. “The Phenomenon of the Eukaryotic Cell.”

The focus of Julian Chela-Flores’ paper is the possibility of the evolution of life elsewhere in our solar system. He first reviews Big Bang cosmology, including its modifications by Guth and Linde. Next he turns to the origin of life on Earth from the 1920s to the present. Although scientists view organic matter as inexorably self-organized according to the laws of physics and chemistry, the complete pathway from the inanimate to life on Earth has not been reproduced experimentally, nor has the importation of organic molecules from space been ruled out. Meanwhile research is now underway in exobiology and bioastronomy via the ongoing space missions. Issues include cross-contamination of either Earth, Mars, or Europa, comparative planetology, the search for extraterrestrial homochirality (SETH), the search for extraterrestrial eukaryotes (SETE), and, since the 1960s, the search for extraterrestrial intelligence (SETI). He concludes this section with speculations on the future of evolution on Earth.

Next, Chela-Flores describes recent topics including chemical evolution in the universe, the pathways from precursors to biomolecules, modern taxonomy, the terminology for single-celled organisms, and the evolution of prokaryotic cells in the Precambrian period. He then discusses the evolution of eukaryotes, including the role of oxygen and iron in their first appearance, and the identification of eukaryotes that are morphologically similar to prokaryotes. Next Chela-Flores takes us from eukaryogenesis to the appearance of intelligent life on Earth. Here he presses his case for the inevitable increase of complexity in the transition from bacteria to eukarya. Physics and chemistry imply an “imperative” appearance of life during cosmic evolution which he formulates as a bold, but in principle testable, hypothesis: “once the living process has started, then the cellular plans, or blueprints, are also of universal validity.” In short, prokaryotes lead to eukaryotes, and they do so universally. Provided that planets have the appropriate volatiles (particularly water and oxygen), Chela-Flores argues that not only life, but eukaryogenesis, is bound to occur. Within the next two decades, a new generation of space missions could test his hypothesis. Moreover, the hypothesis bears on the question whether these missions should search for Earth-like life or something entirely different. Chela-Flores gives various responses to this question, including the relevance of SETE to SETI and the significance of the discoveries of the Murchison and Allan Hills meteorites, the latter of which originated on Mars.

In closing, Chela-Flores maintains that there is a second environment in our solar system, the Jovian satellite Europa, in which the eukaryogenesis hypothesis may be tested. He first describes other possible sites for extremophiles and other microorganisms, including the atmospheres of Europa, Io, Titan, and Triton, and possible hot springs at the bottom of Europa’s (putative) ocean. Then he identifies parameters that may characterize the degree of evolution of Europan biota both at the ice surface and its ocean. He concludes again that a space mission could test these ideas in the near future.

Clifford, Anne M. “Darwin’s Revolution in the Origin of Species: A Hermeneutical Study of the Movement from Natural Theology to Natural Selection.”

Anne Clifford examines Darwin’s The Origin of Species in relation to nineteenth-century British natural theology. Though the latter was considered a form of science it actually offered a union between science and Christian belief in a creator. Its primary text was nature, not Genesis, and it attempted to provide evidence from nature for God’s sovereignty and purposeful design. Clifford warns us not to let the hegemony that Darwin’s theory now enjoys undercut our interest in natural theology, partly because we would not fully appreciate what Darwin’s revolution accomplished. She sets out to trace that accomplishment, being mindful of the way language in both science and theology, with its metaphorical character, shapes our claims about reality.

Her first move is to challenge the “warfare” model of the relationship between Christianity and Darwin’s theory fostered by Andrew Dickson White and John Draper. Wilberforce’s attack on evolution was actually based primarily on scientific grounds, not on concerns about biblical revelation. He accepted natural selection as a process that weeded out the unfit within a species, but he felt Darwin had not provided sufficient evidence for the evolution of new species. Wilberforce’s argument drew implicitly on Francis Bacon’s earlier distinction between the book of revelation and the book of nature. Though God was the author of both books, the distinction provided scientists freedom from forcing their results to conform to biblical texts. Darwin too drew on the two-books tradition and on a close reading of William Paley, who argued from nature to an intelligent designer. Paley went further than Bacon, though, by discussing nature’s purpose and by moving from purpose to a personal designer and thus to a personal God. He rejected randomness in nature as well as the extinction of species. The Bridgewater Treatises continued this argument, insisting on the fixity of nature and on divine sovereignty which maintains nature and natural laws. These are the actual positions that Darwin’s theory of natural selection would reject.

Data gathered from his voyage on the Beagle triggered Darwin’s “conversion” from natural theology to his theory of natural selection as an account of the variety and mutability of species. Recent discoveries in geology enhanced his account, including the ancient age of the earth and the possibility of sequencing the fossil record. Also contributing was Darwin’s knowledge of animal breeding as well as Malthus’ work on population and resources, with its focus on the struggle for existence. Darwin’s theory of natural selection and its theme of the survival of the fittest broke with natural theology not only over the concept of God as special designer of each separate species, including their direct creation and their immutability, but also over the benevolence of God. Natural theologians, it seems, had been particularly blind to the abundance of suffering and death in nature.

Clifford then analyzes the role of metaphor in science, drawing on the writings of Janet Soskice, Paul Ricoeur, and Sallie McFague. She focuses on two of Darwin’s key metaphors: “the origin of species” and “natural selection.” Darwin’s theory in effect shifted the meaning of “origins” by describing the emergence of new species while bracketing the question of the origin of life as such. He also transformed the meaning of species; rather than fixed and discrete, they came to be seen as fluid, possessing the capacity to evolve. Darwin’s metaphor, “natural selection,” combines meanings drawn from animal breeding by humans and from nature in the wild. It suggests that nature “chooses” and, though Darwin rejected vitalism, he has been read as deifying nature. Clifford also points out that Darwin considered his theory compatible with belief in God, though his personal position seems to shift from belief to agnosticism.

According to Clifford, then, Darwin did not intend a warfare against Christianity, only against natural theology, and here only in the form of a highly rationalistic Christian theism coupled to a limited body of scientific data. He challenged Paley’s watchmaker analogy that assumed a God of radical sovereignty and a passive and static world. What might we find to replace it? McFague proposes the metaphor of the universe as God’s body. Clifford modifies this by suggesting the metaphor of a mother giving birth. It brings together in dynamic tension the reproductive and evolutionary character of nature with the biblical doctrine of God as creator. It is panentheistic, rather than pantheistic, and is, according to Elizabeth Johnson, the “paradigm without equal,” drawing on a wealth of biblical texts for God’s relation to the world. Finally, it is compatible with Darwin’s rejection of God as designer and of the immutability of species, and it takes up his concern to acknowledge the extent of suffering in nature.

Coyne, George V., S.J. “Evolution and the Human Person: The Pope in Dialogue.”

George Coyne presents an interpretive essay on John Paul II’s preceding statement on evolution and the human person. Coyne sets the context by starting with the historical background of the Pope’s statement, which he describes in terms of three approaches to science and religion. During the seventeenth and eighteenth centuries, the Church attempted to appropriate modern science to establish a rational foundation for religious belief. Paradoxically, this led to the corruption of faith and contributed to the rise of modern atheism. The founding of the Vatican Observatory in 1891 signals the second approach. Here the Church attempted to combat anticlericalism by a vigorous, even triumphalistic, agenda. Finally, the twentieth century has seen the Church come to view science as offering rational support for theological doctrine. Coyne cites Pope Pius XII who, in 1952, took Big Bang cosmology as “bearing witness” to the contingency of the universe and to its creation by God.

Still, in three prior statements and in the current one, John Paul II has taken a new approach. Though the Galilean controversy was important to the first two approaches, what Coyne takes to be the key element is John Paul II’s call for a genuine and open-ended dialogue in which science and religion, though distinct and marked by their own integrity, can contribute positively to each other. Dialogue sets the context for John Paul II’s discussion of evolution.

The discussion is, in fact, mostly scientific, drawing first from research in the life sciences, next from molecular chemistry to life in the evolving universe, and finally to the possibility of early primitive life on Mars and the discovery of extra-solar planets. John Paul II stresses that, though evolution is an established scientific theory, philosophy and theology enter into its formulation, leading to several distinct and competing evolutionary world-views. Some of these - materialism, reductionism, and spiritualism - are “rejected outright.” Instead a genuine dialogue begins as the papal message struggles with two views which may or may not be compatible: evolution according to science and the intervention by God to create the human soul.

Thus dialogue risks dissonance between science and religion. Revelation is given an antecedent and primary role compared with scientific discovery. Yet the religious message struggles to remain open, perhaps through a reinterpretation of what science tells us. One possibility would be the body-soul dualism taken by Pius XII. Instead John Paul shifts from an ontological to an epistemological interpretation of the appearance of what he then calls the “spiritual” in humanity. The message closes by indicating that the dialogue should continue. Here Coyne adds that in doing so we think in terms of God’s continuous creation through the process of evolution. Rather than intervening, God gives the world freedom to evolve and participates in the process through love. Perhaps this approach can preserve what is special about the emergence of spirit without resort to interventionism.

Davies, Paul. “Teleology Without Teleology: Purpose through Emergent Complexity.”

Paul Davies offers us a modified version of the uniformitarian view of divine action. In selecting the laws of nature, God chooses specific laws which allow not only for chance events but also for the genuine emergence of complexity. He claims that the full gamut of natural complexity cannot be accounted for by neo-Darwinism, relativity, and quantum mechanics; one must also consider nature’s inherent powers of self-organization based on, though not reducible to, these laws. Still the emergence of complexity does not require special interventionist divine action.

Davies begins by classifying divine action into three types: interventionist, non-interventionist, and uniform. He rejects the first, since it reduces God to nature and involves theological contradictions. The second is a new possibility which appeals to quantum indeterminism and bottom-up causality or to the mind-body problem and top-down causality. Davies considers several possible objections and responds to them before turning to his own view, a modified form of uniform divine action. This emphasizes God’s continuing role in creating the universe each moment though without bringing about particular events which nature “on its own” would not have produced. Davies illustrates this via the game of chess, in which the end of a given game is determined both by the rules and by the specific sequence of moves chosen by each player. Thus God selects the laws of nature; being inherently statistical, they allow for chance events at the quantum or chaos levels as well as for human agency. God need not violate these laws in order to act, and there is room for human freedom and even for inanimate systems to explore novel pathways.

The existence of these very specific laws raises the question of cosmological design. Davies acknowledges that “anthropic” arguments like his might be countered by a cosmic Darwinism, such as the “many worlds” view provided by inflationary cosmology, but he gives several reasons why he rejects these accounts. He then argues that quasi-universal organizing principles will be found to describe self-organizing, complex systems. They will complement the laws of physics, but they would not be reducible to or derivable from physics, nor would they refer to a mystical or vitalistic addition to them.

Davies sees his view of divine action as going beyond ordinary uniformitarianism. Chance in nature is God’s bestowal of openness, freedom, and the natural capacity for creativity. The emergence of what he calls the “order of complexity” is a genuine surprise, arising out of the “order of simplicity” described by the laws of physics. He calls this “teleology without teleology.” The acid test, according to Davies, is whether we are alone in the universe. If the general trend of matter toward mind and culture is written into the laws of nature, though its form depends on the details of evolution, we would expect that life abounds in the universe. This accounts for the importance of the SETI project. Finally, Davies is open to the possibility of combining his view with a non-interventionist account of divine action.

In his final section, Davies addresses biologists who, he expects, will find the concept of “teleology without teleology” favorable for two reasons. First, biologists have already incorporated elements of self-organization and emergent complexity into the neo-Darwinian account. Second, some biologists see evidence of complexifying trends in biological phenomena. Finally, the theological interpretation he advances would in no way be obligatory on others.

Drees, Willem B. “Evolutionary Naturalism and Religion.”

According to Willem Drees, at least three issues arise from an evolutionary view of nature. One is the challenge to a literalist understanding of Genesis. Another is that evolution may leave no room for divine action in the world. Finally, evolution can radically modify our understanding of human nature and morality. The latter is the focus of Drees’ paper. Rather than seeking an alternative to, or a modification of, evolutionary ideas, Drees intends to stay as close as possible to insights offered and concepts developed in the sciences. He calls his position “naturalism” and asks what the consequences are if a naturalist view is correct. His central theses are: 1) upon a sufficiently subtle view of science, evolution can do justice to the richness of experience and of morality, but 2) not to the cognitive concerns of religion; nevertheless 3) there is still room in a naturalist view for religion as a way of life and as a response to limit questions concerning the scientific framework.

Drees first distinguishes between soft or nonreductive naturalism, whose context is ordinary human experience and language, and hard or reductive naturalism. The latter includes epistemological naturalism (a universal application of the scientific method without an ontological commitment) and ontological naturalism (Drees’ position). Within ontological naturalism there are three varieties: reductive materialists, who hold for type-type identity, nonreductive materialists, who opt for token-token identity (Drees’ position), and eliminative materialists, who would reduce away the higher level of discourse.

In Drees’ view, the natural world is all that we know about and interact with; no supernatural realm shows up within the world. All entities are made of the same constituents. Still naturalism (or physicalism) can be non-reductive in the sense that higher level properties may require their own concepts and explanatory schemes. Evolutionary explanations are primarily functional. He argues that such a naturalism need not be atheistic. Instead, physics and cosmology form the boundary of the natural sciences and raise speculative, limit questions about the naturalist view as such, questions about which naturalism can remain agnostic. The integrity, coherence, and completeness of reality as described by science do not imply its self-sufficiency. Contrary to Peter Atkins, Drees sees religious accounts in which the natural world as a whole is dependent on a transcendent Creator as consistent with, though not required by, naturalism. What Drees rejects is a view of God as altering the laws of nature or as acting within the contingencies of nature since, again, nature is complete and the integrity of nature is affirmed.

Next Drees turns to evolutionary explanations of morality. If morality, such as pro-social behavior, is given an evolutionary explanation, can it still be considered “moral”? Drees first argues that evolutionary naturalism as a whole should not be dismissed because of the claims made by those whom Daniel Dennett calls “greedy reductionists.” Instead Richard Alexander, David Sloan Wilson, Elliott Sober, Michael Ruse and Francisco Ayala give serious consideration to the importance of cultural and mental aspects in the evolutionary explanation of morality. He describes four reasons why such accounts need not undercut the validity of seeing morality as genuinely “moral.” For example, sociobiology undermines the claim that values originate in a supernatural source, but people are still free to choose from among competing values. Morality can go beyond our emotions, as E. O. Wilson argues, and the contingencies of our evolutionary history, as Michael Ruse proposes, to reflect a genuine distinction between is and ought.

But what happens to religion when it becomes the object of scientific study and explanation? Whereas morality and experience seem to survive an evolutionary understanding, the implications for religion are more serious. In effect, a functional and immanent understanding of morality need not be as problematic to moral persons as a similar understanding of religious language may be to believers, since religious language typically refers to transcendent realities. Thus, some of the fears of believers seem warranted. However, Drees holds that the grounds for accepting a naturalistic evolutionary view of reality, including ourselves, are strong. Hence, rather than backing away when a conflict threatens, he prefers to reflect on the options for religion within an evolutionary framework.

According to Drees, naturalism rules out objective reference to divine action in the world and it offers an evolutionary account of how such ideas arose. Thus naturalism renders their cognitive content “extremely unlikely” without claiming absolute proof. Religious traditions can be studied as complex entities and ways of life, each within its own environment. They embody regulative ideals and forms of worship, and they undergird moral and spiritual commitments. Though their cognitive claims may need revision, religions confront and challenge us with these ideals and values, offering a vision for a better world. Moreover, they encourage us to raise limit questions which naturalism alone cannot answer and they in turn offer answers to such questions. The openness expressed in limit questions can induce wonder and gratitude about the world, and this mystical function of religion can be complementary to its more prophetic, functional characteristics. Finally, evolution has bequeathed us the capacity for imagination and thus for transcending any one particular perspective or regulative ideal. This in turn leads us to the notion of divine transcendence.

Edwards, Denis. “Original Sin and Saving Grace in Evolutionary Context.”

Denis Edwards is concerned with rethinking the doctrine of sin and grace in light of biological evolution. He begins with the insights of Gerd Theissen, Sallie McFague and Philip Hefner. According to Edwards, Theissen argues that the common features of science and theology can be articulated through evolutionary categories. Religion manifests the “central reality,” God. Christianity offers the principle of radical solidarity which runs counter to natural selection. The pull in us towards anti-social behavior has a biological foundation, while the work of the Holy Spirit is in the direction of pro-social behavior, helping us see strangers as kin. Theissen supports these points by referring to the three great “mutations” of Christian faith: biblical monotheism, New Testament Christology, and the experience of the Holy Spirit. Edwards then criticizes Theissen’s work in terms of both biology and theology: for example, are natural selection and culture, and natural selection and the way of Christ, each so sharply opposed?

Sallie McFague finds the pattern for divine immanence in creation in the story of Jesus: the universe is directed toward inclusive love for all, particularly the oppressed. As Edwards sees it, her “Christic paradigm” extends God’s liberating, healing, and inclusive love to non-human creatures. McFague understands nature as the new poor. She finds consonance between natural selection and Christianity, since evolution is not only biological but cultural, and since it is essential that human culture contributes to the welfare of all life on earth. But she finds dissonance between natural selection and Christianity, because neither cultural nor biological evolution includes solidarity with the oppressed. Instead, God suffers with suffering creation, since the world is God’s body. But, Edwards asks, is McFague too negative, even moralistic, about natural selection? Is the Christic paradigm opposed to natural selection or does it define God’s creative action in and through evolution?

According to Edwards, Philip Hefner sees the human being as a symbiosis of genes and culture. Religion is the central dimension of culture. In view of the ecological crisis we have brought about we need a theology of the human as created co-creator. Hefner views original sin in terms of the discrepancy in the information coming from our genes and our culture, including a clash in us between altruism and genetic selfishness. He also suggests that original sin can be understood in terms of the fallibility and limitation that are essential to human evolution and freedom. Though these are good, they are always accompanied by failure. The religious traditions carry altruistic values, particularly trans-kin altruism, and the biblical commandments ground altruism ultimately in God.

Edwards believes that Hefner’s evolutionary insights genuinely illuminate the human condition and the Christian understanding of concupiscence. He argues, however, that discrepancy and fallibility are not in themselves sin. Instead, following Rahner, he distinguishes between the disorder of sin and the disorder that is intrinsic to being human. The former comes from our rejecting God. The latter results from our being both spiritually and bodily finite; it is a form of concupiscence that is morally neutral and not in itself sinful. Our existential state is constituted by both disorders. Edwards finds Hefner’s insight as bearing on our natural, but not our sinful, disorder: the structure of the human, though a fallible symbiosis of genes and culture, is not in itself sin. In addition, Edwards suggests that our genetic inheritance can carry messages essential for human life while culture and religion can carry messages of evil. This means that selfishness and sin cannot be identified strictly with our biological side, and unselfish behavior cannot be identified entirely with our cultural side.

With regard to grace, Edwards writes that, while altruism is a radical dimension of divine and human love, it does not express the ultimate vision of that love. Indeed, indiscriminate calls to altruism and self-sacrifice can function to maintain oppression, as feminist theologians have stressed. Moreover, in a trinitarian doctrine of God, love is revealed most radically in mutual, equal, and ecstatic friendship. So, though Hefner sees altruistic love as holding the status of a cosmological and ontological principle, Edwards sees persons-in-mutual-relations as having this status. Drawing on the writings of John Zizioulas, Walter Kasper, and Catherine Mowry LaCugna, Edwards suggests that if the essence of God is relational and if everything that is springs from persons-in-relation, then this points towards an ontology which he calls “being-in-relation.” Moreover, such an ontology is partially congruent with evolutionary biology, including its stress on cooperative, coadaptive, symbiotic, and ecological relations. Contrary to Theissen and McFague who tend to oppose natural selection and the Gospel, Edwards wants the “Christic paradigm” to view God as continuously creating through the processes of evolution.

Still the struggle and pain of evolution leads Edwards to face the challenge of theodicy. Following Thomas Tracy, he first suggests that natural selection needs to be considered in non-anthropomorphic and non-moral terms as an objective process in nature, like nucleosynthesis in stars. Theodicy is no more intense a problem for natural selection than it is for all such processes, including death when understood as essential to evolution and life. The trinitarian God who creates through natural selection needs to be understood not only as relational but also as freely accepting the limitations found in loving relationships with creatures. The Incarnation and the Cross point to a conception of God related to natural selection through unthinkable vulnerability and self-limitation. The God of natural selection is thus the liberating, healing, and inclusive God of Jesus. This God is engaged with and suffers with creation; at the same time, creatures participate in God’s being and trinitarian relationships.

Ellis, George F. R. “The Thinking Underlying the New ‘Scientific’ World-views.”

George Ellis analyzes arguments by a number of contemporary scientists that either support atheism or offer a science-based religion. He claims that they are based on scientifically unjustified assumptions and rely on rhetorical or emotional appeal. They ignore the limited scope and method of science, contain an implicit metaphysical agenda, rely on the authority of science while addressing issues outside its scope, and occasionally misrepresent or ignore opposing views.

First, Ellis reminds us that scientific theories are provisional, open, limited in scope, partially supported by evidence, and inherently incomplete. Cosmology in particular takes for granted the laws of physics, but it cannot explain why they exist, why the universe exists, or whether there is an underlying purpose and meaning to the universe. Nor can science provide a foundation for values, although values are essential to the conduct of science. Issues such as the existence of God lie forever outside the competence of science to adjudicate, although the weight of data and experience can influence one’s opinion. Ellis acknowledges that the argument from special design has been undermined by evolutionary theory. His concern, however, is with those who construct a “scientific religion” out of either a physically based metaphysics or a scientifically motivated system of values, and who deny the fact that the metaphysical interpretation of science is ambiguous and that the epistemic and ethical scope of science is limited.

To make his case, Ellis turns to detailed critiques of specific authors. He admires the elegance of Carl Sagan when he writes about science, but is concerned that Sagan goes on to deploy a “naturalistic religion” whose metaphysical basis reaches beyond what science can warrant. Such fundamental issues as the existence of the universe and the laws of physics are taken for granted, and Sagan’s conclusion about the relative unimportance of human life in the universe is argued using emotion, not logic. Similarly Richard Dawkins ignores these fundamental issues and takes for granted the conditions that make evolution possible. E. O. Wilson attempts to give an exhaustive account of moral behavior as mere genetic programming but ignores the fact that morality presupposes voluntary, intentional action. Daniel Dennett and Jacques Monod ignore the metaphysical issues regarding the existence of the universe, and Monod proposes that science can be a source of universal ethics without recognizing the inadequacies of such an ethics. Finally Peter Atkins dismisses anything outside the scope of science, making reductionism into a dogma and ignoring the metaphysical ambiguity of science. The pay-off for these writers is the hope to achieve absolute certainty and to receive the privileged status of scientific “high priest.” Ironically, their exaggerations serve to foster anti-scientific views in the general public. These authors also fail to alert their readers to the speculative status of the scientific theories being considered, such as chaotic inflationary cosmology, cultural “memes,” and the arrow of time problem. Ellis is also critical of the specific strategies employed to undercut the opponent’s views, such as explaining away religion in functional and evolutionary terms, or appealing to emotion or rhetoric.

In contrast to this, some scientists who deploy scientifically based world views are less dogmatic and more tentative in their approach. Their conclusion is self-critical atheism rather than dogmatic atheism - a position which Ellis finds much more reasonable. He also admits that taking religious experience seriously is problematic, since there is too much data, too much conflict among the data, and much of it involves manifest evil wrought in the name of religion - and here he agrees with the writers he has been criticizing. This leads Ellis to reflect on how the selection of data and the construction and testing of theories in science occur. Perhaps a lesson from the proponents of scientific religion is that tests in religion need to be more seriously developed and widely acknowledged. Can morality and ethics be judged by their fruits: are they life-giving or death-dealing? Ellis believes we might use this process to evaluate the broad spectrum of religions as well as particular sects.

Ellis closes by describing three theories which take the data seriously. 1) The kenotic moral-theistic position attributes an ethical underpinning to the universe derived from and expressing the self-emptying nature of God. It is reflected in such exemplars as Mahatma Gandhi and Martin Luther King, Jr. With this view one can cut across the lines of religious traditions to retrieve selectively those data and practices that are kenotic. One can also account for the apparent counter-evidence to God’s existence, namely evil, as well as the metaphysical ambiguity of nature, since kenosis requires free will, and free will requires genuine metaphysical ambiguity and makes evil deeds a possibility. Moreover, evolution is also entailed; special creation would make the argument from design overwhelming and faith empty. Of course, these insights are not meant as a theistic proof; the proposal provides a viable viewpoint which might or might not be true, and this is, once again, coherent with Ellis’ fundamental claim about metaphysical ambiguity.

The second theory is self-critical atheism: conflicting religious data suggest that none can be correct. This view differs from the dogmatic atheism of scientific religions, being open to evidence and aware of both metaphysical ambiguity and the limits of the scientific method. Finally, the evidence may lead to agnosticism. Ellis cautions that neither atheism nor agnosticism provides a solid basis for ethics. Ultimately the choice between these theories is personal, but truth is not irrelevant, logic plays a role, and only indubitable certainty is unattainable. This also means that by ignoring metaphysical and epistemological complications, the arguments for scientific religion bear the marks of pseudo-science rather than true science. Ellis hopes his paper clears the way for scientific world views that are less dogmatic and more open to genuine interaction with alternative views.

Haught, John F. “Darwin’s Gift to Theology.”

According to John Haught, evolutionary theory can seriously undercut the credibility of divine action. Daniel Dennett views evolution as a purely algorithmic process that leaves no room for God’s action. Richard Dawkins argues that impersonal physical necessity drives genes to maximize opportunities for survival. Both conclude that Darwin has given atheism a solid foundation. Hence contemporary theology must include an apologetic dimension. At minimum, it should demonstrate that the scientific concepts involved in evolutionary theory - contingency, necessity, and the enormity of time - do not rule out the action of God. But theology should go beyond this to show that these concepts are open to metaphysical and theological grounding and that a careful understanding of God renders an evolving natural world more intelligible. Rather than a danger, Darwin offers theology a gift: the context for a doctrine of God as compassionate, suffering, and active in and fully related to the world.

This paper argues that such a theology is implied in the kenotic image of God’s self-emptying, Christ-like love. Haught draws on Dietrich Bonhoeffer, Edward Schillebeeckx, and Jürgen Moltmann in stressing divine kenosis. An evolutionary theology extends this view backward to embrace the history of life on earth and forward to its eschatological completion. With Karl Rahner we are invited to accept the humility of God and to resist what Sallie McFague calls a “power of domination.” Contrary to Dennett and Dawkins, the randomness of variations, the impersonality of natural selection, and the waste and suffering of evolution can be understood through the concept of a vulnerable God who renounces despotic force, who grounds evolution in divine love, and who participates in evolution to redeem nature.

But there is an additional problem here. Science is methodologically neutral, constrained to explain evolution without reference to the supernatural. Still, the assertions that science leaves no room for theology and that what physics depicts is the only reality lead beyond science to scientism and materialism, ideologies clearly in opposition to theology. Haught cites Stephen Jay Gould, Dennett, and Dawkins as conflating the science of evolution and the ideology of materialistic metaphysics. In response, theologians must employ a metaphysics of sufficient categorical breadth and philosophical depth to account for both the Christian experiences of God and evolutionary science, and one that counters materialism. It is Haught’s conviction that some aspects are provided by Whiteheadian philosophy, with its emphasis on novelty and temporality as irreducible features of the world.

Faith’s conviction that God’s relationship to the world is one of complete self-giving can be elaborated, at least partially, through process theology’s notion of the divine persuasive power which invites, though never forces, creation to engage in the process of becoming. Such emergent self-coherence in the evolving world is entirely consonant with the world’s radical dependence on and intimacy with God. According to Haught, union with God actually differentiates the world from God rather than dissolving it into God. In the light of such a theology, rooted in the divine kenosis, we should not be surprised at nature’s undirected evolutionary experimentation with multiple ways of adapting, or at the spontaneous creativity in natural process, or at the enormous spans of time involved in evolution. If God’s incarnate love is expressed in persuasive and relational power, a world rendered complete and perfect in every detail by God’s direct act would be metaphysically and theologically impossible. Such a world would not be truly distinct from God. It would be neither a truly graced universe, as is ours, nor meaningfully open to God’s self-communication.

Kenotic process theology emphasizes God as the sole ground of the world’s being. The sufferings and achievements of evolution take place within God’s own experience and are graced by God’s compassion. Such a theological stance, according to Haught, is not only consistent with, but ultimately explanatory of, the world seen in terms of evolutionary science - and in ways that go beyond the capabilities of materialism. Still, in light of theology’s concerns for both creation and eschatology, Haught emphasizes a “metaphysics of the future” in which the fullness of being is found not in the past or present, but in what is yet to come. The ongoing creation of the universe and the evolutionary process are made possible by God’s entering into the world from the realm of the future.

Haught realizes that science (or, more properly, scientism) is rooted in a “metaphysics of the past,” but this is a view which he believes evolutionary theology will need to include even while surpassing it. To the objection that the future cannot ‘cause’ the present, he points to the metaphorical character of both theological and scientific language, and he invokes Paul Tillich’s suggestion that we refer to God as “Ground,” rather than cause, of being. Ultimately, biblical faith rules out unique mechanical causation from past events and is commensurate with process theology’s insistence on the power of the future and on God as the ultimate source of all possibilities.

Haught returns then to his initial question: Does evolutionary theory leave room for theology? An affirmative response requires that there be an explanatory role for the idea of God in light of evolution which does not interfere with that of science. Haught provides this by pointing to three assumptions in the scientific explanation of life: the contingency of events, the laws of nature, and the irreversible, temporal character of the world. He believes theology’s task is to provide an ultimate explanation and grounding for these assumptions. Theology does so by claiming that contingent events, such as genetic mutations, signal the inbreaking of the new creation, that necessity is an expression of God’s faithfulness, and that the arrival of the divine Novum endows the world with its temporality.

Hefner, Philip. “Biocultural Evolution: A Clue to the Meaning of Nature.”

Philip Hefner begins with the “two-natured” character of the human: the confluence of genetic and cultural information. These co-exist in the central nervous system (CNS) and have co-evolved and co-adapted. The genetic has made the cultural dimension possible; their symbiotic character differentiates humanity from other forms of life. Ralph Burhoe describes them as co-adapted organisms. Though we are conditioned by our evolutionary development and our ecological situation, we are free to consider appropriate behaviors within an environmental and societal matrix of demands, since our freedom serves the interest of the deterministic evolutionary system and is rooted in our “genetically controlled adaptive plasticity.” The emergence of conditionedness and freedom are an evolutionary preparation for values and morality; the ought is built into evolution and need not be imported from external sources. Hefner reports that evolutionary psychology and human behavioral ecology have moved beyond their roots in sociobiology. He notes two key issues: how adaptive behaviors are shaped by critical moments in evolutionary history and then transmitted by bundles of adaptations, and how cooperative behaviors, including morality, evolve in the context of genetic, neurobiological, and cultural interactions.

Humans have evolved to seek and to shape meaning, enabled by the CNS, and our survival depends on it. A crucial step is the construction of frameworks and interpretations which are “pre-moral,” as Solomon Katz puts it. Moreover, our species, though bounded by evolution, acts within this context, thus inevitably altering the world. Hefner speaks theologically of the human as “created co-creator.” We encounter transcendence in several ways: as evolution and the ecosystem transcend themselves when they question their purpose through us; as we act in the non-human world and in culture; and as we open ourselves to our future. Thus the “project” of the human species is also nature’s project, and the challenge for us is to discover its content.

Human culture includes diverse strategies for living; its greatest challenge is science and technology. These provide the underlying conditions for our interrelated planetary community but also its pressure on the global ecosystem. They are essential for human life and they thoroughly condition the future of the planet. Culture functions to guide behavior comparably to the role of physico-bio-genetic systems in plants and animals. Its interpretation challenges us intellectually and spiritually. Hefner opts for a non-dualistic interpretation: technology, like all culture, is an emergent form of nature grounded in human neurobiology. Still, as a thoroughly technological civilization we now face a crisis not merely of “tools out of control,” but of an all-permeating form of existence that threatens to turn against itself and nature. He calls for “a re-organization of consciousness” adequate to this crisis.

Christian theology, through the doctrine of creation, can provide such perspective. The natural world is invested with meaning by its relation to God as creator ex nihilo; nature is entirely God’s project, what God intended. The doctrine of continuing creation emphasizes the way in which, at every moment of time, God creates in freedom and love, giving the world its evolutionary character, purpose, and meaning. Our understanding of nature’s meaning arises in the context of our scientific experience of the world, including randomness and genetic predisposition. Humanity, as created in the imago Dei, becomes a metaphor for the meaning of nature. Human sin represents the epistemic distance between the actual human condition and the primordial intentionality and love that God bequeaths the world. The key question for relating theology to science should be whether we believe that what governs the world at its depths is divine love. The Incarnation and the sacraments are pivotal theological affirmations that nature is capable of being an instrument of God’s will and purpose. In Jesus Christ we discover both the normative image of God and the instantiated character of God’s freedom, intentionality, and love.

Hefner concludes with six claims regarding the created co-creator as a fully natural creature illuminating the capabilities of nature, the convergence of the human project and the project of nature, and the transcendence and freedom of both. The evolution of the created co-creator reveals that nature’s project is also God’s project, and the human project must be in the service of nature’s project. The imago Dei, and particularly Christ, gives content to God’s intentionality for this project.

Pope John Paul II. “Message to the Vatican Observatory Conference on Evolutionary and Molecular Biology.”

In a cordial greeting to the 1996 conference out of which this volume emerged, John Paul II emphasized that the search by philosophers, theologians, and scientists for a fuller understanding of life in the universe and the role of humanity is consistent with the Church’s commitment to intellectual inquiry. Still, science, philosophy and theology can benefit humanity only if they are grounded in truth as found in the works of the Creator and particularly in the human person, created in God’s image. They help to clarify the vision of humanity as the focus of creation’s dynamism and the supreme object of God’s action. Thus science and the betterment of humanity are intimately linked. He closed by reaffirming his support for this series of conferences as they contribute to the exchange between religion and science.

In the following October John Paul II addressed the Plenary Session of the Pontifical Academy of Sciences. He emphasized that the questions of the origins of life and of the nature of humankind as they are explored by the sciences are of deep interest to the Church. Will the scientific conclusions coincide with, or will they appear to contradict, Revelation? Here his response is that truth cannot contradict truth. Moreover, by understanding these results and their potential impact on humankind, the Church will be strengthened in her concern regarding issues of moral conduct. He recalled the position taken by Pius XII in 1950, that evolution and theology are not in opposition regarding humanity and its vocation. He claimed that evolution, though studied as a scientific theory, can also be interpreted philosophically in many ways, and that Revelation must be considered in the process. John Paul II also noted that in 1992, when speaking in regard to Galileo, he had emphasized that exegetes and theologians must understand science as they seek a correct interpretation of Scripture.

Today the theory of evolution is no longer a mere hypothesis. It is increasingly accepted by scientists and supported by the convergence of research from a variety of fields. What then is its significance? John Paul II addressed this question by first turning to the philosophy of science, commenting on theory construction and its verification by data. A scientific theory such as evolution also draws on natural philosophy, for example, in providing an explanation for the mechanism of evolution. The result is the presence of several theories of evolution based diversely on materialist, reductionist, and spiritualist philosophies. This raises the question of the authentic role of philosophy and of theology in these discussions.

Of direct concern to the Church is the concept of human nature, particularly the imago Dei. Being of value per se and capable of forming a communion with others and with God, people cannot be subordinated as a means or instrument. John Paul II then reiterated the position stressed by Pius XII: while the human body comes from pre-existent living matter, the spiritual soul is immediately created by God. He rejected a view of mind as epiphenomenal or as emergent from matter as incompatible with the Church’s view of the nature and dignity of the human person. Instead humanity represents an ontological difference from the rest of nature. Such a claim is not irreconcilable with the physical continuity pointed to by evolution, since the transition to the spiritual is not observable using scientific methods. Here philosophy is needed to account for self-awareness, moral conscience, freedom, and so on, and theology then seeks out their ultimate meaning. John Paul II concluded by reminding us that the Bible offers us an extraordinary “message of life,” calling us to enter into eternal life while speaking of God as the living God.

Murphy, Nancey. “Supervenience and the Nonreducibility of Ethics to Biology.”

Nancey Murphy considers the role of supervenience in the relation of evolutionary biology and ethics. She argues that God acts in the moral as well as the biological sphere. She also challenges the attempt to reduce morality to biology and she affirms the claim that objective moral knowledge is dependent on a theological framework.

According to Murphy, the contemporary atomist-reductionist program can be traced to the rise of modern science with the transition from Aristotelian hylomorphism to Galilean atomism. Following the success of Newtonian mechanics, atomism was extended to chemistry. In the nineteenth century, attempts were made to extend atomism to biology and, in our century, to the human sphere, including psychology and a general epistemological theory of the relations between the sciences. In this view, the world is seen as a hierarchy of levels of complexity with a corresponding hierarchy of sciences. Causation is strictly bottom-up from the lowest level to the top of the hierarchy. Murphy draws on Francisco Ayala’s classification and lists six types of reductionism, including methodological reductionism which, by its success, lends credence to the other five. Bottom-up causality and the claim that the parts take priority over the whole embody what is, in fact, a metaphysical assumption. Modern thought also extended atomism to areas such as political philosophy and ethics through the writings of Thomas Hobbes and Immanuel Kant. Still, the determinism of the laws of physics has raised problems for human behavior. Materialists like Hobbes have simply accepted determinism, while dualists such as Descartes have sought to avoid it, but in doing so have raised the mind-body problem and, a century later, the resulting separation of the natural and social sciences.

Murphy concludes that a proper understanding of ethics and biology will require a thorough response to reductionism. The key will be nonreductive physicalism, originating in the work of Roy Wood Sellars, and now including the focus on emergent properties of hierarchical systems. Sellars, in emphasizing the significance of organizations and wholes, opposes Cartesian dualism, absolute idealism, and reductive materialism. Murphy then cites scientific evidence for nonreducibility: interactions with the environment portray an entity as part of a larger system and entail both top-down and whole-part analyses which complement the bottom-up approach and lead to language about emergence. The irreducibility of concepts entails the irreducibility of laws. But how are we to explain these problems with reducibility if our analysis, though multi-layered, still refers to one reality?

Here Murphy introduces the concept of supervenience in order to give a non-reducible account of morality and mental events. R. M. Hare introduced the term in 1952, and Donald Davidson developed it in 1970. It is now widely used in philosophy, though not without disagreement. Murphy’s emphasis is on circumstance: identical behavior in different circumstances could constitute different moral judgments, and such supervenient properties are not necessarily reducible. Then she returns to the hierarchical model of the sciences with a branching between the human and physical sciences she and George Ellis suggested. The human sciences are incomplete without ethics. In this scheme, the moral supervenes on the biological if biological properties constitute moral properties under the appropriate circumstances. She draws on Alasdair MacIntyre and on Philip Kitcher’s critique of E. O. Wilson to argue that claims to reduce ethics to biology are due to meta-ethical ignorance, challenging Richard Alexander and Michael Ruse in passing. Instead Murphy places ethics between the social sciences and theology. The former, rather than being value-free, now involves moral presuppositions and ethical questions. Ethics traditionally drew upon a concept of humankind’s ultimate telos. The Enlightenment severed the ties to this tradition but kept the same moral prescriptions. Modern philosophers, failing to find an objective basis for moral discourse, reduced moral claims to empirical observations. All of this is due to our forgetting that the “ought” is only half of a moral truth; it is actually connected to a telos and a specific “is” statement. Murphy next discusses the supervenience relations that obtain as we move up the scale. The predicate of virtue supervenes on psychological characteristics, but only if there are nonreducible circumstances related to the ultimate goals of human existence.

Murphy concludes with the relation between ethics and biology. Here we extend the subvenience of virtues downward in the hierarchy. According to MacIntyre, virtues are acquired and not genetically determined traits. Still ethics will have much to learn from biology. Finally, all moral systems are dependent for their justification on beliefs about reality. Murphy dismisses claims like that of Jacques Monod regarding the meaninglessness and purposelessness in nature. Instead, we must reject reductionism by emphasizing the top-down connections from theology to ethics as well as the general inadequacies of reductionism.

Peacocke, Arthur. “Biological Evolution–A Positive Theological Appraisal.”

As Arthur Peacocke observes, the nineteenth-century theological reaction to Darwin was much more positive, and the scientific reaction much more negative, than many today care to admit. Current theology, however, is far less open - a “churlishness” which Peacocke is committed to rectifying for the sake of the believability and intellectual integrity of Christianity. To do so he turns directly to five broad features of biological evolution and the theological reflections they suggest.

The first is continuity and emergence. Although the seamless web of nature is explained by scientists using strictly natural causes, biological evolution is characterized by genuine emergence and a hierarchy of organization, including new properties, behaviors, and relations. Such emergence entails both epistemic irreducibility and a putative ontology. Emergence, in turn, is God’s action as the continuous, ongoing, and immanent Creator in and through the processes of nature.

The second feature is the mechanism of evolution. Although biologists agree on the central role of natural selection, some believe selection alone cannot account for the whole story. Peacocke describes eight approaches to the question which operate entirely within a naturalistic framework, assume a Darwinian perspective, and take chance to mean either: 1) epistemic unpredictability, arising either a) because we cannot accurately determine the initial conditions, or b) because the observed events are the outcome of the crossing of two independent causal chains, or 2) inherent unpredictability as found at the subatomic level. Chance characterizes both mutations in DNA (types 1a and/or 2), and the relation between genetic mutation and the adaptation of progeny (type 1b). Chance in turn was elevated to a metaphysical principle by Jacques Monod, who rejected God’s involvement in evolution, but Peacocke disagrees. Instead, chance connotes the many ways in which potential forms of organization are thoroughly explored in nature. Rather than being a sign of irrationality, the interplay of chance and law are creative over time. This fact, for the theist, is one of the God-endowed features of the world reflecting the Creator’s intentions.

Next Peacocke raises the question about trends, properties and functions which arise through, and are advantageous in, natural selection. Drawing on G. G. Simpson and Karl Popper, Peacocke claims that there are “propensities” for such properties. Examples include complexity, information-processing and -storage ability, and language. They characterize the gradual evolution of complex organisms and contribute to the eventual existence of persons capable of relating to God. Thus the propensities for these properties can be regarded as the intention of God who continuously creates through the evolutionary processes, though without any special action by God at, say, the level of quantum mechanics or genetic mutations.

The fourth feature is the ubiquity of pain, suffering and death in nature. Pain and suffering are the inevitable consequence of possessing systems capable of information processing and storage. Death of the individual and the extinction of species are prerequisites for the creation of biological order. Complex living structures can only evolve in a finite time if they accumulate changes achieved in simpler forms, and are not assembled de novo. This includes both the predator-prey cycle, which involves eating pre-formed complex chemical structures, and the modification of existing structures via biological evolution. This, in turn, raises the problem of theodicy. Peacocke stresses that God suffers in and with the suffering of creatures, and cites support from current theologians who reject divine impassibility. God’s purpose is to bring about the realm of persons in communion with God and with each other. Moreover, God’s suffering with Christ on the cross extends to the whole of nature. Death as the “wages of sin” cannot possibly mean biological death; this requires us to reformulate the classical theology of redemption. The reality of sin must consist in our alienation from God, a falling short of what God intends us to be. It arises because, through evolution, we gain self-consciousness and freedom, and with them, egotism and the possibility of their misuse.

In his final section Peacocke turns to the theological significance of Jesus Christ in an evolutionary perspective. Christ’s resurrection shows that such union with God cannot be broken even by death. His invitation to follow him calls us to be transformed by God’s act of new creation within human history. But how is this possible for us now? This leads Peacocke to the problem of atonement. Since he rejects objective theories that link biological death to sin and the Fall, the suffering of God and the action of the Holy Spirit in us together must effect our “at-one-ment” with God and enable God to take us into the divine life.

Peters, Ted. “Playing God with Our Evolutionary Future.”

The theory of evolution leads Ted Peters to emphasize that God’s creation is not fixed but changing, and this may apply even to human nature. But should we seek to influence our own genetic future? Some denounce this as “playing God,” especially when it comes to germ-line intervention, but Peters analyzes this term in light of a Christian theology of creation. He claims that God’s creative activity gives the world a future, that humans are “created co-creators,” as Hefner puts it, and that we should be open to improving the human genetic lot and thus to influencing our evolutionary future.

Peters suggests three meanings for the term, “playing God”: learning about God’s awesome secrets through science and technology; making life and death decisions in medical emergencies; and substituting ourselves for God. Concern for the third may lead to an attack on human pride when we mistake knowledge for wisdom or ignore the problem of unforeseen consequences of germ-line intervention. A separate concern is that DNA is sacred and should be off-limits to humans. Here Peters cites Jeremy Rifkin, who appeals to naturalism or vitalism in defense of leaving nature alone. Still neither Christian theologians nor molecular biologists are likely to agree with Rifkin - though for different reasons.

Actually the term “playing God” raises the question of the relationship between God and creation in terms of both creatio ex nihilo and creatio continua. In Peters’ view, giving the world a future is God’s fundamental way of acting as creator. God creates new things including the new creation yet to come. The human is a “created co-creator”: we are part of what God alone creates ex nihilo, yet we can have a special influence on the direction of what God continues to create. Indeed the meaning of imago Dei may well be creativity. Though the results of human creativity are deeply ambiguous ethically, we cannot not be creative. The ethical mandate concentrates instead on the purposes towards which we direct our work.

Peters next turns to the Human Genome Project (HGP). The aim of the HGP is new knowledge and the betterment of human health, but the germ-line debate is over taking actions now that might potentially improve the health or relieve the suffering of people who do not yet exist. Should we instead only engage in somatic therapy or should we undertake germ-line therapy, and should the purposes of the latter include enhancement? Peters cites a number of church documents which call for caution or for limitations to somatic procedures; still others are open to germ-line therapy. At stake is the implicit association with eugenics.

By and large, religious ethical thinking is conservative, seeking to preserve the present human gene pool for the indefinite future. From this perspective, germ-line intervention triggers the sense of “playing God” both physically (we might degrade the biodiversity required for good health) and socially (we might contribute to stigma and discrimination). But Peters is critical of this perspective, because it assumes that the present state of affairs is adequate, and it ignores the correlation between conceiving of God as the creator and the human as created co-creator. Instead, we are called to envision a truly better future and press on towards it, though certainly with caution, prudence, and a clear concern for our hubris.

After recounting Eric Juengst’s summary of arguments for and against germ-line intervention, Peters turns to the position paper by the Council for Responsible Genetics (CRG). The strongest argument is that germ-line intervention will reinforce social discrimination. Peters endorses the CRG’s concern: the definition of the ideal norm may be governed by economic and political advantage. He reaffirms human dignity regardless of genetic inheritance and the technical possibilities in genetic engineering, but he recognizes that the underlying reasons for prejudice and discrimination today do not lie in germ-line intervention. A more serious challenge raised by the CRG concerns persons who do not yet exist: should they be given moral priority over present populations? Future generations might blame us for acting, or not acting, in terms of germ-line interventions, and, arguably, we are morally accountable to them. But the problem is complex, as Hardy Jones and John A. Robertson point out, since actual, generational differences would occur depending on whether or not we intervened. We clearly need an ethical framework that is grounded in God’s will for the future, for the flourishing of all humanity, and for that which transcends particular concerns for contingent persons.

And so Peters asks the key question: would a future-oriented theology, with the human as co-creator, be more adequate than the CRG’s proposal? His response is affirmative: a future-oriented theology would not give priority to existing persons; it is realistic about nature as inherently dynamic; and our task is to seek to discern God’s purpose for the future and conform to it, while resisting the status quo. Rather than “playing God,” in directing ourselves to that future we are being, in Peters’ words, truly human.

Russell, Robert John. “Special Providence and Genetic Mutation: A New Defense of Theistic Evolution.”

Robert Russell works within the context of theistic evolution: biological evolution is God’s way of creating life. God is both the transcendent source ex nihilo of the universe as a whole, including its sheer existence at each moment and the laws of nature, and the immanent Creator of all physical and biological complexity, acting continuously in, with, under, and through the processes of nature. But can we press the case further and think of God’s special providence in nature? And can we do so without viewing God’s action as an intervention into these processes and a violation of the laws of nature?

To many theologians, the connection between special providence and intervention has seemed unavoidable, leaving them with a forced option. 1) Liberals, attempting to avoid interventionism, reduce special providence to our subjective response to what is simply God’s uniform action. 2) Conservatives support objective special providence and accept its interventionist implications. The purpose of Russell’s paper is to move us beyond these options to a new approach: a non-interventionist understanding of objective special providence. This is only possible theologically if nature, at some level, can be interpreted philosophically as ontologically indeterministic in light of contemporary science. Russell’s claim is that quantum mechanics provides one such possibility. Moreover, since quantum mechanics underlies the processes of genetic mutation, and since mutation together with natural selection constitute the central features of the neo-Darwinian understanding of evolution, then we can view evolution theologically as genuinely open to objective special providence without being forced into interventionism.

In section two, Russell claims that his project is neither a form of natural theology or physico-theology, nor an argument from design. Instead it is part of a general constructive trinitarian theology pursued as fides quaerens intellectum. He suggests why a non-interventionist view of objective special providence should be important theologically. He argues for an indeterministic interpretation of quantum physics. Finally, he addresses three scientific issues regarding the role of quantum mechanics in genetic mutation and the role of genetic variation in biological evolution.

Section three reviews the history of the project, beginning with the writings of Karl Heim and William Pollard in the 1950s and including recent works by Arthur Peacocke and John Polkinghorne. One key question is whether God acts in all quantum events (as Nancey Murphy claims) or merely in some (as Tom Tracy suggests). Another regards the problem of theodicy when God is taken as acting throughout evolution. Russell closes this section by reflecting on issues raised by these authors.

Section four addresses three caveats. First, Russell’s hypothesis is not meant as an explanation of “how” God acts, but merely one domain where the effects of God’s special action might occur. Second, it is not meant as either an epistemic or an ontological “gaps” argument. Still quantum mechanics may one day be replaced. Russell’s methodology is intentionally designed to handle “gaps” like this by incorporating implications from physics and philosophy into constructive theology while keeping theology open to changes in these implications. Third, Russell’s argument is not meant to exclude divine action at other levels in nature or “top-down” and “whole-part” approaches. However, these are unintelligible without intervention until the evolution of sufficiently complex phenomena. This leaves a bottom-up approach via quantum mechanics the most reasonable option for the early sweep of evolution.

Section five engages two final challenges. First, “chance” in evolution challenges the possibility of God achieving a future purpose by acting in the present. Russell responds that God acts not by foreseeing the future from the present but by eternally seeing the future in its own present. In passing Russell comments on a potential conflict with the implications of special relativity regarding this claim. The second challenge is theodicy. Russell notes that suffering, disease and death are conditions required for the evolution of freedom and moral agency. He suggests that we relocate the question of God’s action in evolution to a theology of redemption and eschatology if we are to address adequately the problem of theodicy.

Stoeger, William R., S.J. “The Immanent Directionality of the Evolutionary Process, and its Relationship to Teleology.”

Is there an immanent directionality in nature? If so, can science discover it or must we turn to philosophy and theology to recognize it and its significance? According to William Stoeger, some scientists and philosophers conclude from the variety and interrelatedness of nature that there must be a universal plan to it all. Many even assume there must be overarching holistic laws of nature which constrain the universe to behave in ways which clearly manifest purposes or ends. Many other scientists and philosophers, however, report they have found no evidence of an immanent directionality in nature. Often it is even presupposed that there is no overall directionality - much less teleology - in evolution, that complete randomness and uncertainty prevail, presided over only by the laws of physics, chemistry, and biology.

Stoeger’s aim is to show first that there is a directionality, perhaps even a teleology, immanent in nature that can be discovered through the natural sciences as they study the emergence of physical and biological structure, complexity, life, and mind. He intentionally stresses this point since so many scientists deny it. Stoeger, however, believes that the discoveries of the natural sciences can be harmonized with an adequate understanding of God’s creative action in the world without postulating holistic laws or teleological mechanisms beyond those described by the sciences. The evidence at the scientific level also seems to rule out the necessity, and even the possibility, for divine intervention to complement the principles and processes accessible to science. Finally theology can refer to divine action and teleology, but the results of science should place constraints on the way it describes them. Moreover the laws of nature, as they function within creation, are one of the key ways in which God acts in the universe.

First, Stoeger describes the epistemological and metaphysical assumptions which underlie his approach. He then turns to a lengthy discussion of the scientific account of directionality drawing on such areas as cosmology, astronomy, chemistry, geophysics, biology, self-organization, and Boolean networks. The global cosmic directionality is given by the expanding, cooling universe. More specific sequential focusing of directionality occurs as galaxies and stars form, and they in turn provide stellar and planetary environments in which chemical and biological complexifications may arise. Stoeger goes into considerable scientific detail to show that a definite directionality is established, maintained, and narrowed in the process. Randomness does play an essential role, as do catastrophes - enabling the emergence of variety and diversity - but always within the larger framework of order and regularity.

Thus, the directionality inherent in the evolutionary process is seen in terms of its hierarchically nested character and the way this reflects the structure of the universe as a whole. This means that, for a particular configuration at a given moment, only a certain range of configurations at successive moments are possible. This hierarchical nestedness means further that these directed configurations occur on all levels - with very general types of directionality being characteristic of more global levels (those of the observable universe, or of our own galaxy) and more focused specific directionalities arising on more local levels (those of a given planet, a given organism, or community of organisms).

But in what sense does this directionality constitute a teleology? Stoeger argues that a system can be teleological without necessarily involving a blueprint for a final product. It need only move towards realizing possibilities in an ordered way, partly under the evolving conditions of its ecological environment. The realization of any given possibility typically presupposes the prior realization of other possibilities. The directionalities in nature are flexible and pliable, not fixed. Nor do they indicate consciously directed intention, at least from the point of view of the natural sciences - nor do they rule it out, and this is partly due to their own limitations. Stoeger considers a similar question regarding philosophy before turning to theology. Here he claims that the Christian tradition inevitably involves conscious divine purpose in creation, including the overall process of evolution. He concludes with some thoughts on the difference between “end-resulting,” “end-directed,” and “goal-seeking” forms of teleology.

Tracy, Thomas F. “Evolution, Divine Action, and the Problem of Evil.”

Thomas Tracy addresses two of the central challenges that evolutionary theory poses for theology. First, how might we understand God as creatively at work in evolutionary history? Given the prominent role played by chance in evolution, Jacques Monod and others have contended that the meandering pathways of life cannot enact the purposes of God. Second, how can the affirmation of God’s perfect goodness and creative power be reconciled with the ubiquitous struggle, suffering, and death that characterize evolution? This, of course, is a form of the problem of evil.

The first question demands a theological interpretation of evolution. Tracy distinguishes his project both from natural theology, which attempts to argue from nature to God, and from any theological competition with biology, which attempts to show the inadequacy of naturalistic explanations in order to substitute theological ones. Instead, given the best current evolutionary theory, Tracy asks how a theologian concerned with divine action and providence might conceive of God’s relation to the history of life.

In response, Tracy argues that there are several ways in which God might be understood to act in and through evolutionary processes. God acts universally to create and sustain all finite things. In so doing, God may choose to fix the course of events in the world by establishing deterministic natural laws. In this case every event in cosmic history could be regarded as an act of God. There are good reasons, however, both theological and scientific, to reject this universal determinism. From science it appears that indeterministic chance is built into the structures of nature, and that chance events at the quantum level can both constitute the stable properties of macroscopic entities and affect the course of macroscopic processes. Chaotic dynamics and evolutionary biology provide two key examples here. This in turn creates several fascinating possibilities for conceiving of divine action in the world. Perhaps a “hands-off” God leaves some features of the world’s history up to chance. Or perhaps God chooses to act at some or all of these points of indeterminism. In this way God could initiate particular causal chains without intervening in the regular processes of nature. Tracy notes that there are conceptual puzzles raised by each of these ideas, such as whether God determines all or just some of these events - and thus the relative theological merits of each would need to be debated. But this variety of options for conceiving of divine action makes it clear that the first challenge can be met.

If we succeed in constructing a theological interpretation of evolution, we are immediately confronted by a compelling form of the problem of evil. How can belief in God’s loving care for creation be reconciled not just with moral evil in the human sphere but also with the hardship, pain, suffering, and death that characterize evolutionary processes, or what is called “natural evil”? According to Tracy, any morally sufficient response must identify the good for the sake of which evil is permitted, and it must explain the relation of evil to this good. One standard approach is to argue that God must permit some evils as a necessary condition for achieving various goods in creation. John Hick, for example, holds that the good of “soul making” requires that persons be free to develop their intellectual, moral, and spiritual capacities by acting in an environment that is lawful, impersonal, and at an “epistemic distance” from God. This entails both that we can do moral evil and that we (and other sentient beings) will suffer from natural evil. Clearly evolution can be accommodated within such a theodicy, though Tracy admits it is probably not required.

If we grant that the good cannot be achieved without permitting these evils, we may nonetheless object that the world contains far more of them than would be necessary to serve God’s purposes. It is not difficult to think of evils that, as far as we can see, do not lead to any greater good and could readily have been prevented. Tracy’s response is that God must permit evil that does not serve as the means to a greater good if it occurs as a necessary by-product of preserving moral freedom and the integrity of the natural order. Precisely because these “pointless” evils do not generate particular goods, they will appear to us to be unnecessary. However, evils of this type must be permitted by God, and it will be up to us to prevent or ameliorate them. From the fact that we cannot see a point to an evil, therefore, it does not follow that God should have prevented it. A world that includes the good of personal relationship with God must apparently include pointless evil.

But just how much pointless evil is really required? Does the world instead contain gratuitous evils? The problem here is, first of all, in assuming that we can calculate what could be considered the minimum amount of acceptable pointless evil, and thus that we could quantify and balance goods and evils. The deeper problem is in the assumption that the world really does include gratuitous evils. In fact we cannot even conclude that some evil is gratuitous merely because we cannot think of a reason for God’s permitting it. Moreover we must recognize that we are in no position to see how each evil fits into the overall course of cosmic history, to comprehend all of the goods to which it may be relevant, or to recognize all of the consequences of eliminating it. In grappling with the reality of evil, we confront the limits of human comprehension and are forced to accept epistemic humility, as the Book of Job makes plain in God’s speech from the whirlwind. We cannot expect to solve the problem of evil. Instead the central task for Christian faith in the face of evil is to proclaim and understand what God is doing to suffer with and to redeem creation.

Wildman, Wesley J. “Evaluating the Teleological Argument for Divine Action."

But can we make a connection between the appearance of purposes or ends in nature and the reality of divine action by arguing that such ends indicate genuine teleology and that such teleology is a mode of God’s action? Wesley Wildman calls this position “the teleological argument for divine action” and provides an extensive analysis of its problematic status. Its more aggressive form as a design argument, as given by William Paley and Alfred Russel Wallace, has been undermined by evolutionary biology, but more modest forms are still possible. Wildman claims that no detailed, supportive argument from biology to theories of divine action can be given, nor can evolution destroy such theories, but biology can influence them to some degree. The principal reason is the “metaphysical ambiguity” which affects each of three major steps in the argument, though it generally goes unnoticed.

Before proceeding, Wildman distinguishes between apparent and real ends, and between open-ended and closed-ended processes. In addition, given the dubious aspects of Aristotle’s use of final causes and the equally unnecessary abandonment of teleology in contemporary causal explanation, he includes a four-fold schema regarding teleology. 1) Against Dawkins, who argues that ends are only apparent, Wildman cites Monod’s views about teleonomy and chaos. 2) Given that complex systems are open-ended, he asks whether one specific end can be achieved, and whether it can be achieved by an intentional agent. 3) Moreover, are high-level characteristics of living systems due to the complex external arrangement of their parts, or to the emergence of genuine internal relations? 4) Is teleology expressed in the laws of nature, in chance, or in some basic constituent of nature like mind?

Now Wildman turns to the first stage in the argument. Given that the appearance of ends in nature is ubiquitous, can we establish that real purposes give rise to them? And preliminary to this, are apparent ends in nature merely apparent? According to Wildman, modern biology has produced the strongest possible reason for answering these affirmatively with its use of efficient causal explanations of apparent ends. Still, the conclusion depends on two points which Wildman carefully criticizes: a principle of metaphysical minimalism and a claim that all ends in nature, outside those achieved by agents, can be explained exhaustively and solely by means of efficient causes. His position includes scientific and philosophical arguments about the sufficiency of efficient causal explanations.

The second stage requires a metaphysical bridge between real ends in nature, as in the preceding stage, and broad teleological principles which can be connected with a theory of divine action. The problem here is that some teleologies are not amenable to such theories. Foremost is Aristotle’s metaphysics, with its unmoved prime mover. Others include the Chinese concept of li, the Buddhist view of a purposive nature without God, and theistic mystical theology.

In the third stage Wildman traces the links between metaphysical contextualizations of teleology and theories of divine action, stressing the metaphysical ambiguity which complicates these links. If the locus of the teleological principle is natural law, divine action can only involve the universal determination of natural possibilities and the ontological grounding of nature. If the locus includes chance, divine action can be expressed more directly. If it also includes the constituents of nature, God may offer the initial aims to actual occasions, or the material conditions for the emergence of self-organizing systems.

According to Wildman, then, the teleological argument for divine action is not easily established. There is no unbroken chain of implications from apparent ends in nature to real ends to fundamental teleological principles to the modes of divine action. Additional premises are needed to connect the chain, and none of these is furnished by biological evolution. Moreover, there are profound teleological visions that are antagonistic toward divine action and are equally well supported by evolution. The failure of the teleological argument is located in its underlying metaphysical ambiguity. On the other hand, apparent purposes in nature are not incompatible with teleological theories of divine action; indeed, the implications run more smoothly in this direction. In closing, Wildman offers a final conjecture: if one’s premise is that the universe is meaningful, then one is led to affirm that the universe has an overarching teleological sweep. Alternatively, those popular writers on biology who avoid postulating a fundamental teleological principle must either assume that the cosmos is absurd or refuse to consider the implications of their premise.

Neuroscience and the Person

This collection of twenty-one research papers is the result of the fourth of five international research conferences co-sponsored by the Vatican Observatory in Rome and the Center for Theology and the Natural Sciences. The articles explore the creative interaction among the cognitive neurosciences, philosophy, and theology. The volume begins by pairing ‘snapshots’ of neuroscientific research with biblical and philosophical/theological accounts of human nature to set the poles between which the volume’s dialogue proceeds. Themes include bridges between neuroscience and philosophy; neuroscience and Christian anthropology; and contrasting reflections on the question of whether a theistic account is needed to make sense of human nature.

Arbib, Michael A. “Towards a Neuroscience of the Person."

Michael A. Arbib, in his essay “Towards a Neuroscience of the Person,” provides an excellent framework for relating the neurosciences to the concerns of the human sciences and theology. The organizing idea of his essay is the following: a discussion of what neuroscience has to contribute to an emerging science of the person will provide a bridge between the narrow foci of individual researchers’ efforts in the cognitive neurosciences, on the one hand, and the far broader but less scientifically grounded considerations of humanists, including theologians, on the other, as they seek to explicate the nature of the person.

Arbib begins with a survey of topics on which neuroscience has offered insights into mental phenomena such as memory, emotion and motivation, social behavior, and language. This sampling of scientific developments raises the question whether the cognitive neurosciences will eventually provide a framework for understanding all of the phenomena that define human nature. In particular there is the question whether the study of the brain can explain the religious dimension of human life, or whether the subject-matter of theology will always elude neuroscientific investigation. Arbib maintains that a complete science of the person must take account of theology, but argues that theology ought to be understood not as the science of God but as the study of human belief in God. This latter understanding would open the discussion for nonbelievers (such as Arbib himself) but would incorporate the former understanding of theology if God in fact exists. Neuroscience cannot address the concept of God directly but can make progress toward theological questions, especially if theology is defined in the broad sense.

Another important issue is the relation of neuroscience to questions of morality. Arbib notes that both religion (even on a nontheistic account) and neuroscience can provide insight. Neuroscience cannot answer questions of right and wrong, but it can elucidate aspects of morality such as decision-making, empathy, and social behavior.

Arbib then sketches the possible role of computational neuroscience in bridging levels between neuron and person. Schema theory provides a link between “cognition-level” and “neuron-level” descriptions of the person. Basic schema theory operates at the level of cognitive science, and explains mental operations and behavior in terms of functional units. There are schemas for recognition of objects, planning and control of actions, and more abstract operations as well. Mental life and behavior result from the dynamic interaction, cooperation, and competition of many schema instances. The individual can be understood as a self-organized “schema encyclopedia.” Schema theory provides a bridge between neuroscience and the humanities: it can be extended “downward” by studying the neural realizations of simple schemas; it can be extended “upward” by recognizing that schemas have an external social reality in collective patterns of thought and behavior. Arbib claims that while schema theory can contribute to many open questions regarding the dependence of aspects of the person on the brain, Christian teaching parts company with science on the issue of the resurrection.

Arbib, Michael A. “Crusoe’s Brain: Of Solitude and Society."

In the first part of “Crusoe’s Brain: Of Solitude and Society,” Michael A. Arbib develops a thesis regarding social influences on brain function and hence on brain structure. Social schema theory attempts to understand how social schemas, constituted by collective patterns of behavior in a society, provide an external reality from which a person acquires schemas “in the head.” There is thus a top-down influence of social interaction on the microstructure of the brain through evolutionary processes, with brain action effectuated through perceptual and motor schemas. Conversely, it is the collective effect of behaviors that express schemas held by many individuals that constitutes and changes this social reality.

The learning of language provides an example of how individuals interiorize social schemas. Current research on mirror neurons (neurons that are active not only when an action is performed but also when the action is being perceived) provides a hypothesis that language specialization in humans derives from an ancient mechanism related to the observation and execution of motor acts. Arbib rejects Noam Chomsky’s hypothesis that language learning depends on innate universal grammar. Instead, based on work with Jane Hill, he argues that language in children begins with repetition of words and phrases, shaped by the use of very rudimentary grammatical schemas that develop by means of (“neo-Piagetian”) assimilation and adaptation. The richness of the metaphorical character of language can be interpreted in terms of schema theory: a word or phrase is an impoverished representation of some schema assemblage. Thus, extraction of meaning is a virtually endless dynamic process.

In the second part of Arbib’s essay he applies social schema theory to a discussion of ideology and religion. Social schemas include those that we take to be representations of the world, but also others that we do not, such as ideals of human life that are never realized and models that are false but useful. While schema theory has no implications for the question of the existence of God, it does offer new and useful vocabulary for discussing the projection theory of religion, found already in the writings of Ludwig Feuerbach and Sigmund Freud. An ideology can be viewed as a very large social schema. It is, like language, something that the child comes to as an external reality and internalizes to become a member of society. While it is central to schema theory to analyze the mechanisms whereby social construction and reality depiction are dynamically interlinked, it is important to note that many “realities” are socially defined rather than “physical.” Thus, social schema theory provides a way of asking whether the “reality” of God is both external reality and social construction, or whether “God” is merely a social construct. Arbib suggests that the wide variation among religious beliefs argues for the latter conclusion. He offers this argument as an antidote to the “unabashedly Christian worldview of many other contributors to this volume, for whom the reality of divine action is taken as a given.”

Barbour, Ian. “Neuroscience, Artificial Intelligence, and Human Nature: Theological and Philosophical Reflections."

Ian Barbour, in “Neuroscience, Artificial Intelligence, and Human Nature: Theological and Philosophical Reflections,” develops a three-stranded argument, in which he sets out to show that it is consistent with neuroscience, computer science, and a theological view of human nature to understand a person as a multilevel psychosomatic unity who is both a biological organism and a responsible self. He considers the themes of embodiment, emotions, the social self, and consciousness.

Barbour surveys biblical and theological accounts of the person that emphasize the integration of body and mind, reason and emotion, individual and social groups. He then cites work by neuroscientists that highlights these same features, including Arbib’s action-oriented schema theory, LeDoux’s work on emotions, and Brothers’ work on the neural bases of social interaction. The ways in which computers fall short of human capacities provides additional insight into human nature: to approach the level of human functioning, computers require analogues to embodiment, learning and socialization, and emotion. The question of the possibility of consciousness in a computer is particularly problematic. Barbour shows that the concepts of information, dynamic systems, hierarchical levels, and emergence are valuable for integrating insights from neuroscience and AI research with that of theology in a theory of human nature.

Barbour argues that process philosophy provides a supportive metaphysical framework for understanding the concept of human nature that he has developed in this essay. Alfred North Whitehead’s philosophy emphasizes processes or events rather than substances. These events are all of one kind (thus, monism) but are all dipolar - they have both an objective and a subjective phase. Thus, in attenuated form, experience can be attributed not only to humans and animals, but also to lower forms of life, and even to atoms. In its own way, process philosophy emphasizes the same themes that Barbour traced through theology, neuroscience, and AI research. So Barbour concludes that a dipolar monism based on process philosophy is supportive of a biblical view of the human as a multilevel unity, an embodied social self, and a responsible agent with capacities for reason and emotion.

Brothers, Leslie A. “A Neuroscientific Perspective on Human Sociality."

In “A Neuroscientific Perspective on Human Sociality,” Leslie A. Brothers describes recent findings on the neural substrate of social behavior. A wide variety of evidence points to the role of the amygdala in processing information crucial for social interactions: (1) Monkeys with experimental lesions of the anterior temporal lobes (where the amygdala is located) have particular difficulty responding to the social signals of other monkeys. (2) Human patients with lesions of the amygdala have difficulty interpreting facial expressions, direction of eye gaze, and tone of voice. (3) Tests during neurosurgery show that a person’s ability to identify facial expressions can be disrupted by electrical stimulation to temporal-lobe regions. (4) Researchers studying vision in monkeys found temporal-lobe neurons that seemed to be responsive only to social visual stimuli such as faces. Brothers’ own research involved recording the activity of individual neurons in the region of the amygdala while monkeys watched video clips of other monkeys engaged in a number of activities. Her results showed that some nerve cells are particularly attuned to respond to movements that bear social significance, such as the specific yawn that males use to signal dominance.

The picture that is emerging from human and monkey studies, says Brothers, is that representations of features of the outside social world are first assembled in the temporal lobe cortices of the primate brain. Meaningful social events are registered when a host of signals and relevant contextual information are integrated. Our brains need to tell us the difference between someone approaching with friendly intent and someone whose aims are hostile, for example. The visual features of a face have to be put together to yield an image of a particular individual so that past interactions with this individual can be recalled. Next, movements of the eyes and mouth indicate the person’s disposition. Information from head position and body movement tells where this person is looking or going, providing raw material for the representation of a mental state such as his or her goal or desire. As these processes are taking place, the neural representation of others’ social intentions must be linked to an appropriate responsive behavior in the perceiver. Response dispositions should be set into play “downstream” from the temporal cortices, where face-responsive neurons have been found, in structures such as the amygdala. The amygdala, together with several other interconnected structures, receives sensory information and in turn projects directly to somatic effector structures such as the hypothalamus, brainstem, and primitive motor centers, making it a candidate for the link between social perception and response.

Brothers notes that human social interaction depends on the ability to employ the concept of person - a mind-body unit. What the research summarized here suggests is that the evolution of our brains has made it possible for us to construct and participate in the language-game of personhood; we have brains specially equipped for social participation.

Clayton, Philip. “Neuroscience, the Person, and God: An Emergentist Account."

Philip Clayton, in “Neuroscience, the Person, and God: An Emergentist Account,” provides a fine overview of issues spanning the range from neuroscience, through philosophy of mind, to theology. Beginning, as does Arbib, with a list of some of neuroscience’s achievements in understanding human phenomena, Clayton points out that either of two extreme positions, if true, would block any significant dialogue between the neurosciences and theology. On the one hand, strong forms of dualism that make mind into a separate substance remove mental phenomena forever from the realm of scientific study. On the other hand, eliminative materialism - the view that “folk” psychological entities such as beliefs and desires do not exist - resolves the debate with theology by removing theology altogether from the realm of possible theories. For this reason, Clayton’s essay challenges the “Sufficiency Thesis,” according to which neuroscientific explanations will finally be sufficient to fully explain human behavior.

Once these two extreme views, ontological dualism and radical reductionism, have been dismissed, a wide range of interesting possibilities remains for integrating neuroscientific results and theological interpretations into a theory of the person. One set of issues hinges on how one understands the epistemological status of theology. Clayton advocates the view that while religious beliefs are not subject to proof or confirmation from science, they need to be answerable to scientific advances in the weaker sense of not being counter-indicated by the empirical sciences.

The more difficult issues have to do with interpretation of the results of neuroscience. It is clear that neural states are major determinants of subjective experience and thought, yet Clayton takes the “structural couplings” between the conscious organism and its environment, the phenomena of reference and meaning, and the experience of “qualia” (the subjective side of conscious experience) to suggest that mental events or properties are not thoroughly reducible to neural states. Clayton, along with others in this volume, understands mental events as supervenient on their physical substrates; however, along with Murphy, he challenges the standard accounts of supervenience that seem inevitably to result in causal reduction of the mental to the physical. Clayton’s version of “soft” or “emergentist” supervenience defines a property F as emergent if, and only if, there is a law to the effect that all systems with this microstructure have F, but F cannot, even in theory, be deduced from the most complete knowledge of the basic properties of the components of the system. If mental properties supervene on physical properties in this manner, Clayton concludes, there is room for genuine mental causation - not all causes of human behavior are purely neuronal causes.

Clayton’s account of supervenience leads to an emergentist-monist account of the person: “monist” because, while there are many types of properties encountered in the world, there is only one natural system that bears all those properties; “emergentist” because, while mental phenomena result from an (incredibly complex) physical system, the brain, they represent a genuinely new causal and explanatory level in the world. He notes that emergentist monism is open to theological applications and interpretations, although it does not require a theistic outlook.

Ellis, George F. R. “Intimations of Transcendence: Relations of the Mind to God."

In “Intimations of Transcendence: Relations of the Mind to God,” George F.R. Ellis explores a strongly theistic interpretation of religious experience. He aims to show the logical coherence of a particular “kenotic” theological position, as well as its consistency with current views within both physics and neuroscience. After outlining the position he takes on fundamental issues such as the role of models in science, the hierarchical structuring of science, and the relations between causal explanations from different levels, Ellis presents a summary of the theological-ethical position that he developed with Nancey Murphy (Nancey Murphy and George F.R. Ellis, On the Moral Nature of the Universe (Minneapolis: Fortress Press, 1996)). “Kenosis” is a term from Christology that referred originally to Christ’s “emptying” himself of divine attributes. Ellis and Murphy extend the meaning of the term, using it to describe God’s loving self-sacrifice as revealed in the life and death of Jesus. On this view, kenosis is an overall key to the nature of creation because it is the nature of the Creator. A kenotic ethic of self-giving love reflects the ultimate nature and power of God, manifest most clearly in the resurrection of Jesus.

Evidence for the theological vision proposed here comes from a variety of human experiences, which Peter Berger has termed “intimations of transcendence.” Ellis argues that while there may be evolutionary or functional explanations of moral behavior, human creativity, aesthetic appreciation, love and joy, in all of these cases there seems to be an excess. Humans, for example, sacrifice themselves not only for kin but for strangers, and human love goes beyond the bounds of the practical. However, none of these intimations is sufficient to yield a detailed account of the nature of the transcendent. Thus, Ellis asserts the need for a channel of revelation. A major goal of this essay is to argue that a view of divine action (revelation) through the mediation of the human brain is consistent with contemporary neuroscience. He speculates that the causal gap revealed by quantum theory allows for a “causal joint” whereby information may be made available to human consciousness without violation of energy conservation. However, Ellis’s argument does not depend critically on the role of quantum phenomena in consciousness, but rather on the coherence and explanatory scope of the theological vision he proposes.

Green, Joel B. “Restoring the Human Person: New Testament Voices for a Wholistic and Social Anthropology."

New Testament scholar Joel B. Green was asked to write an essay reflecting current scholarship on biblical views of human nature. In “Restoring the Human Person: New Testament Voices for a Wholistic and Social Anthropology,” Green laments the fact that recent investigations of “biblical anthropology” have focused either on the question of body-soul dualism or on a series of topics oriented around human sin and its remedies. This is unfortunate because Scripture is largely unconcerned with speculative questions about human nature; the attempt to find answers to current philosophical questions can obscure the biblical writers’ own central concerns.

Green focuses his investigation on the writings of Paul and Luke - Paul because he is regarded as the most important theologian of the apostolic age; Luke because he is arguably the only Gentile author represented in the New Testament. This suggests that if any New Testament material were to reflect the dualism alleged to characterize Hellenistic thought, it would be that of Luke.

Luke’s concern with human nature arises within the context of his understanding of salvation. Luke raises questions about what needs to be saved, and what “saved existence” would look like. Answers to these two questions point to his understanding of authentic human existence. Green examines as paradigmatic Luke’s narrative of Jesus’ healing of the woman with the hemorrhage (Luke 8:42b–48). Green finds here a holistic and social anthropology, evidenced by the fact that healing involves not merely reversal of her physical malady, but also restoration of her place in both the social world and the family of God. The importance of this (and other relevant texts) is to call into question two closely related tendencies in the twentieth-century West: to think of salvation fundamentally in “spiritual terms,” and, with respect to issues of healing and health, to think primarily in terms of bodies.

Green supports his claim for the holistic and social anthropology of the Bible by examining other Lukan texts, as well as many of the Pauline and Genesis texts that have been used in the past to warrant body-soul dualism. In addition, he criticizes the popular “word study” method of biblical interpretation that has allowed body-soul dualism to achieve a prominence in Christian thought far out of proportion to the scriptural evidence. Such an approach too easily lends itself to reading contemporary meanings into biblical terms.

Green concludes that Christians today who embrace a monistic account of humanity place themselves centrally within a biblical understanding. At the same time, he says, biblical faith resists any suggestion that our humanity can be reduced to our physicality. Furthermore, an account of the human person that takes seriously the biblical record will deny that human nature can be understood “one person at a time,” and will focus on the human capacity and vocation for community with God, with the human family, and in relation to the cosmos.

Hagoort, Peter. “The Uniquely Human Capacity for Language Communication: From POPE to [po:p] in Half a Second."

Peter Hagoort specializes in the study of the neural underpinnings of language. In “The Uniquely Human Capacity for Language Communication: From POPE to [po:p] in Half a Second,” he points out that the sophisticated capacity for language unique to humans and performed in various forms such as speaking, listening, writing, reading, and sign language, rests on a tripartite architecture: coding for meaning, for syntax, and for sound structures. A central component of language skills is the mental lexicon, a part of declarative memory that stores the meaning, syntactic properties, and sounds of roughly 40,000 words.

Hagoort has studied the order in which information is retrieved from the mental lexicon - for example, when one recognizes the image of a well-known person. Words are not discrete units, each to be found localized in some small circuit in the brain; the various components of the ability to use words are all stored differently. First there is a conceptual selection and specification process, followed by retrieval of syntactic information, and then by retrieval of a sound pattern - all of this resulting in the utterance “pope.” The different retrieval processes occur with high speed, and are temporally orchestrated with millisecond precision.

One of the ways in which the sequence of events involved in word retrieval has been studied is by recording electrical brain activity, using a series of electrodes attached to the scalp. The brain regions involved (mainly in the left hemisphere) have been localized by means of neurological data and brain imaging techniques.

Hagoort notes that the understanding of the neural substrate of language is an essential ingredient in an understanding of the human person, not only because sophisticated linguistic ability is unique to humans, but also because language itself mediates our sense of self.

Happel, Stephen. “The Soul and Neuroscience: Possibilities for Divine Action."

In “The Soul and Neuroscience: Possibilities for Divine Action,” Stephen Happel puts three notions into conversation with one another: Edmund Husserl’s philosophical interpretation of inner time-consciousness; Thomas Aquinas’s theological language of the soul; and contemporary neuroscientific analyses of human agency, memory, and bodily knowing.

Happel argues that medieval soul-language is not simply a devotional leftover from a discredited dualist substance philosophy. The concept of soul was a medieval attempt to explain the living experience of the cognitive, embodied subject. In his analysis of the role of the soul in human knowledge, Aquinas makes a variety of philosophical claims that are relevant to current research and discussion: First, human knowing is an active as well as a receptive process, dependent on the empirical world, yet critical in relationship to the world and to its own operations. Second, this knowing only takes place with the intimate cooperation of the individual’s body. Third, intelligence is open-ended; it wonders and inquires about everything within its horizon. Fourth, this intelligence can reflect upon itself. Fifth, open-ended human intelligence can go beyond the senses, intending and estimating, even understanding the reality of God. Sixth, human intelligence rightly apprehends reality through its senses and makes correct judgments on the basis of the evidence provided.

Time-consciousness is central to Husserl’s phenomenological description of human subjectivity. The agency of human consciousness is found in the retention, present awareness, and expectation that allow humans to be aware of temporally-extended objects of consciousness. There is a flow of interactions among memories, present consciousness, and future expectations that gives consciousness its unity. Happel shows that Husserl’s notion of subjective time consciousness coheres with Aquinas’s metaphysical vocabulary regarding intellectual powers: the world of interiority that Husserl examines turns from the consciousness of the subject to a self-reflexive knowledge of that subject; the unified body and soul, for Aquinas, becomes a self-conscious subject, examining itself introspectively.

Contemporary neuroscience examines time-memory, embodiment, and human initiative in the empirical subject, the knower who examines both self and the world through models and experiments. Reflecting on current theories of long-term and working memory, schema theory, somatic markers, and the hermeneutics of sense perception, Happel raises questions about human agency. He sees Husserl’s analysis of time-consciousness as a possible hypothesis for experiment and verification in the neurosciences, and he challenges the neurosciences to think about mind and consciousness not only as initiators but as radically open to their constitution as a social reality.

The examination of human consciousness in three major disciplines - philosophy, theology, and neuroscience - has as its goal the criticism of modern individualistic (solipsistic, autonomous) concepts of the human subject. Happel reasons that if the subject can be conceived as open to finite transcendence (that is, to the reality of the other in and to the subject) this should shed light on how God operates through the interaction of finite subjects in our world to bring about divine ends.

Jeannerod, Marc. “The Cognitive Way to Action."

In “The Cognitive Way to Action,” Marc Jeannerod describes research on the generation of voluntary action. He begins with an historical overview of theories in the field. Already in the 1930s researchers noted that even the simplest movements produced by the nervous system of a frog appeared to be organized purposefully. So the question was, How are these actions represented in the brain? An important advance was the recognition that behavior is guided by internal models of the external world, with predictions built in as to how the external world will be modified by the organism’s behavior and how the organism itself will be affected by the action. The existence of such models is supported by ethological studies showing that certain behavioral sequences unfold blindly and eventually reach their goal after they have been triggered by external cues. Localized brain stimulation can also trigger similarly complex actions.

Early accounts hypothesized serial steps in the neural generation of actions. However, current brain studies suggest simultaneous activation in cortical and subcortical levels of the motor system. This distributed model of action-generation raises the issue of a central coordinator to determine the temporal structure of the motor output. The behavior of patients with damage to the frontal lobes suggests to Jeannerod that the supervisor system is associated with this region.

New light is now being shed on the problem of the neural substrates of action-generation by the study of mentally simulated action. Jeannerod and his colleagues instructed subjects to imagine themselves grasping objects. Using Positron Emission Tomography (PET) and functional Magnetic Resonance Imaging (fMRI), they identified the cortical and subcortical areas involved. They were then able to show that forming the mental image of an action produces a pattern of cortical activation that resembles that involved in intentionally executing the action.

Jeannerod expects research such as this to shed light on the neural underpinnings of central aspects of the self such as intentionality and self-consciousness. He notes that there is now neuropsychological evidence for the moral dictum that “to intend is to act.”

Jeannerod, Marc. “Are There Limits to the Naturalization of Mental States?"

In “Are There Limits to the Naturalization of Mental States?” Marc Jeannerod brings further neuropsychological research to bear on the topic of intentional action and its role in constituting self-awareness. He notes that humans are social beings, and that communicating with others is a basic feature of human behavior. A long-standing philosophical question is how it is possible for one person to recognize the mental states of others. A key insight here comes from neuroscience: the neural system one uses for detecting intentions of other agents is part of the neural system that generates one’s own intentions. Evidence for this comes from studies with monkeys showing the existence of neuronal populations in several brain areas that selectively encode postures and movements performed by conspecifics. Much of this population of neurons overlaps with those involved in the generation of the monkey’s own movements. This same sort of overlapping of function is suggested by PET-scan studies in humans. When subjects were told to watch an action with the purpose of imitating it, parts of the motor cortex were activated, whereas this was not the case if subjects were told to watch only for the purpose of later recognition.

The research summarized here sheds light on the problem of other minds, but in so doing raises a new philosophical problem: if the intention of another’s action is represented in my neural system by means of the same neural activity as my own intention to act, how does this intention get attributed to the right agent? Jeannerod shows that having a neural representation of an intention and attributing it to myself are two different processes, which are not automatically linked. Jeannerod reports further research that highlights this problem. Experimental situations have been devised in which it is not obvious to the subjects whether they are seeing an image of their own hand or that of the experimenter, moving in response to instructions. When the experimenter’s hand movement departed from the instructions, subjects had no difficulty recognizing it was not their own. But in thirty percent of cases when the experimenter’s hand followed the instructions, normal subjects mistook it for their own. Schizophrenic patients misattributed the experimenter’s movements to themselves eighty percent of the time. This is consistent with clinical reports that schizophrenics suffer from a tendency to incorporate external events into their own experience.

Jeannerod ends his essay with a reflection on the limits of human abilities to know other minds. A person’s individuality resides in the fact that no two individuals ever share all of the same experiences. Thus, no two people’s global neural states will ever be the same. If neuroscientific understanding is based on similar or identical neural representations, then some aspects of personal identity are beyond the realm of scientific inquiry.

Kerr, Fergus. “The Modern Philosophy of Self in Recent Theology."

“The Modern Philosophy of Self in Recent Theology” by Fergus Kerr is reprinted here from Kerr’s book Theology after Wittgenstein (2nd ed. (London: SPCK, 1997), chap. 1) because it ably demonstrates the extent to which Christian theology carries a “metaphysical load” - an account of the human person derived from Cartesian philosophy. It is ironic that the modern philosophical conception of the self sprang, as Kerr notes, from explicitly theological concerns. In the process of demonstrating the existence of God and the immortality of the soul, Descartes articulated a conception of human nature according to which the self is essentially a thinking thing, thus redefining what it is to be human in terms of consciousness. Descartes, together with Immanuel Kant, bequeathed a picture of the self-conscious and self-reliant, self-transparent and all-responsible individual, which continues to permeate contemporary thought even where Descartes’s substance dualism has been repudiated.

Kerr examines a number of authors to show how this picture of the self shows up in recent theology, and this despite the fact that some eminent theologians, such as Karl Barth and Eberhard Jüngel, have argued that the Cartesian turn to the subject has nearly ruined theology. Kerr considers the role of the Cartesian ego in the works of Karl Rahner, Hans Küng, Don Cupitt, Schubert Ogden, Timothy O’Connell, and Gordon Kaufman.

It is always as the cognitive subject that people first appear in Rahner’s theology. “Students alerted to the bias of the Cartesian legacy would suggest that language or action, conversation or collaboration, are more likely starting points.” Rahner’s theology depends heavily on the notion of self-transcendence: when self-conscious subjects recognize their own finitude, they have already transcended that finitude. This process of self-reflection produces a dynamic movement of “ceaseless self-transcendence towards the steadily receding horizon which is the absolute: in effect, anonymously, the deity.” While Kerr recognizes the theological payoff of this move, making arguments for the existence of God redundant, it is at the expense of an account of humans as “deficient angels” - that is, as attempting to occupy a standpoint beyond immersion in the bodily, the historical, and the institutional.

From his survey of Rahner and other examples, Kerr concludes that “in every case, though variously, and sometimes very significantly so, the model of the self is central to some important, sometimes radical and revisionary, theological proposal or program. A certain philosophical psychology is put to work to sustain a theological construction. Time and again, however, the paradigm of the self turns out to have remarkably divine attributes.” The philosophy of the self that possesses so many modern theologians is a view which philosophers today are working hard to destroy. Kerr’s essay ends with a brief survey of the post-Wittgensteinian philosophers who pursue this task - most notably, Bernard Williams and Charles Taylor.

LeDoux, Joseph E. “Emotions: How I’ve Looked for Them in the Brain."

Joseph E. LeDoux specializes in the use of animal models for studying emotion. In “Emotions: How I’ve Looked for Them in the Brain,” LeDoux describes his work on fear conditioning in rats. The rats are conditioned to associate a sound with a noxious stimulus. This sound then elicits the behavioral responses accompanying the emotional experience of fear: muscle tension, release of stress hormones, and so forth. Note that LeDoux distinguishes between the behavioral system and subjective feelings. It is the former, he argues, that should be seen as essential to understanding the function of emotions.

LeDoux uses a variety of techniques to relate fear behavior to specific circuits in the brain. First, lesion studies (selective damage to parts of the brain) and brain imaging techniques make it possible to locate the general regions involved. Next the circuits activated in fear responses can be followed by injecting tracer substances into those areas and recording the “firing patterns” of neurons in relation to various emotional states under a variety of learning paradigms. In this way, LeDoux has confirmed the crucial role of the amygdala, a distinctive cluster of neurons found deep in the anterior temporal lobe of each hemisphere. Inputs to the amygdala from sensory processes in the thalamus and cortex are key to processing fear stimuli, while projections from the amygdala to brainstem areas are involved in control of the behavioral, autonomic, and hormonal responses that constitute fear behavior. LeDoux notes that a variety of other brain systems are also involved in the various feeling states we term “emotions” in humans; only empirical research will show whether his work on fear generalizes to other emotions.

LeDoux, Joseph E. “Emotions: A View through the Brain."

In his second essay for this volume, “Emotions: A View through the Brain,” Joseph E. LeDoux provides an argument for his claim that the scientific investigation of emotion requires a distinction between emotional behavior or associated physiological responses and the subjective feelings experienced by humans. Emotional behavior can be understood by the evolutionist in terms of the function it serves in human and animal life. Emotional feelings must be seen as secondary since emotional behavior is present in organisms that do not have the capacity for conscious awareness. LeDoux defines emotional feelings as a result of sophisticated brains being aware of their own activities - in this case, being aware that an emotion system, such as the fear system, is activated. The problem of explaining emotional feelings is a part of the single problem of the explanation of consciousness. However, different emotional behavior or response systems may involve different brain mechanisms. Here LeDoux is critical of the “limbic-system” theory, a theory that sought to identify a single set of brain structures involved in all emotional responses.

LeDoux summarizes in more detail here the results of research on fear conditioning in rats (see above), and notes that studies of the effects of damage to the amygdala in humans, as well as fMRI studies, show that the amygdala is the key to the fear-conditioning system in humans, as well. However, the association of fear not with the original stimulus but with the environmental context in which the stimulus was encountered appears to depend on the hippocampus.

The persistence of learned fear responses is obviously valuable for survival. However, the inability to inhibit unwarranted fear responses can have devastating consequences, as in phobias and post-traumatic stress disorder. Thus, research on the probable role of neocortical areas in extinction of fear responses may be of great value in treating these disorders.

Less is known about other basic emotions such as anger or joy; it remains to be seen whether the amygdala is involved in these as well. Far less is known about “higher-order” emotions such as jealousy. And, as mentioned above, an account of emotional feelings awaits an adequate account of consciousness in general. However, LeDoux notes that working memory receives a greater number and variety of inputs in the presence of an emotional stimulus than otherwise, due to the variety of neural pathways involved; he speculates that this excess stimulation is what adds the affective charge to representations in working memory that we associate with felt emotions.

Meyering, Theo C. “Mind Matters: Physicalism and the Autonomy of the Person."

Theo C. Meyering, in “Mind Matters: Physicalism and the Autonomy of the Person,” takes yet a third approach to the issue of reduction. He states that “if (true, downward) mental causation implies nonreducibility [as Stoeger and Murphy argue] and physicalism implies the converse, it is hard to see how these two views could be compatible.” Meyering distinguishes three versions of reductionism: radical (industrial-strength) physicalism, ideal (regular-strength) physicalism, and mild or token physicalism. Radical physicalism asserts that all special sciences are reducible to physics in the sense that their laws can be deduced via bridge laws from those of physics. Ideal physicalism asserts that while it is practically impossible to reduce the special sciences, such reduction would be possible were there an ideally complete physics (Note: This distinction parallels Stoeger’s recognition that epistemological reducibility is relative to the meaning of “laws of nature”). Token physicalism is ontologically reductionist: there are no events that are not “token-identical” with some physical event or other (Note: See above, sec. 3.3). However, there are no identities between higher-level and lower-level types of events; consequently some events described by the special sciences have no physical explanation at all.

All of these reductionist positions are to be contrasted with compositional (milder than mild) physicalism, which asserts that some higher-level events are not even token-identical with physical events because the higher-level event (say, a crash in the stock market) is constituted by innumerable physical particulars in all sorts of states and interactions.

Meyering then surveys some of the existing arguments for the nonreducibility of the special sciences. One of the most important is the argument from multiple realizability. The claim is that economics, for example, is not reducible to physics because economic concepts (for example, monetary exchange) are “wildly multiply realizable” (for example, using coins, strings of wampum, signing a check). Thus, there can be no bridge laws and no reduction. Such an argument, however, only cuts against radical physicalism, not the weaker (and a priori more plausible) ideal physicalism.

A stronger argument for the indispensability of special-science explanations is based on the role of functional explanations. For example, the functional description of aspirin as an analgesic is in some instances a more useful explanation of its causal role (relieving a headache) than is its description at the chemical level.

Meyering’s own contribution focuses neither on multiple realizability of supervenient properties nor on multiple “fillers” of functional roles, but on “multiple supervenience.” In particular, a single subvenient state of affairs (for example, a cloud of free electrons permeating the metal of which a ladder is constructed) may realize a variety of supervenient dispositional properties (in this case, electrical conductivity, thermal conductivity, opacity). An explanation (say, of the cause of a deadly accident) requires reference to the dispositional property (electrical conductivity), not merely to the subvenient property. Meyering argues that it is this possibility of multiple supervenience, not multiple realizability, that gives arguments against reduction based on functional properties their real force. Downward causation, then, can be understood in terms of selective activation of one of several dispositional properties of a lower-level state, and thus can be assigned a stable place in our picture of how the world is organized without upsetting our conception of physics as constituting a closed and complete system of physical events.

Murphy, Nancey. “Supervenience and the Downward Efficacy of the Mental: A Nonreductive Physicalist Account of Human Action."

In “Supervenience and the Downward Efficacy of the Mental: A Nonreductive Physicalist Account of Human Action,” Nancey Murphy sets out to answer the question: If mental events are intrinsically related to (supervene on) neural events, how can it not be the case that the contents of mental events are ultimately governed by the laws of neurobiology? The main goal of her essay, then, is to explain why, in certain sorts of cases, complete causal reduction of the mental to the neurobiological fails. To do so, she first considers the concept of supervenience, offering a definition that runs counter to the “standard account.” The concept of supervenience was introduced in ethics to describe the relation between moral and nonmoral (descriptive) properties; the former are not identical with the latter, but one is a “good” person in virtue of possessing certain nonmoral properties such as generosity. Supervenient properties are multiply realizable; that is, (in the moral case) there are a variety of lifestyles each of which constitutes one a good person. Murphy criticizes typical attempts at formal definitions of “supervenience” for presuming that subvenient properties alone are sufficient to determine supervenient properties. She argues that many supervenient properties are codetermined by context - this move recognizes constitutive relationships not only at the subvenient level but also at the supervenient level itself or between the level in question and even higher levels of organization.

Murphy argues that it is this participation of entities in higher causal orders by virtue of their supervenient properties that accounts for the fact of downward causation. In Donald Campbell’s original example, it is the functional properties of the termites’ jaw structure - their relation to a higher-level causal order - that allows for environmental feedback, resulting in modifications at the (subvenient) genetic level. These modifications are a result of selection among lower-level causal processes (Note: While Murphy takes feedback and selection among lower-level causal processes to be the essential ingredient in downward causation, Arthur Peacocke, in his essay in this volume, assimilates it to “whole-part influence”).

Murphy then turns to the issue of mental causation: How do reasons get their grip on the causal transitions among neural states? The key to answering this question is the fact that neural networks are formed and reshaped (in part, at least) by feedback loops linking them with the environment; the environment selectively reinforces some neural connections but not others. Murphy points out that it is not only the physical environment that plays a downward causal role in configuring neural nets, but also the intellectual environment. It is the fact that mental states supervene, in Murphy’s sense of the term, on brain-states - that is, that they are co-constituted by both brain-states and their intellectual context - that makes the occurrence of the brain-states themselves subject to selective pressures from the intellectual environment.

Peacocke, Arthur. “The Sound of Sheer Silence: How Does God Communicate with Humanity?"

In “The Sound of Sheer Silence: How Does God Communicate with Humanity?” Arthur Peacocke advocates an emergentist-monist account of the natural world: its unity is seen in the fact of its hierarchical ordering such that each successive level is a whole constituted of parts from the level below. This world exhibits emergence in that the properties, concepts, and explanations relevant to higher levels are not logically reducible to those of lower levels. An emergentist-monist account of the human person fits consistently within this worldview. It is important to note that, unlike many philosophers, Peacocke does not identify mental properties with brain properties. Rather, he recognizes the mental or personal as an emergent level above the (purely) biological, and attributes mental properties to the unified whole that is the “human-brain-in-the-body-in-social-relations” (Note: Brothers and other authors in this volume would agree in emphasizing both the embodied and social character of mind and personhood).

More important than the logical irreducibility of levels in the hierarchy of complex systems is causal irreducibility. Peacocke discusses the concept of downward causation and a variety of related concepts of causal processes in complex systems, one of which is the distinction between structuring and triggering causes. A structuring cause is an ongoing state of a system (for example, the hardware conditions in a computer) that makes it possible for an event (the triggering cause; for example, striking a key) to have the effect that it does. Peacocke concludes that the term “whole-part influence” best captures what is common to all of these insights.

This essay elaborates on Peacocke’s earlier work on divine action, which regards the entire created universe as an interconnected system-of-systems, and adopts a panentheistic account of God’s relationship to the world such that God is understood as immanent within the whole of creation, yet the world is seen as “contained” within the divine. Thus, God’s action is to be understood on the analogy of whole-part influence.

The foregoing account of divine action lays necessary groundwork for an account of revelation: until we can postulate ways in which God can effect “instrumentally” particular events and patterns of events in the world, we cannot hope to understand how God’s intentions and purposes might be known “symbolically.” There are a variety of ways God is taken to be made known: general revelation through the order of nature; through the resources of religious traditions; through the “special revelations” that serve as the foundation of religious traditions; and in the religious experience of ordinary believers. While dualist anthropologies allowed for direct contact between God and the soul or spirit, Peacocke concludes that when the person is understood in an emergentist-monist way it is more consistent with what we know of God’s relation to the rest of creation to suppose that God’s communication is always mediated, even if only by affecting the neural networks that subserve human memories and other sorts of experience, including the feeling of God’s presence. Thus, all of these forms of revelation can be understood as the result of God acting through the mediation of the human and natural worlds.

Peters, Ted. “Resurrection of the Very Embodied Soul?"

In “Resurrection of the Very Embodied Soul?” Ted Peters argues that the Christian understanding of eternal salvation is not threatened by the rejection of substance dualism. In fact, the rejection of dualism by both the cognitive neurosciences and the Christian tradition represents an important area of consonance between theology and science - namely, that human reality is embodied selfhood. Peters notes that this issue deserves attention because some theorists, in both cognitive science and philosophy, claim two things: first, that the findings of the neurosciences regarding the brain’s influence on the mind demonstrate that the human soul cannot be thought to exist apart from a physical body; and, second, that this physicalist interpretation so undermines the doctrine of the immortal soul that the Christian view of eternal salvation becomes counter-scientific.

Peters points out that until recently theologians have not been forced to clarify the distinction between two overlapping ways of conceiving personal salvation: One, rooted primarily in the ancient Hebrew understanding, pictures the human person as entirely physical, as dying completely, and then undergoing a divinely effected resurrection. The other, a later view influenced by Greek metaphysics, pictures the human person as a composite of body and soul; when the body dies the soul survives independently until reunited with a body at the final resurrection. In both pictures, however, the resurrection of the body is decisive for salvation. Now, however, to the extent that the dualistic vocabulary and conceptuality inherited by Christian theology from the Platonic tradition begin to look too much like Cartesian substance dualism, theology is in error.

In approaching the constructive question of how best to relate cognitive theory and theology, Peters first examines and rejects two “blind alleys”: the notion of the “humanizing brain” developed by James Ashbrook and Carol Albright, and the artificial intelligence model of the human soul as disembodied information processing developed by Frank Tipler. In contrast to Tipler’s view, Peters notes that belief in the resurrection, for Christian theology, does not depend on any natural process identifiable by science or philosophy, but on the witnessed resurrection of Jesus Christ at the first Easter. The Christian promise points toward an eschatological transformation - a new creation - to be wrought by God. Peters follows Wolfhart Pannenberg in connecting the resurrection to God’s eschatological act wherein time is taken up into eternity, and wherein God provides for continuing personal identity even when our bodies disintegrate.

Stoeger, William R., S.J. “The Mind-Brain Problem, the Laws of Nature, and Constitutive Relationships."

In “The Mind-Brain Problem, the Laws of Nature, and Constitutive Relationships,” William R. Stoeger, S.J. argues that a correct understanding of the meaning of “laws of nature” is essential for clarifying issues associated with the mind-brain problem. He distinguishes between “the laws of nature” as the regularities, relationships, and processes that obtain in nature, and “our laws of nature” as our provisional, incomplete, and imperfect models of these regularities. In some areas of science our models give fairly adequate accounts of the actual regularities and relationships; in others adequate models are still lacking. Modeling mental processes and their relations to brain processes seems especially problematic due to the subjective and holistic character of mental phenomena; in fact, it is not yet clear what would count as an adequate model for explaining the mental in terms of brain processes.

The sense of “laws of nature” that one intends has a bearing on the meaning of essential terms in the philosophy of mind, such as “emergence” and “supervenience,” and on an even deeper issue underlying the mind-brain problem: the very meaning of “physical” or “material,” versus “nonphysical” or “immaterial.” “Matter” is not a scientific term and the meaning of “material” is historically contingent. In common usage, Stoeger takes it to refer to that which we can model, describe, and understand using the resources of the natural sciences. Correlatively, the immaterial is that which transcends the regularities known by science. Thus, the identification of the mental with the immaterial does not mean that the mental could not be a property of neurologically highly organized matter.

Stoeger draws attention to the “constitutive relationships” that account for the hierarchical structure of reality, such that higher levels are composed by complex ordering of lower-level entities. The constitutive relationships of a complex whole are all of those connections, relationships, and interactions that either incorporate its lower-level components into that more complex whole, relate that whole to higher-level unities in such a way as to contribute essentially to its character, or maintain its connection to the Ground of its existence. Stoeger’s insight is that insofar as there are constitutive relationships of the sort that relate an entity to higher-level systems, those entities are not reducible either causally or mereologically (that is, as mere aggregates are reducible to their parts). Thus, Stoeger concludes that mental states cannot be reduced to brain-states: there are constitutive relationships not just among the brain-states that realize them, but also relating the mental states they determine with one another and with historical and environmental conditions. These external constitutive relations play a role in determining the sequences and clustering of mental states.

Stoeger ends by reflecting on the Aristotelian and Thomist accounts of form and soul as that which makes an entity to be what it is. He notes that a scientifically accessible correlate of these notions is his own account of constitutive relationships.

Watts, Fraser. “Cognitive Neuroscience and Religious Consciousness.”

In “Cognitive Neuroscience and Religious Consciousness,” Fraser Watts notes that when divine action is considered in relation to the physical sciences the rationality of Christian faith may be at stake, but when God’s action is considered in relation to the cognitive neurosciences the credibility of daily religious life and practice may be at stake as well: How do humans relate to God as persons who are not mere minds, souls, spirits? Two major issues raised are the validity of revelation and the nature and possibility of religious experience.

There are both scientific and theological reasons for attending to the brain when attempting to understand religious experience. However, Watts resists the question of whether religious experience is caused by the brain or by God. Theological and neurological explanations are complementary; one is free to privilege the level of explanation that is most relevant in a particular context.

Watts considers two developments in attempts to understand the involvement of neural processes in religious experience. The first is based on claims that temporal lobe epilepsy (TLE) patients have more religious preoccupations than others; this has given rise to the further claim that religious experience should be linked with the neural basis of TLE. However, Watts disputes both the data and this interpretation. A second attempt to link religious experience and the cognitive neurosciences is that of Eugene d’Aquili and colleagues. Watts finds this research of more interest in that it involves a somewhat more sophisticated theory of religious experience and ties it to a theory of more general cognitive functioning - d’Aquili’s theory of “cognitive operators.”

Watts’s own thesis is that a truly adequate cognitive theory of religious experience would benefit from attention to analogies between religious and emotional experience. The most valuable cognitive theories of emotion are multi-level, for example, distinguishing the sensory-motor aspects from the interpretation of the experience, and further distinguishing between intuitive perceptions of meaning and the ability to describe the experience propositionally. Watts speculates that this latter distinction, in particular, will shed light on the phenomena of religious life.

An attempt to understand the role of God in religious experience will be hampered, according to Watts, by too narrow a focus on “divine action.” Any analogy with human “action” needs to be balanced with other metaphors that keep before our mind the fact that God’s action is constant rather than episodic, interactive rather than controlling. He suggests the concept of “resonance” or “tuning” as an image for understanding the divine-human interaction. Conscience might then be understood in terms of resonance with the will of God.

Wildman, Wesley J. and Leslie A. Brothers. “A Neuropsychological-Semiotic Model of Religious Experiences.”

In “A Neuropsychological-Semiotic Model of Religious Experiences,” Wesley J. Wildman and Leslie A. Brothers observe that the neurosciences have largely succeeded, through their analyses of brain structure and function, in portraying that which is distinctively human as continuous with the laws and forms of complexity observed throughout the natural world. This generally accepted conclusion about human beings reconfigures the whole theory of religious experience by proposing explanations for such experiences that are independent of the assumption that they are experiences of anything properly called a religious object. This reductionistic challenge is not different in philosophical terms from earlier challenges, but it does invite theories of religious experience that attend to the neurosciences.

As Fraser Watts points out in his essay, religious experience is notoriously difficult to define and delimit. Wildman and Brothers choose the term “experiences of ultimacy” both to focus their study on a subset of the broader category of religious experience, and also to avoid prejudicing their treatment in favor of theistic religions that focus on (putative) experiences of God.

The goal of this essay, then, is to present a richly textured interpretation of experiences of ultimacy. The authors develop this interpretation in two phases. First, they describe these experiences as objectively as possible, combining the descriptive precision of phenomenology, informed by the neurosciences, with a number of more obviously perspectival insights from psychology, sociology, theology, and ethics. Their hope is that the resulting taxonomy will be compelling enough to support constructive efforts in theology and philosophy that depend on an interpretation of religious experience - including those in this volume that attempt to speak of divine action in relation to human consciousness (Note: See especially the essays by Watts, Peacocke, and Ellis).

The authors make two constructive ventures on the basis of this description. In the first, inspired by existing social processes used to identify authentic religious experiences, they describe a procedure whereby genuine experiences of ultimacy can be distinguished from mere claims to such experiences. They recognize a variety of “markers” that together point toward authenticity: subjects’ descriptions (considered within their socio-linguistic contexts), phenomenological characteristics, judgments by experts in discernment or psychology, conformity with theological criteria, and ethical transformation. Judgments of this sort bring such experiences into the domain of public, scientific discussion as much as they can be, and the authors speculate that this will encourage more mainstream discussion of such experiences by scientists and others.

The second constructive venture is the authors’ attempt to evaluate claims made concerning the cause and value of experiences of ultimacy. The modeling procedure they adopt makes use of semiotic theory to plot the “traces” of causal interactions in the form of sign transformations, though not the causal interactions themselves. In the language of semiotic theory, these causal traces take the form of “richly intense sign transformations.” This proposal keeps ontological presuppositions to a minimum by focusing on causal traces rather than the causes themselves. Nevertheless, the authors contend, it does offer a religiously or spiritually positive way of interpreting authentic ultimacy experiences. At the end of the essay the authors offer a suggestion about the nature of the ultimate reality that might leave such causal traces.

Quantum Cosmology and the Laws of Nature

This collection of seventeen research papers is the result of the first of five international research conferences co-sponsored by the Vatican Observatory in Rome and the Center for Theology and the Natural Sciences.  The papers explore the implications of quantum cosmology and the status of the laws of nature for theological and philosophical issues regarding God’s action in the world.  After an introduction to the Big Bang, inflation and quantum cosmology, these essays respond to a series of questions including:  What methodological issues are raised by the interaction between science and theology?  What is the status of the laws of nature and how does this relate to divine action?

Alston, W. “Divine Action, Human Freedom, and the Laws of Nature.”

In his paper Bill Alston studies the philosophical aspects of the problem of divine action in relation to both the laws of nature and the meaning of human freedom.  To set the stage, he begins by stating two presuppositions which characterize his general approach:  first, he takes “seriously and realistically” the idea of God as a personal agent, and second, God’s activity extends beyond creation and conservation to include special acts performed by God in light of knowledge of the world and to achieve a purpose.

By “seriously” Alston means that, at least in some cases, we understand the statement “God acts” literally and not just figuratively.  By “realistically” Alston means that religious discourse, along with scientific discourse, aims at “an accurate portrayal of an independently existing reality” with objective characteristics.  God’s actions result in outcomes, at least on some occasions, which are different from the outcomes that would have been if only natural factors had been at work.  God’s acts include not only revelation but also such “super-spectacular miracles” as the parting of the Sea of Reeds and the resurrection, as well as daily divine-human interaction, in prayer for example.  Thus God acts as a personal agent, “possessed of intellect and will.”

With this as background, Alston proceeds to the main burden of the paper, relating his convictions to the topics of natural law and human freedom.  He defines “determinism” as the doctrine that “every happening is uniquely determined to be just what it is by natural causes within the universe.”  Although determinism has a strong hold on contemporary culture, Alston takes quantum mechanics to provide a “definitive refutation” of it.  Hence because of quantum indeterminism, God can act without violating physical law.  Moreover, acts such as these which begin on the sub-atomic level can lead to differences in macroscopic states.  It is thus possible that God designed the universe in this way to allow for divine action. 

Still, Alston’s main point does not depend on the indeterminacy of quantum mechanics.  According to Alston, even deterministic laws only provide sufficient conditions for predicting the behavior of closed systems.  But we never have reason to believe a system is actually closed, i.e., that we know all the operative forces at work.  Any system can be open to outside influences, including the acts of God.  Hence, in this more general sense, God’s acts do not violate natural law regardless of whether these laws are probabilistic or deterministic.

Next Alston turns to the problem of human freedom.  He takes a libertarian view of free action in which “nothing other than my choice itself uniquely determines me to choose one way rather than another.”  Does human freedom pose a problem for conceptualizing God’s action?  Alston first argues that, with the exception of acts by free creatures, all events which we attribute to God’s specific acts could in fact be the unfolding of what God designed initially (whether “initially” means temporally first, as in a universe with a first moment, or first in order of priority, as in a universe with an infinite past).  But if we assume that humans, at least, have libertarian free choice, the strategy of initial design cannot work - unless God can be said to possess “middle knowledge,” defined as knowledge of what (actual and possible) free agents would choose to do in any situation in which they found themselves.  Alston then argues that middle knowledge is impossible:  God cannot know what a free agent would decide in situations which the agent never actually encounters.  Thus if there is libertarian free agency and if middle knowledge is impossible, we must conclude that those of God’s acts that appear to take place in time in response to the choices of free agents do indeed take place in precisely that way, and not merely by means of God’s initial design.

Alston then turns to physical cosmology and its possible bearing on divine action.  The choice between cosmological models such as the Big Bang, the oscillating universe and inflation makes little difference to Alston’s position in general, though the status of time in these models might be significant.  To pursue this question, Alston distinguishes between a block universe view of time and the process view of time.  Does human freedom require the process view, or is the block view sufficient?  Alston argues that, although the latter denies the passage of time, it does not imply that all events exist at all times, but only that they each exist at their own time.  Thus the block view does not undercut human freedom, since all that is required for an act to be free is that it not be determined by anything prior to it, and this is possible even on the block view.  Similarly God knows each event in its own time. 

Finally, what about God’s existence:  is it temporal or atemporal?  Alston argues that even if God is atemporal, God’s acts can produce temporally ordered consequences in the world.  Moreover, relativity theory and quantum cosmology suggest that time should not be viewed as a “metaphysically necessary form of every kind of existence, including the divine existence.”  Thus physical cosmology, and with it the status of time, has little bearing on how we should best think of divine action in the world and its relation to the laws of nature and human freedom.

Davies, Paul C. W. “The Intelligibility of Nature.”

Paul Davies begins with the claim that our ability to understand nature through the scientific method is a fact which demands an explanation.  He proposes that our mind and the cosmos are linked, that consciousness “is a fundamental and integral part of the outworking of the laws of nature.”  In particular, the laws of nature which make possible the emergence of life must be of a form such that at least some species which arise according to them have the ability to discover them.  Thus science can explain the rise of species which can engage in science without appealing to a God who either intervenes in or guides nature.  Still the ultimate explanation of the origin of the laws lies outside the scope of science and should be pursued by metaphysics and theology.  Whether this leads to God “is for others to decide.”

Davies begins with the sociological debate over the origin of science.  Although science is clearly a product of Western European culture, he sees no simple relationship between Christian theology and the emergence of science.  Whatever its origins, though, the validity of science is transcultural and warrants a realist interpretation. 

What is most significant about nature is that the universe is “. . . poised, interestingly, between the twin extremes of boring over-regimented uniformity and random chaos.”  Accordingly it achieves an evolution of novel structures through self-organizing complexity.  “The laws are therefore doubly special.  They encourage physical systems to self-organize to the point where mind emerges from matter, and they are of a form which is apprehendable by the very minds which these laws have enabled nature to produce.”  Does our ability to “crack the cosmic code” lead to an argument for God?  No; Davies prefers an evolutionary interpretation of mind as emergent within the material process of self-organization.  The emergence of mind with its ability to pursue science is not just a “biological accident.”  Instead it is inevitable because of the laws of physics and the initial conditions.  Hence life should emerge elsewhere in the universe - a claim which Davies sees as testable.

If mind emerged because of the laws of nature, is it surprising that mind is capable of discovering these laws?  Davies first stresses that evolution is a blend of chance and necessity; it is neither teleological nor is it a “cosmic anarchy.”  The laws “facilitate the evolution of the universe in a purposelike fashion.”  Still the actual laws of the universe are remarkable.  They not only encourage the evolution of life and consciousness but they support the evolution of organisms with the ability for theoretical knowledge.  Here the ability to do mathematics is particularly surprising.  Davies connects mathematics with the physical structure of the world through “computability” and thus to physics, since computing devices are physical.  In this way mathematics and nature are intertwined.  Moreover, mathematics is capable of describing the laws of physics which govern the devices which compute them.  The intimate relation of mind and cosmos need not lead to a theological explanation, but Davies is equally critical of a many-universe explanation, opting instead for a form of design argument.  This, however, takes us to the limits of science.  “The question of the nature of the laws themselves lies outside the scope of the scientific enterprise . . . (and) belongs to the subject of metaphysics . . .”

Drees, Willem B. “A Case Against Temporal Critical Realism? Consequences of Quantum Cosmology for Theology.”

It is Wim Drees’ argument that, unlike other topics in science, Big Bang and quantum cosmology are equally compatible with a timeless, static view of the universe and a temporal, dynamic view.  Critical realists must face this ambiguity squarely.  If they want to make the ontological claim that nature is temporal they must relinquish the epistemological claim for the hierarchical unity of the sciences by leaving out relativity and cosmology (due to this ambiguity).  Yet this is problematic for critical realists, for whom both the ontological and the hierarchical claims are pivotal.  Drees himself sees the timeless character of cosmology as more compatible with a Platonizing tendency in theology, in which God is timelessly related to the world rather than temporally related via specific divine acts in the world.

Drees begins by defining “temporal critical realism” as the combination of critical realism with an evolutionary view of the world and a temporal understanding of God.  The Big Bang appears at first to offer just such a highly dynamic worldview consistent with temporal critical realism.  However, its underlying theories (special relativity and general relativity) undercut this dynamic perspective, challenging universal simultaneity and re-interpreting time as an internal, rather than an external, parameter.  Quantum cosmology, and its underlying theory, quantum gravity, further challenge the dynamic view of nature.  Although they overcome the problem of the singularity at t=0, quantum cosmology and quantum gravity offer an even less temporal view of nature than does relativity, since they move from a four-dimensional spacetime perspective into a three-dimensional, spatial perspective in which time plays a minimal role at best.

Critical realists such as Barbour, Peacocke and Polkinghorne have been careful to avoid theological speculations about t=0, recognizing that its status is controversial and subject to the shift in theories.  However, they have not been equally attentive to the challenge to temporality per se by special relativity and general relativity, let alone by quantum cosmology and quantum gravity.  Moreover, Drees claims the latter ought not be dismissed merely because they are speculative.  Such a strategy to insulate temporal critical realism is ad hoc, since temporal critical realists are already committed epistemologically to a hierarchical unity of the sciences, and thus changes - even if only potential ones - at the fundamental level of the hierarchy carry enormous epistemic leverage.  For its part, the timeless character of physics and cosmology leads us to view God in more Platonic terms.  Drees explores this option in some detail, including the problem of divine action, the arguments for viewing God as an explanation of the universe, and the constructivist view of science as myth.  He concludes by suggesting that axiology may be a more apt focus for theology than cosmology, and this in turn would lessen the impact science has on theology. 

Ellis, George F. R. and William R. Stoeger. “Introduction to General Relativity and Cosmology.”

George Ellis and Bill Stoeger present the standard model of our universe, the Friedmann-Lemaître-Robertson-Walker (FLRW) or Big Bang model, including its foundations in special relativity and general relativity, and its observational features.  They also discuss inflationary models which solve many of the problems of the standard model.  Attention is given to the assumptions which go into cosmology, some of which are technical, others more philosophical in nature.  In a key passage, Ellis and Stoeger focus on the initial singularity, t=0, “the boundary to the universe where the laws of physics break down . . .”  Is t=0 a “creation” event, or is it a transitory feature of the standard cosmological model which will disappear when a quantum theory of gravity is achieved?  They also touch on questions of the origin of structure, complexity and life in the universe.
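As a reference point for the standard model the essay surveys, the FLRW line element and the Friedmann equation may be written as follows (these are standard textbook results, summarized here rather than quoted from the essay):

```latex
% FLRW line element: a(t) is the scale factor,
% k = -1, 0, +1 the spatial curvature constant
ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1-kr^2}
     + r^2\left(d\theta^2 + \sin^2\theta\,d\varphi^2\right)\right]

% Friedmann equation governing the expansion rate:
% \rho is the mass-energy density, \Lambda the cosmological constant
\left(\frac{\dot{a}}{a}\right)^2 = \frac{8\pi G}{3}\,\rho
     - \frac{kc^2}{a^2} + \frac{\Lambda c^2}{3}
```

The initial singularity t=0 discussed by Ellis and Stoeger corresponds to the limit a(t) → 0, where density and curvature diverge and the classical equations break down.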

Ellis, George F. R. “The Theology of the Anthropic Principle.”

George Ellis’s paper combines reflections on the Anthropic Principle with the theology of William Temple.  He calls this a “Christian Anthropic Principle,” which seeks to account for the particular character of the universe in terms of the design of God who intends the evolution of creatures endowed with free will and the ability to worship the Creator.  Ellis thereby hopes to provide a synthesis of science and theology which will take into account recent work in cosmology and provide a better understanding of how these two fields might be related. 

Ellis begins by distinguishing between the patterns of understanding in science and in theology.  Still, both religion and science can be relevant when we consider the nature of the universe and its ultimate cause.  Five approaches to such a cause are available:  random chance, which is unsatisfactory unless one accepts reductionism; high probability as in chaotic cosmology, which is hard to quantify; necessity (only one kind of physics is consistent with the universe), but since the foundations of the sciences are debatable, an argument from the unity of the sciences is far from available; universality (all that is possible happens), but such Many Worlds arguments are controversial and probably untestable; and design of the laws of physics and the choice of boundary conditions.  Design requires a transcendent Designer.

The Anthropic Principle (AP) speaks to two questions:  why do we exist at this time and place (Weak AP), and why does the universe permit evolution and our existence at any time or place (Strong AP)?  The Strong AP can be linked to quantum mechanics through the role of the “observer,” but this is controversial and it leaves unanswered the question of why quantum mechanics is necessary.  Thus Ellis looks for ultimate causes beyond the confines of science.  Religion can provide just such an approach, since it is capable of dealing with ultimate causation without being incompatible with science.

Ellis provides a Christian setting for the design argument by describing the “essential core” of New Testament teaching based on Temple’s theology and his own Quaker perspective.  God is understood as creator and sustainer, embodying justice and holiness.  God is personal, revealed most perfectly in Jesus, and active in the world today.  The Kingdom is characterized by generosity, a forgiving spirit and loving sacrifice.  The universe arose as “a voluntary choice on the part of the Creator, made because it is the only way to attain the goal of eliciting a free response of love and sacrifice from free individuals.”

This interpretation of divine action guides Ellis in his proposal of a Christian Anthropic Principle (CAP), combining design with divine omnipotence and transcendence.  The nature, meaning and limitations of creation are determined by the fundamental aim of God’s loving action, that of making possible in our universe the reality of sacrificial response.  God’s design, working through the laws of physics and chemistry, allows for the evolution of such modes of life in many places in the universe.  “From this viewpoint, fine-tuning is no longer regarded as evidence for a Designer, but rather is seen as a consequence of the complexity of aim of a Designer whose existence we are assuming . . .”

This entails five implications for the creation process.  (1) The universe must be orderly so that free will can function; God attains this goal through creating and sustaining the known physical laws which allow for the evolution of creatures with consciousness and free will.  (2) God has also given up the power to intervene directly in nature.  (3) The existence of free will makes pain and evil inevitable and requires that God’s providence be impartial.  (4) God must remain hidden from the world, allowing for “epistemic distance”; God achieves both an impartial providence and epistemic distance through the impartiality of the laws of nature.  (5) Yet revelation must be possible, so that God can disclose to the faithful an ethical basis for life.  None of this contradicts the standard scientific understanding of the universe, but adds an “extra layer of explanation” for the universe and its laws that is basically metaphysical.  Finally, Ellis turns to quantum indeterminacy to provide a basis for divine inspiration.  Other forms of intervention or action are thus excluded, including amplification by chaotic systems; these would “greatly exacerbate the problem of evil.”

While Ellis has argued that it is highly probable that life exists throughout the universe, he claims that the number of individuals in the universe must be finite if God is to be able to exercise care for each.  If the universe were infinite instead, it would not be possible for God to have the requisite knowledge of the infinite number of individuals and their infinite number of relations to one another.  Thus the SETI project is of “tremendous religious significance” in testing the hypothesis of a caring creator.

CAP leads us to the following questions:  is our physical universe the only way to achieve the divine intention?  how, more precisely, is the ultimate purpose imbedded in, and manifested by, the laws of physics?  what proof can be given for CAP?  To the latter question Ellis argues that the evidence for CAP is stronger than evidence for inflation or the quantum creation of the universe. 

Grib, Andrej A. “Quantum Cosmology, the Role of the Observer, Quantum Logic.”

The central argument in Andrej Grib’s paper is that temporal existence, i.e., movement through time, allows the human mind to obtain information about the universe governed by quantum physics.  The universe is characterized by non-standard logic (non-Boolean logic), whereas we interpret the universe in terms of ordinary (Boolean) logic.  Thus we must experience events successively as past, present and future in order to gain knowledge of the objective but incompatible (non-commuting) character of nature.  Using this basic argument, Grib speculates about World Consciousness and suggests how quantum cosmology can provide plausibility arguments for Orthodox Christian theology.

Grib begins by describing several key interpretations of quantum physics.  According to Niels Bohr, complementary quantum phenomena lack independent reality since the measurement apparatus and the quantum object form an indivisible whole.  John von Neumann interpreted the collapse of the wave packet during measurement as a process without a physical cause.  Instead it is due to the consciousness of the observer, conceived as an abstract self.  For Fritz London and Edmond Bauer the key feature of consciousness is introspection, not abstract ego, giving the problem a more objective character.  Eugene Wigner developed this argument further by proposing that any living system could have the capacity to collapse the wave function.

The next step in Grib’s account was taken by John Bell whose famous theorem forces us to choose between idealism (in which quantum objects with non-commuting properties only exist when observed) and realism (in which quantum objects with non-commuting properties have a qualified existence independent of observation).  Still the latter is far from ‘naive realism’ for, even admitting the qualified existence of quantum objects (objects characterized by non-Boolean logic), the existence of macroscopic objects in nature (objects characterized by Boolean logic) is the direct result of observation.

Grib now turns to the problem of quantum cosmology where, as with any quantum theory, one faces the fundamental problem of measurement:  who is the observer?  Clearly to speak here about a “self-originating universe” and an “objectively contingent universe” is misleading, since the existence of the universe per se now requires an “external observer.”  Grib proposes that we opt instead for the qualified existence afforded by quantum logic and apply this to cosmology.  Thus, given that quantum logic involves conjunctions and disjunctions which do not satisfy the distributivity law, in order for our (Boolean) minds to grasp the (non-Boolean) quantum cosmos we must experience the world through temporal sequence.  Each event in the sequence is a different Boolean substructure (which is accessible to us) of the overall non-Boolean universe (which is not accessible).  Thus the temporality of the world arises out of our mental processes, which integrate our Boolean experiences with physical, non-Boolean, structures.
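The failure of distributivity that Grib invokes can be illustrated with a standard spin-1/2 example (a textbook illustration, not drawn from Grib’s essay).  Let a, b, and c be the propositions “spin up along z,” “spin up along x,” and “spin down along x”:

```latex
% b \lor c spans the whole Hilbert space (the trivially true
% proposition), so the left-hand side reduces to a:
a \wedge (b \vee c) = a \wedge \mathbf{1} = a

% but a shares no common eigenstate with b or with c, so each
% conjunct on the right is the absurd proposition 0:
(a \wedge b) \vee (a \wedge c) = 0 \vee 0 = 0

% hence the Boolean distributive law fails:
a \wedge (b \vee c) \neq (a \wedge b) \vee (a \wedge c)
```

Each choice of measurement context (z-spin or x-spin) picks out a Boolean sublattice of this non-Boolean structure, which is the sense in which, on Grib’s account, our minds access the quantum world one Boolean substructure at a time.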

Happel, Stephen. “Metaphors and Time Asymmetry: Cosmologies in Physics and Christian Meanings.”

For Stephen Happel, the methodological bridge between theology and science comes through language - in particular, through metaphor.  Happel argues that scientists, as well as theologians, use ordinary speech, constructed of metaphors, to originate, process and communicate their insights.  By studying how scientists employ metaphors, theologians can discover an important role for cosmology in religious discourse.  Similarly scientists can gain from understanding the hermeneutical framework they employ in their discourse. 

Happel begins by studying the general reasons for focusing on metaphor in both fields, namely the conviction that science, like theology, is a hermeneutical venture.  Next he focuses on the metaphors that adorn cosmology and the ways they narrate the story of the universe.  Happel then argues that there is a basic relationship between the particular metaphors chosen by cosmologists and the actual temporal asymmetry of the universe.  To support his claim, Happel critically evaluates four competing theories about the nature of metaphor.  In the process he argues that in both science and religion metaphors communicate more than feelings:  they indicate a state of affairs.  Because of this, Happel is critical of Paul Ricoeur’s paradoxical “is/is not” view of metaphor and of theologians who appropriate it, preferring instead the moderate realism advanced by Mary Hesse and Bernard Lonergan. 

Next Happel attempts to understand why some scientists hold an atemporal perspective on cosmology while others see the universe in terms of a temporal narrative such as the Big Bang.  The difference is due less to physics or mathematics, Happel argues, than to the presuppositions one holds about metaphor.  Atemporalists tend to have a Ricoeurian view of metaphor as paradoxical, whereas temporalists tend toward a moderately realist theory of metaphor.  Lonergan’s theory and its relation to emergent probability as the explanation of temporality provide just the needed ontological reference for the metaphors of time asymmetry.

Happel acknowledges that deconstructionists, drawing heavily on the writings of Jacques Derrida, propose an alternate theory in which the surplus of metaphors is sheer play without directionality or finality.  He contrasts the teleological approach of scientists like John Barrow and Paul Davies, who treat cosmological metaphors as narrative, with the non-teleological approach of Stephen Hawking, who seems closer to Derrida and Ricoeur.  Happel sees these approaches as parallel to the theological typologies of prophecy (narratives which stress the ethical imperative) and mystical communion (non-narrative descriptions of the atemporal identity with the divine).  Both narrative and non-narrative languages are integrally intertwined in the doctrine of the Trinity.  As a result, theologians might investigate how narrative and non-narrative interpretations of spacetime might be “equiprimordial, requiring a co-implicating dialectic.”  Scientists, in turn, are encouraged to develop an approach which overcomes the symmetrical-asymmetrical arguments concerning temporality.  Ultimately, whether the universe is seen in mystical or prophetic terms, God’s involvement is to be trusted, and God’s gift of time leads us to combine narrative and non-narrative language.

Heller, Michael. “On Theological Interpretations of Physical Creation Theories.”

How should scientific theories be interpreted philosophically and theologically?  According to Michael Heller there are three possibilities:  (a) pseudo-interpretations which contradict physical theory; (b) consistent interpretations which are neutral with respect to the theory’s mathematical structure; (c) exegetical interpretations which restate the mathematical structure in everyday language and are in strict agreement with the theory.  Heller is highly critical of (a).  Moreover, since physical theories as such “say nothing about religious matters” (c) is ruled out.  Therefore, to be valid, an interpretation must be of type (b) and should be taken “seriously but not literally.”  Heller adds that science may also serve as a source of insight and metaphor for theology, or provide a suitable context for theological reformulation.  Still, theology should interact with the overall scientific image of the world and not with a particular theory.  He also suggests that physical theories, with the addition of specific premises, can provide grounds for theological conclusions, and that theological and scientific theories might have mutually dependent or even equivalent implications.  Finally, the very existence of successful physical theories poses an important philosophical problem.

Isham, C.J. and J.C. Polkinghorne. “The Debate over the Block Universe.”

Is ours a world of timeless being (the “block universe”) or of flowing time and true becoming?  The current debate over the block universe, represented in the essay by Chris Isham and John Polkinghorne, brings together scientific, philosophical and theological arguments in a tightly-knit, interwoven pattern. 

Proponents of the block universe appeal to special and general relativity to support a timeless view in which all spacetime events have equal ontological status.  In special relativity, the finite speed of light, the light cone structure, and the downfall of universal simultaneity, which undermines the physical status of “flowing time,” result in a heightened tendency to ontologize spacetime.  The additional arbitrariness in the choice of time coordinates in general relativity makes flowing time physically meaningless.  Thus no fundamental meaning can be ascribed to the “present” as the moving barrier with the kind of unique and universal significance needed to unequivocally distinguish “past” from “future.”  Instead the flowing present is a mental construct, and four-dimensional spacetime is an “eternally existing” structure.  God may know the temporality of events as experienced subjectively by creatures, but God cannot act temporally, since flowing time has no fundamental meaning in nature.  Theologians must accept the Boethian and even gnostic implications of the block universe. 

Opponents of the block universe begin by distinguishing between kinematics and dynamics.  Special relativity imposes only kinematic constraints on the structure of spacetime.  The dynamics of quantum physics and chaos theory encourages a view of nature as open and temporal, thus allowing for both human and divine agency.  The problem of the lack of universal simultaneity is lessened since simultaneity is an a posteriori construct.  Philosophically disposed to critical realism, opponents are wary of the incipient reductionism of the block view.  They resist the Boethian implications of relativity, and argue instead that divine omnipresence must be redefined in terms of a special frame of reference, perhaps one provided by the cosmic background radiation.  God’s knowledge of spacetime events in terms of this frame of reference will be constrained by both the world’s causal sequence and the distinction between past and future.  Similarly God’s actions will be consistent with relativity theory. 

In the end, is the debate merely philosophical or could it actually have scientific consequences?  Proponents of the block universe challenge their opponents to decide between a mere reinterpretation of the existing theories of physics and the much stronger claim that these theories should be changed.  If forthcoming, such changes ought to be testable empirically and would constitute a major achievement in the debate over time.  Proponents also point to additional complexities in the debate, such as the problem of giving a realist interpretation of quantum physics.  These problems become even more acute when dealing with quantum cosmology, making an atemporal interpretation almost inevitable.  They do not object to positing that God experiences the world through a special frame of reference or that God is aware of the experience of temporality of living creatures.  However they find it hard to understand how God’s action on the world can respect the causal constraints on such action entailed by special relativity.

Lucas, John R. “The Temporality of God.”

John Lucas defends the temporality of God against both traditional theism and the difficulties raised by relativity and quantum cosmology.  For Lucas, the temporality of God is essential if we are to claim that God is personal and therefore conscious of the passage of time.  Against traditional orthodoxy and deism, Lucas cites both Barth and process theology in support of divine temporality.  Moreover, the Biblical witness points unalterably to a God who acts in specific ways.  Though God may be beyond time in the sense that time was created by God, Lucas insists that God is not timeless. 

But how can God experience the world in time if physics undercuts the temporality of the world?  Lucas argues that while special relativity on its own provides no absolute temporal reference frame, it is consistent with the possibility that one exists, such as the cosmic background radiation.  This in turn might provide a reference-frame by which God has temporal knowledge of the world.

What about the creation of the universe by God?  Lucas points out that the proposals by Hartle and Hawking and by Vilenkin explain the origin of the universe not as a result of conditions existing before the Big Bang, but as an instantiation of important rational desiderata.  This reflects a parallel move in the philosophy of science in which the kind of explanation sought shifts from a deductive-nomological explanation (in which temporally antecedent conditions evolve through covering laws) to a top-down explanation (i.e., the instantiation of desiderata).  Lucas concludes that time can be thought of modally as a transition from possibility to actuality. 

Murphy, Nancey. “Evidence of Design in the Fine-Tuning of the Universe.”

The purpose of Nancey Murphy’s paper is to assess the possibility for using the ‘fine-tuning’ of the laws of nature in constructing a new design argument.  Her paper is closely linked with George Ellis’ essay in the same volume (George F.R. Ellis, “The Theology of the Anthropic Principle,” 363-400), but with an important difference:  she treats the thesis advanced by Ellis as an argument for the existence of God.  Murphy first shows how recent developments in philosophy overcome the traditional Humean objections to design arguments.  Carl Hempel’s theory that science employs hypothetico-deductive reasoning undercut Hume’s assumption that knowledge proceeds by induction.  Holist accounts of the structure and justification of knowledge, offered by both W. V. O. Quine and Imre Lakatos, show that a hypothesis is never tested on its own, but rather in conjunction with a network of beliefs into which it fits.  Lakatos provides a detailed theory about the structure of this network (or “research program”), in which a core theory is surrounded by a belt of auxiliary hypotheses which, in turn, are both supported and challenged by the data.  Lakatos’s structure includes theories of instrumentation for relating the data to the auxiliary hypotheses and a positive heuristic for the expansion of the auxiliary hypotheses into new domains of data. 

To avoid circularity and relativism, Lakatos provides external criteria for choosing among competing research programs.  A progressive program is one in which an additional auxiliary hypothesis must account for anomalies, predict “novel facts,” and occasionally see them corroborated.  Moreover, such new hypotheses must fit coherently into the existing program.  “. . . The only reasonable way to assess the claim that fine-tuning provides evidence for divine creation is to consider the design hypothesis not as a claim standing alone . . . but rather as an integral part of . . . a theological research program” which can then be assessed as progressive or not.  But what should constitute the “data” for theology?  Murphy recognizes that this is a central issue for her proposal.  Her sources typically include both Scripture (incorporated through an appropriate doctrine of revelation; i.e., a “theory of instrumentation”) and experience.  Murphy suggests that the church’s practice of communal discernment could minimize the subjectivity of religious experience.

Murphy then reconstructs Ellis’ paper in terms of the Lakatosian structure.  Her aim is to show that theology can be regarded as a science, that cosmological fine-tuning can serve as an auxiliary hypothesis in a theological research program, and that theological theories can be compared directly with scientific theories. 

In what way is the Temple-Ellis program confirmed?  According to Lakatosian standards, it must produce novel facts, and Murphy claims that it does.  Ellis added an auxiliary hypothesis to Temple’s theology:  in order for there to be the free will required by Temple, God’s plan for the world had to include that the world be law-governed as well as fine-tuned.  The facts supporting the law-like character of the world were irrelevant to Temple’s theology, but in the Temple-Ellis program these facts now take on theoretical meaning.  The key here is that Ellis did not set out to explain the facts supporting the law-like character of nature but only the presence of free will in nature.  Thus, Murphy concludes, these facts are “weakly novel” since they were already known but irrelevant to Temple’s theology before Ellis modified it.  Finally Murphy suggests ways in which the Temple-Ellis program might be expanded to include the theological problems of theodicy, moral evil and natural evil, and the scientific discussion of thermodynamics, the arrow of time, and perhaps even consciousness.  These could make the program even more progressive.

Peters, Ted. “The Trinity In and Beyond Time.”

The central concern of Ted Peters’ paper is how an eternal God can act, and be acted upon, in a temporal universe.  Classical theology made the problem particularly difficult by formulating the distinction between time and eternity as a “polar opposition.”  Peters’ fundamental move is to presuppose a Trinitarian doctrine of God, thus including relationality and dynamism within the divine.  By relating the economic and the immanent Trinity we take the temporality of the world into the divine life of God.  To substantiate this move, Peters turns to the understanding of temporality in physics and cosmology.  His overall aim is to show that the Trinitarian doctrine of God leads us to expect that the temporality of the world will be taken up eschatologically into God’s eternity.  

According to Peters, Scripture depicts God in temporal terms.  With the theology of Gregory of Nyssa, Augustine and Boethius, however, divine agency was understood as timeless, a view which came to pervade traditional Christian thought down to the present.  Might contemporary physics shed any light on this issue?  Peters cites Eleonore Stump, Norman Kretzmann, Ian Barbour and Holmes Rolston, each of whom suggests ways in which God might relate to the temporality of the cosmos.  Still Peters claims that, to the extent that their proposals conceive of eternity as timeless, they all fail to solve the underlying problem posed by God’s eternal experience of a temporal universe.  Can we instead conceive of God as “enveloping time,” transcending its beginning and its end and taking it up into the divine eternity?  According to Peters, Hawking would answer “no” to this question, for Hawking’s cosmology has no beginning and challenges the temporality of the universe as such.  Indeed Hawking draws “anti-theological” implications from his work:  with no initial singularity there is no need whatsoever for God.  Peters is critical of Hawking’s “anti-religious agenda” and points out that the God whom Hawking attacks is the God of deism, not the God of Christians, Jews and Muslims.  Moreover an alternate interpretation of the Hawking cosmology has been offered by Isham, who shows how God can be thought of as present to and active in all events of the universe even if there were no initial event. 

Peters then returns to the problem of reconceptualizing the divine eternity.  He is appreciative and yet critical of the thought of Wolfhart Pannenberg, who draws on holistic principles to interpret eschatology.  Such principles have important scientific as well as theological warrant.  Proleptic eschatology adds to the whole/part dialectic of science the claim that the whole is present as one part among others.  This theme is developed by Robert Jenson, who stresses that Yahweh’s eternity is “faithfulness through time,” and by Jürgen Moltmann, who turns to Christology and the dynamics of shared suffering to connect eternity and temporality.  This results in Moltmann’s modification of Rahner’s Rule:  the identification of the economic and the immanent Trinity will only be achieved eschatologically. 

Peters concludes by pointing to new directions for future research.  The doctrine of God might be required to explain the temporality of the world, including the arrow of time.  Moreover the movement between economic and immanent Trinity, through creation, incarnation, spiration and consummation, could be seen as bringing the history of creation into the life of God.

Polkinghorne, John. “The Laws of Nature and the Laws of Physics.”

In his paper, John Polkinghorne defends a version of critical realism in which the process of discovering the laws of nature is interpreted “verisimilitudinously as the tightening grasp” on reality.  Yet these laws ought not be reduced to those of fundamental physics; instead our experiences of macroscopic nature are to be taken equally seriously.  Polkinghorne accepts a “constitutive reductionism” (in that we are composed merely of fundamental particles) but he opposes “conceptual reductionism” (since the laws of biology cannot be reduced to those of physics).  Thus constitutive and holistic laws must be combined in some way. 

Polkinghorne’s proposal is that, to our usual notions of upward emergence (which address the qualitative novelty of mind and life), we must add “downward emergence, in which the laws of physics are but an asymptotic approximation to a more subtle (and more supple) whole.”  Polkinghorne sees his approach as contextualist:  the whole and the environment influence the behavior of the parts.  It is guided by the principles of coherence (the need to explain the known laws of physics, given this wider view), historic continuity (the world must permit our experience of free agency), and realism (not only the general claim that the world can be known through science but the explicit claim that “epistemology models ontology”). 

The reality thus known must include the phenomena of mind-brain.  Polkinghorne admits that no solution to the mind-brain problem is forthcoming, but hopes that his is a “suggestive way of beginning.”  Here the dynamic theory of chaos provides a vital clue.  Chaotic systems, though governed by deterministic equations, are highly sensitive to environmental circumstances and initial conditions.  They represent a form of “structured randomness” whose intrinsic unpredictability, according to Polkinghorne’s form of critical realism, means they are in fact ontologically indeterminate.  Thus he concludes that the future is open, involving “genuine novelty, genuine becoming.”  This in turn allows for human intentionality and divine action.

Polkinghorne then conceives of the operation of agency as the exchange of “‘active information’, the creation of novel forms carried by a flexible material substrate.”  Here Polkinghorne is contrasting agency as the transmission of information with agency as causal influence, which would include the “transaction of energy.”  Information transmission thus becomes a very general characteristic of living processes.  Quantum physics provides similar insights to chaos theory, but Polkinghorne is cautious about relying on it.  We ought not confuse randomness with freedom, and we need to remember both that the interpretation of quantum theory is still in dispute and that quantum equations may not exhibit chaotic solutions. 

With chaos theory as a basis, Polkinghorne returns to a suggestion he has previously considered, that the mind/brain problem leads to a “complementary metaphysics of mind/matter.”  Here, however, he relates this suggestion to the problem of divine agency.  A conception of nature as open allows us to understand God’s continuing interaction with nature as “information input into the flexibility of cosmic history.”  This entails a “free-process” defense in relation to physical evil, a hiddenness to God’s action in the world and a limitation on that for which we can pray.  Polkinghorne rejects the criticism that his is a “God of the gaps” strategy, since the open character of chaotic processes constitutes intrinsic gaps in nature revealed by science, not flaws in our knowledge of nature.  Likewise, he does not see himself making God into a finite causal agent, since God’s interaction with nature is through information, not energy.  Finally, in his proposal, God is highly temporal since the world is one of “true becoming.”  In this “dipolar (time/eternity) theism,” eternity and time are bound together in the divine nature.  God cannot know the future, since the future is not there to be known.  The divine kenosis thus includes an emptying of God’s omniscience.  But God is “ready for the future,” being able to bring about the eschatological fulfillment even if by way of contingent processes. 

Russell, Robert John. “Finite Creation without a Beginning: The Doctrine of Creation in Relation to Big Bang and Quantum Cosmologies.”

Robert Russell’s paper is divided into two sections.  In the first section Russell focuses on inflationary Big Bang cosmology and the problem of t=0.  The theological reaction to t=0 has thus far been rather mixed.  Some (such as Peters) have welcomed it as evidence of divine creation; others (such as Barbour and Peacocke) have dismissed it as irrelevant to the core of the creation tradition.  As Russell sees it, the argument on both sides has been shaped by the work of Gilkey in his Maker of Heaven and Earth.  Here Gilkey acknowledges that the problem of relating empirical and ontological language is a fundamental issue for theologians, reflecting what he later calls the “travail of Biblical language.”  However Russell is critical of Gilkey’s resolution of the problem, which begins with his use of the traditional distinction between what can be called “ontological origination” and “historical/empirical origination.”  Gilkey, citing Aquinas, seems to view these as strictly dichotomous alternatives.  One then either rejects the latter as theologically irrelevant (Gilkey’s position) or elevates the latter into the essential meaning of the former (the position Gilkey rejects).  In the first case, science, insofar as t=0 is concerned, plays no role in theology; in the second case it plays a normative role. 

Russell criticizes both extremes by attempting to undermine Gilkey’s assumption that the alternatives should form a strict dichotomy.  Instead he believes that historical/empirical origination provides an important corroborative meaning for ontological origination, although it is neither its essential nor even its central meaning, a view, incidentally, which he takes to be more in keeping with that of Aquinas.  Russell then argues that an important way of relating historical/empirical origination to ontological origination is through the concept of finitude.  This abstract concept, initially closely connected to ontological origination, can take on an important historical/empirical meaning in the context of cosmology, where the past temporal finitude of the universe is represented by the event, t=0.  Hence he argues that t=0 is relevant to the doctrine of creation ex nihilo if one interprets arguments about historical origination (such as found in t=0) as offering confirming, but neither conclusive nor essential, evidence for ontological origination.  In this way science plays a more vigorous role in the doctrine of creation than many scholars today allow but without providing its essential meaning.  In particular, taking a cue from the writings of Ian Barbour, Nancey Murphy and Philip Clayton, he frames his approach in terms of a Lakatosian research program in theology.  Creation ex nihilo as ontological origination will form the core hypothesis of this program, with t=0 entering as confirming evidence through the use of a series of auxiliary hypotheses involving the concept of finitude deployed in increasingly empirical contexts of meaning.

In the second section, he discusses the Hartle/Hawking proposal for quantum cosmology.  Their startling claim is that the universe, though having a finite age, has no beginning event, t=0; i.e., that the universe is finite but unbounded in the past.  How should this result affect the arguments in Part I?  To answer this, he first critically discusses the positions developed by Isham, Davies and Drees regarding the theological significance of the Hartle/Hawking proposal.  Next he presents Hawking’s own theological views and offers a counterargument to them.  Finally, his constructive position is that the Hartle/Hawking proposal, even if its scientific status is transitory, can teach us a great deal theologically.  First, given their work, we should distinguish between the theological claim that creation is temporally finite in the past and the further claim that this past is bounded by the event, t=0.  This leads to the important recognition that the first claim by itself is actually quite sufficient for creatio ex nihilo.  Hence Russell says we can set aside arguments specifically over t=0 and yet retain the historical/empirical sense of the past temporal finitude of creation.  Moreover, this insight, which he terms “finite creation without a beginning,” is valid whether or not the Hartle/Hawking proposal stands scientifically; thus it suggests that we can in fact work with “speculative proposals” at the frontiers of science instead of restricting ourselves necessarily to well-established results, as most scholars cautiously advise.  He views this generalization of the meaning of finitude as an additional auxiliary hypothesis to our research program and, following Lakatos again, looks for novel predictions it might entail and without which it would be ad hoc.

Thus, Russell analyzes the temporal status of the universe in terms of quantum gravity and general relativity.  The variety of ways time functions here (external, internal, phenomenological) and their implications for the temporality of the universe lead to important new directions for understanding God’s action as creator and the doctrine of creation.  From one perspective, the combination of quantum gravity and general relativity describes the universe as having domains of a temporal, of a timeless, and of a transitional character.  Accordingly we must reconsider God’s relation as Creator to each of these domains.  Here the generalization of the concept of finitude to include an unbounded finitude might allow us to claim the occurrence of the transition domain as a Lakatosian “novel fact” of our research program.  From a different perspective, however, quantum gravity can be taken as the fundamental theory replacing general relativity.  Here God’s relation to the universe as a whole will need to be reinterpreted in terms of the complex role and status of temporality in quantum gravity.  In either case, God’s activity as Creator is not limited to a ‘first moment’ (whether or not one exists) but extends to the entire domain of nature, returning us to the general problem of divine action in light of science.  Russell closes by pointing, then, to the need to rethink the current models of divine agency and of the relation between time and eternity in terms of a more complex understanding of temporality from a Trinitarian perspective informed by quantum physics and quantum cosmology.

Stoeger, William R. “Contemporary Physics and the Ontological Status of the Laws of Nature.”

How should we think of the laws of nature?  Bill Stoeger poses this as “an absolutely crucial question” underlying the entire discussion of science, philosophy and theology.  In his essay, Stoeger defends the thesis that the laws, although revealing fundamental regularities in nature, are not the source of those regularities, much less of their physical necessity.  They are descriptive and not prescriptive and do not exist independently of the reality they describe.  Stoeger thus rejects a “Platonic” interpretation of the laws of nature.  They have no pre-existence with respect to nature; this means that they do not ultimately explain why nature is as it is.  Instead, the regularities which the laws of nature describe stem from the regularities of physical reality itself, a reality whose complexity subverts any attempt at a reductionist approach to science.  Thus a “theory of everything” is ruled out, and the possibility in principle of God’s acting in the world is strongly affirmed. 

The laws of nature are approximate models, idealized constructions which can never be complete and isomorphic descriptions of nature.  Prevailing theories are eventually replaced or subsumed, often entailing a radically different concept of nature.  Moreover, no theory, no matter how complete, can answer the ultimate question:  why nature is as it is and not some other way.  Stoeger is thus critical of those realists who make excessive claims about the correspondence between theory and the structures of reality.  “The illusion that we are somehow discerning reality as it truly is in itself is a pervasive and dangerous one.”  Stoeger also argues that ontological reductionism and determinism are untenable.  The laws of nature are in fact human constructions guided by careful research.  The intermediate-level regularities which they model originate “in the relationships of fundamental entities in a multi-layered universe,” many of which remain beyond our purview.  An understanding of the ultimate origins of these underlying regularities takes us to the limits of what can be known. 

Stoeger’s account of the status of the laws of nature leads him to argue that the laws neither exist independently of the universe nor are they prescriptive of its behavior.  It thus does not make sense to suppose that there may be other sets of actual or potential laws that might describe universes different from our own.  This reduces the cogency of “many-worlds” arguments which hypothesize the existence of other universes as a means of explaining away the (supposed) fine-tuning of our own universe.

Stoeger then turns to the problem of divine action in light of his nuanced realism.  God can be thought of as acting through the laws of nature.  However the term ‘laws’ refers here to the underlying relations in nature and not principally to our imperfect and idealized models of them.  Moreover, as their ultimate source, God’s relationship to these laws will be ‘from within’ and God will not need to formalize it.  Our relationship to them will always be ‘from without’ and it will be only partially manifested through our laws.  Finally, as imperfect models of the regularities and relationships we find in nature, our laws only deal with general features in nature.  They cannot subsume the particular, special and personal aspects, though these aspects are part of the deeper underlying regularities and relationships of nature.  It is through these aspects, as well, that God acts.   

Ward, Keith. “God as a Principle of Cosmological Explanation.”

In his paper Keith Ward moves ‘both ways’ between theology and cosmology.  He begins with a summary of the traditional doctrine of creation:  God is a non-spatio-temporal being, transcending all that is created, including spacetime, although immanent to all creation as its omnipresent Creator.  Divine eternity is thus timeless, for God has neither internal nor external temporal relations.  The act of creation is one of non-temporal causation.  Whether there was a first moment is irrelevant to the doctrine.

Ward admits that this view of God is congruent with the block universe interpretation of special relativity, but he is highly critical of it.  Ward maintains that the doctrine of creation does not entail a timeless God.  Although God transcends spacetime as its cause, God is nevertheless temporal, since “. . . by creating spacetime, God creates new temporal relations in the Divine being itself.”  Allowing God to have temporal relations makes it possible for God to act in new ways, make new decisions and bring into being in time an infinite number of new things.  The inclusion of divine contingency along with divine necessity enriches the concept of omnipotence. 

Ward distinguishes his view of God from that of process theism.  He maintains God’s omnipotence and still affirms free will by appealing to divine self-limitation.  The advantage over Whitehead is that God’s omnipotence will “ensure that all the evil caused by the misuse of creaturely freedom will be ordered to good . . .” 

Ward then relates nomological models, which are dominant in physics and involve general principles and ultimate brute facts, to axiological models, which arise in the social sciences and describe the free realization of ultimate values.  A nomological model realizes an aesthetic value, since the laws of nature are elegant and simple.  An axiological model is ultimately factual, since values arise out of the natural capacities of sentient beings as described by physics and evolutionary biology.  This inter-relationship is central to the Christian claim that “. . . goodness is rooted in the nature of things, and is not some sort of arbitrary decision or purely subjective expression of feeling.”

Quantum cosmologists attempt to offer a secular explanation of ultimate brute facts, but this minimizes the importance of freedom, creativity, and the realization of values.  Theism can offer a comparable explanation of nature, but its advantage lies in its combination of nomological and axiological explanations.  Theism is thus “the best possible intelligible explanation of the universe” and “the completion of that search for intelligibility which characterizes the scientific enterprise.”  He urges that we reconstruct the doctrine of creation in terms of creative emergence, i.e., the novel realization of intrinsic values grounded in the divine nature and emerging through the cooperative acts of rational creatures. 

Modern cosmology “sets the notion of Divine action in its broadest and most all- embracing context.”  The laws of nature realize God’s purposes, understood as potentialities in the structure of reality and not interferences from an alien power.  Miracles are “transformations of the physical to disclose its spiritual foundation and goal . . .”  Thus theism “can be seen as an implication of the scientific attitude itself, and the pursuit of scientific understanding may be seen as converging upon the religious quest for self-transforming knowledge of God . . .”

Quantum Mechanics

This collection of fifteen essays explores the creative interaction among quantum physics, philosophy, and theology. It presents the results of the fifth international research conference co-sponsored by the Vatican Observatory, Rome, and the Center for Theology and the Natural Sciences, Berkeley. The overarching goal of these conferences is to support the engagement of constructive theology with the natural sciences and to investigate the philosophical and theological elements in ongoing theoretical research in the natural sciences. In the first section of this collection, contributors examine the scientific and historical context. Section two features essays covering a wide range of philosophical interpretations of quantum mechanics. The final set of essays explores the theological implications of quantum theory.

Berry, Michael. “Chaos and the Semiclassical Limit of Quantum Mechanics (Is the Moon There When Somebody Looks?)"

Michael Berry’s essay addresses the problematic relation between the presence of chaos in classical mechanics and its absence in quantum mechanics. If classical mechanics is the limit of quantum mechanics when Planck’s constant h can be ignored, why does a system appear nonchaotic according to quantum mechanics and yet chaotic when we set h = 0? Moreover, if all systems obey quantum mechanics, including macroscopic ones like the moon, why do they evolve chaotically? Berry’s approach is to locate this problem within a larger one: namely the mathematical reduction of one theory to another. His claim is that many of the problems associated with reduction arise because of singular limits, which both obstruct the smooth reduction of theories and point to rich “borderland physics” between theories. The limit h → 0 is one such singular limit, and this fact sheds light on the problem of reduction in several ways. First of all, nonclassical phenomena will emerge as h → 0. Secondly, the limit of long times (t → ∞), which are required for chaos to emerge in classical mechanics, and the limit h → 0, do not commute, creating further difficulties.
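Berry’s point about noncommuting limits can be stated compactly. In notation of my own (not Berry’s), for some quantity Q(t, h) characterizing the dynamics of a system:

```latex
\lim_{h \to 0}\,\lim_{t \to \infty} Q(t, h)
\;\neq\;
\lim_{t \to \infty}\,\lim_{h \to 0} Q(t, h)
```

Taking the classical limit first recovers chaotic behavior at long times; taking the long-time limit first, the quantum suppression of chaos prevails.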

To illustrate the role of singularities in the semiclassical limit, Berry first considers a simple example: two incident beams of coherent light. Quantum mechanics predicts interference fringes, and these fringes persist as h → 0 due to the singularity in the quantum treatment. But in the geometrical-optics form of classical physics (where the wave-like nature of light is ignored) there are no fringes, only the simple addition of two light sources. To regain the correspondence principle between classical and quantum mechanics we must first average over phase-scrambling effects due to the influence of the physical environment in a process called “decoherence.”

A second, more complex, example illustrates the relation between these singularities and chaos. Berry describes the chaotic rotational motion of Hyperion, a satellite of Saturn. Regarded as a quantum object, Hyperion’s chaotic behavior should rapidly be suppressed. Remarkably, however, the suppression is itself suppressed due to decoherence: even the “kicks” from photons from the sun on Hyperion are enough to induce decoherence. This means that, while it is true that chaos magnifies any uncertainty, in the quantum case the magnification would wind up suppressing chaos if this suppression were not itself suppressed by decoherence induced by interactions with the environment.

Finally, Berry turns to emergent semiclassical phenomena. These phenomena do not involve chaos, and unlike more familiar examples of macroscopic quantum phenomena such as superfluidity, their detection requires magnification. His first example is the focusing of a family of light trajectories, such as rainbows or light patterns in a swimming pool. These patterns, or caustics, are singularities in geometrical optics. But upon microscopic examination, caustics dissolve into intricate interference patterns which catastrophe theory describes as emergent semiclassical phenomena called diffraction catastrophes. His second example is spectral universality: if we consider quantum systems whose classical mechanical treatment is chaotic, we find that the statistics of the spectra of all such systems are the same. Spectral universality is nonclassical, because it is a property of discrete energy levels, and it is semiclassically emergent because the number of levels increases in the classical limit, h → 0. Berry’s conclusion is that, as we generalize to a deeper theory, the singularities of the old theory are dissolved and replaced by new ones.

Butterfield, Jeremy. “Some Worlds of Quantum Theory."

The overarching aim of Jeremy Butterfield’s essay is to discuss the strange ontologies that emerge from various interpretations of quantum mechanics. He begins with various proposed solutions to the measurement problem and then focuses on the Everettian interpretation developed by Simon Saunders and David Wallace. At the very outset, however, Butterfield stresses the highly problematic character of quantum indeterminism: it only appears in some interpretations of quantum theory and it involves a highly nonclassical ontology. He acknowledges the enormous empirical success of quantum theory but notes that considerable problems arise in reconciling it with special and general relativity, and he argues strongly against reductionism. He then provides a brief summary of the formalism of quantum theory, including a discussion of pure states, mixed states, and the meaning of probability in quantum theory.

Butterfield begins his discussion of the measurement problem with the orthodox view: during the process of measurement both the quantum system and the measurement apparatus go to a definite state (the “collapse of the wavepacket” for both system and apparatus). Yet according to the linearity of the Schrödinger equation, there is no such “collapse” - the indefiniteness of the quantum system should be transmitted to the measurement apparatus, leaving the position of the “pointer” indefinite. How then can we justify ascribing a definite state to the pointer? According to Butterfield, recent work on decoherence suggests that the continual interaction of the pointer with its environment brings the pointer “very close” to a definite state by leaving it in a narrow mixture of definite states. Still, a complete solution to the measurement problem requires more than what decoherence can provide.
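The linearity argument Butterfield invokes can be made explicit. In schematic notation of my own (a two-state system coupled to a measurement pointer), unitary Schrödinger evolution transmits the system’s superposition to the apparatus:

```latex
\bigl(\alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle\bigr)\,|\mathrm{ready}\rangle
\;\longrightarrow\;
\alpha\,|{\uparrow}\rangle\,|\mathrm{up}\rangle
\;+\;
\beta\,|{\downarrow}\rangle\,|\mathrm{down}\rangle
```

The result leaves the pointer with no definite position, contrary to what we observe.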

We thus face two choices. The first choice is either to abandon orthodox quantum theory’s law of temporal evolution (the Schrödinger equation) and to propose new equations so that the collapse of the wavepacket is a physical process (Butterfield calls this choice Dynamics), or to assign values to additional quantities not given by orthodox quantum theory (ExtraValues). The second choice is either to secure definite values for familiar quantities like the position of macroscopic objects (DefMac) or to be satisfied if the macrorealm only appears definite (DefApp). Combining these two choices yields four broad strategies for solving the measurement problem. An example of the Dynamics/DefApp strategy is that of Wigner and Stapp, where mind or consciousness produces the collapse of the wavepacket, while the ExtraValues/DefMac strategy of de Broglie and Bohm ascribes definite values to the position of the quantum system and introduces a “pilot-wave” to guide the system.

Butterfield then develops the ExtraValues/DefApp strategy proposed by Everettians. Here one retains the Schrödinger evolution of a strictly isolated system and allows an indefinite macrorealm, but posits extra values to secure definite appearances (measurements) by appealing to decoherence. In its simplest version, this strategy ascribes a wavefunction, Ψ (specifically, a pure state), to the universe as follows: Ψ is a superposition corresponding to numerous different “macrorealms” (or “worlds” or “branches”) that evolves according to the Schrödinger equation. According to Butterfield, the Everettians now propose a “breath-taking main idea”: all these macrorealms are actual. He devotes much of the remainder of his essay to the meaning of macrorealms from the perspective of both “many-minds” and “many-worlds” interpretations, how macrorealms evolve over time, and what he sees as the attendant vagueness in these approaches.
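The Everettian ascription can be written schematically (the notation is mine): the universal pure state is a superposition of branch states, each corresponding to a macrorealm, and the whole evolves unitarily:

```latex
|\Psi\rangle \;=\; \sum_i c_i\,|\mathrm{macrorealm}_i\rangle,
\qquad
i\hbar\,\frac{\partial}{\partial t}\,|\Psi\rangle \;=\; \hat{H}\,|\Psi\rangle
```

No branch is ever singled out by a collapse; the “breath-taking” Everettian claim is that every term in the sum describes an actual macrorealm.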

His primary focus is the work of Saunders and Wallace, including their appeal to anthropocentrism, their definition of ‘world’ as relative state, their use of decoherence, and their arguments against a precise definition of ‘branch’. A central concern is that of identity over time. Philosophers treat change of properties over time in two rival ways: either as properties of objects that persist self-identically over time (i.e., objects that “endure”), or as properties of objects having stages or temporal parts (i.e., objects that “perdure”). Similarly, for Everettians, we could say that the pointer has different properties (e.g., positions) in different worlds, or that there are multiple pointers similar to one another (“copies”) but with different properties in different worlds. He discusses the pros and cons of the “copy” choice most Everettians make, and he endorses the analogy between “worlds” as conceived by Everettians and instants of time represented by a “block universe” conception of time.

In his final section, Butterfield discusses the dynamics of the universe as a whole and of its subsystems. Everettians find the deterministic evolution of the universe as a whole to be compatible with the indeterministic evolution of its open subsystems and with the “almost deterministic” evolution of isolated subsystems. He closes by suggesting that, although the relativistic invariance of the universe as a whole is clear, a lacuna remains concerning the relativistic invariance of its subsystems.

Chiao, Raymond Y. “Quantum Nonlocalities: Experimental Evidence."

The purpose of Raymond Chiao’s essay is to show, by a careful discussion of specific experiments, that the world possesses at least three kinds of nonlocal action-at-a-distance. Chiao first defines action-at-a-distance in general as a correlation between effects, events, or conditions separated by a spacelike interval. (If a light signal cannot be sent between two events, their separation in space and time is called a “spacelike interval.”) Quantum nonlocality, in particular, is a form of action-at-a-distance which has no classical explanation. But does quantum nonlocality violate special relativity? Not according to Chiao, for two reasons: quantum nonlocality never reverses the order of cause and effect, and quantum events cannot be used for signaling because of their fundamentally probabilistic, uncontrollable nature.
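The spacelike condition can be stated in standard special-relativistic notation (mine, not Chiao’s): two events separated by Δt in time and Δx in space are spacelike separated when the invariant interval is negative,

```latex
(\Delta s)^2 \;=\; c^2(\Delta t)^2 - (\Delta x)^2 \;<\; 0
```

so that not even a light signal (traveling at speed c) could connect them.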

Chiao then interprets all three kinds of quantum nonlocalities as resulting from the superposition principle (i.e., quantum interference) in which the sum of allowable states is also an allowable state. In the first and third examples, namely the Aharonov-Bohm effect and the tunnel effect, nonlocality arises out of single-particle interference. In the second case, the Einstein-Podolsky-Rosen effect, non-locality involves two-particle interference, i.e., an entangled state.

1. In the Aharonov-Bohm experiment, a beam of electrons is split, one beam passing through the hole of a superconducting torus, the other around the torus. After being rejoined, the beam displays a single-particle interference pattern whose phase shift depends on the magnetic flux contained by the torus. Chiao sees this kind of quantum nonlocality as topological in nature: it is the global topology of the split beams that leads to the local interference effect. The phenomenon arises from the local gauge invariance of the electromagnetic interaction, and it can be used to explain the Lorentz force.

2. In the EPR experiment with two particles, quantum nonlocality arises from the “nonfactorizability” of the quantum states: since the two-particle state of the system is the superposition of the products of the two states of the individual particles, the superposition cannot be factored mathematically into separate states for each particle. Being so entangled, the state of the system, when measured, depends on the states of both particles in the system regardless of the distance separating them. In the 1960s, John Bell showed that EPR results violate the philosophical assumptions Einstein and his colleagues made in defending “local realism.” Bell then proposed that properties of particles, such as position, momentum, spin, etc., do not exist until they are observed, reminiscent of Berkeley’s idealism. Chiao presses this point further, claiming that the nonfactorizability of the entangled states implies the “nonseparability” of the quantum world.
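A minimal numerical sketch (not from Chiao’s essay) illustrates the tension Bell identified: the standard quantum prediction for singlet-state spin correlations, E(a, b) = −cos(a − b), violates the CHSH form of Bell’s inequality, which bounds any local hidden-variable account at 2:

```python
import math

# Standard quantum prediction for the spin correlation of two particles
# in the singlet state, measured along analyzer angles a and b (radians).
def E(a, b):
    return -math.cos(a - b)

# CHSH combination: any local hidden-variable theory satisfies S <= 2.
def chsh(a1, a2, b1, b2):
    return abs(E(a1, b1) - E(a1, b2)) + abs(E(a2, b1) + E(a2, b2))

# Analyzer settings that maximize the quantum violation.
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(S)  # 2*sqrt(2) ≈ 2.828, exceeding the classical bound of 2
```

The value 2√2 ≈ 2.83 exceeds the classical bound, which is what EPR-type experiments of the kind Chiao describes confirm.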

Chiao then describes the EPR experiment performed in his lab, in which pairs of photons are prepared in an entangled state by spontaneous parametric down-conversion in a nonlinear crystal. Chiao used Franson’s modification where Mach-Zehnder interferometers replaced Fabry-Perot and Michelson interferometers in the detection process. This leads to a modified Bell inequality in which the twin photons possess neither definite energy (color) nor a definite time of emission prior to their detection. Moreover, nonlocality is further demonstrated by the fact that a change in the path length of one of the interferometer arms changes the behavior of the photon which passes along the other, unchanged, interferometer arm.

3. The third kind of nonlocality occurs in quantum tunneling, where certain kinds of superluminal velocities are possible during tunneling. Here two photons are emitted simultaneously and their arrival times at equal distances are measured. If a barrier is inserted into one of the paths, the difference in arrival times constitutes a precise definition of the tunneling time. But will the photon traversing the tunneling path arrive before or after the photon traveling along the free path? In theory, a superluminal result is possible, in which the tunneling photon arrives first. Chiao shows why this result does not violate relativity: relativity allows for superluminal group velocities and only forbids superluminal front velocities. Moreover, such superluminal effects are governed by the uncertainty principle, and thus cannot constitute a controllable signal.

He then describes in detail the resulting experiment using a Hong-Ou-Mandel interferometer. Chiao’s results help decide among three conflicting theories about how to define tunneling time, and they show that the tunneling process is indeed superluminal in the allowable sense. According to Chiao, such superluminal tunneling implies a third kind of nonlocality: an observer moving past the barrier at close to the speed of light would infer that the particle exists simultaneously at both the entrance and the exit faces of the barrier!

Chiao concludes with some additional philosophical and theological reflections in light of these results and his Christian faith. He supports a “neo-Berkeleyan” point of view in which the free choices of observers lead to nonlocal correlations of properties of quantum systems in time as well as in space, giving Berkeley’s dictum, esse est percipi, temporal as well as spatial significance. Theologically he uses this generalized Berkeleyan point of view to depict God as the Observer of the universe. Here God creates the universe as a whole (ex nihilo) and every event in time (creatio continua). The quantum nonseparability of the universe is suggestive of the New Testament’s view of the unity of creation. In the process Chiao discusses such ideas as the quantum entanglement of all events in the universe given their common origin in the Big Bang, and he responds to the challenge of the quantum Zeno paradox.

Clarke, Chris. “The Histories Interpretation of Quantum Theory and the Problem of Human/Divine Action."

The overall aim of Chris Clarke’s essay is to show how a modification of the consistent-histories interpretation of quantum mechanics provides a natural setting for understanding human and divine action. For Clarke, religion is largely about finding the meaning of the “good life,” and our aim is to help people live it. Hence we tell stories about the world we live in, some deriving from science, others from the great myths of religion. Weaving them together is important even though no part of the story represents a reality independent of ourselves and not all parts are equally supported by experiment. Clarke draws upon the phenomenology of Heidegger and Merleau-Ponty to argue that objective reality is located in the “second-person relationship” that lies between subject and object. Quantum theory plays an essential role in Clarke’s understanding of “the interplay of self, society, and Other” through which the concreteness of the world emerges.

Clarke then turns to the central problem of quantum mechanics: if it is a generalization of, and not an alternative to, classical mechanics, should we not use it to describe all of physics in a unified way? Yet if we do so, the theory predicts that macroscopic phenomena will be superpositions of states “flagrantly at variance with our experience,” as the Schrödinger’s cat experiment vividly depicts. Bohr and Von Neumann avoided this problem by dividing the world into the quantum system and its classical environment, and they characterized these realms by two separate time developments. But Clarke’s goal is an overall picture of the world that places the observer and the observed system on the same footing. To do so, Clarke focuses on the consistent-histories approach in which state reduction is unnecessary and only appearances are definite.

A “histories” approach links a sequence of preparation and measurement pairs such that each measurement becomes the preparation for the next. A “consistent” histories approach tries to rule out superpositions of macroscopically distinguishable states by considering only those histories whose probabilities obey classical logic. The approach was introduced by Robert Griffiths in 1984 and then extended to cosmology by Murray Gell-Mann and James Hartle, but problems were soon raised by Dowker and Kent. Clarke’s hope is to reformulate the approach to avoid these problems and then relate it to human and divine agency. To do so, he focuses on how we might restrict the possibilities of future histories given a fixed and acceptable history up to the present, looking at unmodified dynamics within sets of histories. This leads Clarke to propose a specific definition of consistency in terms of logical exclusivity and the rules of quantum mechanics, the physical significance of which arises through decoherence. As it turns out, although all classically acceptable histories are consistent, not all consistent histories are acceptable; additional structure is needed to single out acceptable histories. According to Clarke, the past history from which the future is predicted can help provide such a structure. This approach does not divide the world into classical and quantum domains, nor does it involve the collapse of the wavefunction. Moreover, in this approach, the history of the world, as both contingent and governed by decoherence, accounts for why the universe will continue to be classical.
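For reference, the standard consistency condition in the Gell-Mann–Hartle formulation (which Clarke’s own definition modifies) requires the interference between distinct histories α and α′ to vanish in the decoherence functional:

```latex
D(\alpha, \alpha') \;=\; \mathrm{Tr}\bigl(C_{\alpha}\,\rho\,C_{\alpha'}^{\dagger}\bigr),
\qquad
\mathrm{Re}\,D(\alpha, \alpha') = 0 \quad \text{for } \alpha \neq \alpha'
```

Here C_α is the time-ordered chain of projectors defining history α and ρ is the initial state; only when the condition holds do the probabilities D(α, α) obey classical logic.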

Clarke then turns to the issue of human agency and, by analogy, divine action. According to Mae-Wan Ho, living organisms exhibit coherence, maintaining phase relations between the quantum states of their constituents over considerable distances and times. If so, our experience might not obey classical logic, and the non-Boolean aspects of an organism’s own history may be observationally detectable by an external observer. Quantum mechanics thus opens the possibility that we can share histories, at least momentarily. This provides Clarke with a way to engage the “other minds” problem. Drawing on Heidegger and Levinas he discusses the objective world in terms of the interrelation of beings-in-the-world. The “co-creation of the universe” then arises through the set of such intermeshing histories. The consistent-histories approach also allows us to move beyond the “determinism vs. random” debates about free will. Instead, decision-making involves a shift from one consistent Boolean logic to another. We experience this as creativity, though the shift appears random to others. Our free will is thus characterized by the simultaneous creation of volition and a framework of meaning which justifies this volition.

Clarke describes his experience of the divine as of a guidance that is immanent in the concrete flow of events and yet transcendent, not contained in any horizon. He uses the idea of “entrainment” in quantum theory, where previous events are realized in the present, to describe divine action as top-down entrainment, coordinating and informing the individual acts of will that it contains, rather than as a divine influence at the atomic level. This concept of divinity goes beyond pantheism, but coheres with panentheism and with a view of divinity embodied in the world as suggested by the second mahavakya of the Upanishads.

Clayton, Philip. “Tracing the Lines: Constraint and Freedom In the Movement from Quantum Physics to Theology."

Philip Clayton gives two reasons why constructive theology should engage in dialogue with quantum physics: it cannot afford the fideistic position that results from disengaging with science, and it should seek a more hypothetical, fallible, and revisionist method than traditionally allowed, thus opening itself up to the engagement without becoming fully relativistic. Clayton’s method will be to consider an array of interpretive models in a specific scientific field. He will then look for areas of compatibility with theology, in the process revising both theology and science. Eventually, he will repeat this activity across a variety of disciplines. But why should physics provide constraints on how God might act? Clayton’s response: it can do so if divine (or human) agency occurs in the physical world in conformity with physical law. This obviously holds for us, and it may indeed hold for the way God chooses to act. It thus becomes Clayton’s “wager”: the structure of the physical world sets parameters on, and tells us about the manner in which, God can act. He locates this position midway between those who give a purely subjective account of theology or who worry that quantum mechanics will not be helpful for divine action, and those who seek even stronger theological conclusions from science or possibly the convergence of science and religion.

Clayton then explores three quantum mechanical constraints on divine action. The first is the role of the observer. Minimalists focus on macroscopic measurements by an observer who is never within the quantum mechanical system being studied. Maximalists introduce subjectivity and consciousness in explaining a quantum experiment, despite the resistance of many physicists. Here Clayton finds another crucial issue at work: reductionism assumed by minimalists versus emergence and even dualism assumed by maximalists. The second issue is the “many-worlds” interpretation as represented by Hugh Everett, Bryce DeWitt, and others, compared with those who defend the irreducible role of subjectivity in nature, such as Eugene Wigner, John von Neumann, John Wheeler, Henry Stapp, and Roger Penrose. Both of these interpretations are deeply influenced by metaphysics: physicalists who accept a branching universe versus subjectivists who view quantum mechanics as evidence of mind as irreducible in nature. Clayton then turns to his third issue, indeterminism and free will. He reminds us how the early defenders of the Copenhagen view saw the free choice of an experimenter as playing an irreducible role in the outcome of the experiment. Despite counterarguments, Clayton claims that ontological indeterminism remains a significant factor in these debates: it seems a necessary condition for an incompatibilist view of free will, particularly if incompatibilist free choices are to be enacted in the world. In turn one can argue that God so created the world as to allow for human freedom.

Clayton then argues that questions like these three show that physics and philosophy lie on a continuum, particularly when the philosophical questions are closely connected with physics research. Even theology lies on this continuum, though it is further removed from physics research than from philosophy. Clayton now turns to three additional issues. First he considers Bernard d’Espagnat’s ontology, terming it “Spinozistic Monism.” Here the state vector expresses properties of a deeper, underlying reality which we can never describe in itself but which is manifested in what we observe and which can be understood as Being. Next Clayton engages critically those who interpret quantum physics in terms of Eastern mysticism, including Capra, Bohm, and Wilber. Though their stress on holism may be compelling, their metaphysical conclusions, like any others, are options not directly supported by physics. Finally Clayton turns to theistic metaphysics, considering both classical theism and panentheism. Theism asserts that the world as it appears to us is real and that it has its origin in an ultimate principle called spirit. The divine spirit is an active principle in this world and is in many ways personal. Classical theism has advantages over the preceding views, but it can become problematic if it places too great a distance between God and the world; the analogy with human agency breaks down for a fully disembodied view of God. To Clayton, panentheism avoids some of these difficulties, particularly as it understands the world to be within God even while, as with classical theism, God is more than and distinct from the world. Here each physical event can be an expression of divine agency in a “top-down” manner which does not violate physical law. It also provides a metaphysics that coheres nicely with some of the interpretations of quantum physics previously discussed, particularly those which stress holism, veiled reality, interconnection, and interdependence.

Cushing, James T. “Determinism Versus Indeterminism in Quantum Mechanics: A ‘Free’ Choice."

James T. Cushing sees the question of determinism versus indeterminism as “the fundamental issue” regarding the possibilities for particular divine action, and thus as the reason quantum mechanics matters here. His central point is that “considerations of empirical adequacy and logical consistency alone” do not force one to choose the indeterministic view of quantum mechanics as found in the Copenhagen interpretation - Bohm offers an empirically valid deterministic alternative.

Cushing begins by defining a theory as a formalism (i.e., a set of equations and a set of calculational rules for making predictions that can be tested) and an interpretation (i.e., what the theory tells us about the underlying structure of the world). Quantum mechanics presents us with one formalism but two different interpretations, and thus two different theories. The Copenhagen version affirms the completeness of the state-vector description, the principle of complementarity, nonclassical probability, and a prohibition against “any possible alternative causal description in a spacetime background.” Physical processes are thus both indeterministic and nonlocal. Bohmian mechanics is an objective (realist) and deterministic account in which the positions of the particles of the system function as “hidden variables” and must be included in a complete state description. As in the Copenhagen interpretation, the Schrödinger equation governs the evolution of the wave function, but an additional “guidance condition” governs the evolution of the particles’ positions. With the inclusion of a quantum-equilibrium statistical distribution, Bohm’s theory is empirically identical with standard quantum mechanics. Its ontology depicts particles following definite trajectories that are completely deterministic and observer-independent. The ontology, however, is nonlocal: instantaneous, long-range influences are included. Still Bohmian nonlocality is “benign,” since the “no-signaling” theorem of quantum mechanics prohibits sending messages faster than light.
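The two dynamical laws of Bohmian mechanics described here take a standard form (the notation is mine, not Cushing’s): the Schrödinger equation for the wavefunction ψ, plus the guidance condition for the particle positions Q_k:

```latex
i\hbar\,\frac{\partial \psi}{\partial t} \;=\; \hat{H}\psi,
\qquad
\frac{dQ_k}{dt} \;=\; \frac{\hbar}{m_k}\,
\mathrm{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)\!\bigg|_{(Q_1,\ldots,Q_N)}
```

With initial positions distributed according to |ψ|² (the quantum-equilibrium distribution), the theory’s statistical predictions match those of standard quantum mechanics.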

In Bohm’s theory, the quantum potential U conveys the influence of the environment on the particle, while U is determined, in turn, by the wavefunction. This means that the measurement process is “an act of discovery - there is no quantum-mechanical measurement problem.” All observations are, ultimately, position measurements, a feature which reflects our own existence in coordinate space. The classical limit corresponds to U being negligible and is thus more precise than the limit h → 0. From an empirical perspective, Bohm’s theory is not only completely equivalent to standard quantum mechanics, but it also captures Bohr’s concept of quantum holism and his principle of complementarity. As Cushing puts it, observed values in Bohm’s theory are “contextual.”
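In the standard polar decomposition of the wavefunction (again my notation, not Cushing’s), writing ψ in terms of a real amplitude R and phase S yields the quantum potential:

```latex
\psi \;=\; R\,e^{iS/\hbar},
\qquad
U \;=\; -\frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R}
```

The classical limit is recovered when U is negligible compared with the classical potential, a condition sharper than simply letting h → 0.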

Bell’s theorem shows that our world cannot be both objectively real and local. Cushing suggests that locality is the real problem, but reminds us that Bohm offers a nonlocal, deterministic hidden-variables theory. In order to discuss Bohmian ontology, Cushing points to “relational holism” since it seems to offer a better conceptual framework than one which distinguishes between separability and locality. It also suggests a world of temporal becoming since it includes a preferred frame for instantaneous action. Still this world is one in which everything, including the future, is determined. Such a world is reminiscent of Newton’s idea of space as the divine sensorium. It certainly poses a challenge to our ideas of free will and divine action - as does the problem of evil.

In short, then, the choice to accept the Copenhagen view and reject that of Bohm is not a forced move based on logic or empirical adequacy; it is made on other grounds. Similarly, one might choose an indeterministic view of quantum mechanics for theological reasons, but one should not claim that quantum mechanics provides independent, scientifically arguable grounds for such a choice.

Ellis, George F.R. “Quantum Theory and the Macroscopic World."

George F.R. Ellis lays out a thoroughgoing critique of reductionism and a complex ontology for reality as a whole, drawing from his previous publications and offering new reflections in light of quantum theory. Against reductionism, Ellis claims that nature is hierarchically structured, with emergent levels of order and meaning, as well as bottom-up and top-down action, occurring throughout the hierarchy. He begins with classical physics, chemistry, and biology, where reductionism is framed in terms of micro-to-macro relations of bottom-up deterministic causality. But Ellis notes that quantum processes give rise to the regularities of the classical world, and that they can have macroscopic results in the classical world. Ellis’s examples include amplifiers such as instruments (e.g., photomultipliers), biological organs (e.g., the eye) and processes (e.g., genetic mutations expressed in the organism); coherent implementation of micro-effects; “essentially quantum effects at the macrolevel” (e.g., superconductivity); and quantum entanglement (e.g., Bose-Einstein condensates). These examples undermine the reductionist claim that the properties of parts entirely determine the properties of wholes.

He then gives numerous examples of macro-to-micro relations indicative of top-down action in physics (e.g., nucleosynthesis), in biology (e.g., adaptation, expression of genetic information), and in human volition (e.g., intentions that lead to actions). Quantum physics provides several ways to understand how these phenomena can occur at the level of physics: interaction potentials, experiments and the collapse of the wavefunction, state preparation, decoherence, and the arrow of time. These features, according to Ellis, discredit reductionism in several ways. In hierarchically structured systems there is top-down action as well as bottom-up action. The outcome, even when there is determinism and mechanism at the microlevel, is partially affected by the context of boundary conditions, macroconstraints, and macroinfluences. Here systems thinking, based on synthesis, is needed as well as reductionistic analysis. Quantum entanglement provides another crucial argument against reductionism. Not only do cooperative effects between constituents of entangled states modify their individual behavior, but entanglement makes it hard to speak in terms of independent properties of constituent parts. Quantum uncertainty further undermines reductionism not only in microsystems but also in macrosystems when micro-uncertainties are amplified. Thus simplistic ideas of reductionism, that “we are nothing but the sum of particles controlled by forces at the microlevel,” do not hold. They must be replaced by more sophisticated views integrating bottom-up (“microcausation”), bottom-bottom (“co-operative”), and top-down (“context-setting”) interactions. Still, while Ellis sees reductionism as wrong in principle, reductionism in practice is legitimate. 
He offers criteria for when it is suitable (e.g., when bottom-up causation dominates and microcomponents maintain their individual properties) and when it is not suitable (e.g., when either top-down causation is involved or when cooperative phenomena change the behavior of component parts). He closes this section with comments on two issues related to quantum theory and ontology: the possibility of chaotic/fractal structures in quantum processes, and the status of the theoretical and mathematical terms in quantum theory (e.g., are they human invention or Platonic realities?).

Ellis then develops an elaborate ontology to describe the many different aspects of material, human, and ideational reality, building on the works of Popper, Eccles, Penrose and others. In his hierarchical structure, ontological status and phenomenological laws of behavior are assigned to higher, as well as lower, levels. His ontology includes: Worlds 1 (the physical world), 2 (individual and communal consciousness), 3 (Aristotelian possibilities), 4 (Platonic abstract realities), and 5 (underlying purpose). He offers for their foundation and ultimate context, World 0 (God), and he describes the complex ways these Worlds relate to each other. He concludes with a discussion of God’s action in the world, drawing from his previous publications with Nancey Murphy. Divine action is kenotic, revealing God’s purpose and the ethical core of God’s nature through personal religious experience. Divine action, in turn, requires an openness in physical processes such that God’s action has real causal effects in the physical world. The ontological nature of quantum uncertainty provides such openness. “The outcome of quantum measurements are fully under God’s control, while being apparently random to humans...” Such effects at the quantum level can, in turn, affect the macrolevel without calling on chaos theory or getting “entangled in the problem of quantum chaology.” He defends his view of divine action in light of problems such as quantum randomness, the existence of micro-to-macro effects, the suggestion that such divine action would be episodic, a possible violation of the conservation of energy, and other challenges.

Heller, Michael. “Generalizations: From Quantum Mechanics to God.”

According to Michael Heller, the evolution of concepts is a driving force in science. New concepts inherit much from their predecessors and yet are open to future generalizations. When they produce paradoxes and inconsistencies, a crisis arises which can be called a conceptual revolution or, more properly, an evolution within a conceptual framework. One example of a conceptual evolution is the origin of rational discourse about the world begun in sixth-century BCE Greece. Another is current research in quantum mechanics. A sign of such conceptual evolution is an increasing generalization in which the old concepts are restricted to a smaller domain of validity than they originally enjoyed. Both science and theology can be seen as attempts to catch reality in a net of concepts and theories, attempts which always fail. Still we should not remain silent, at least about God, since it is better to say something even if it is always tentative. All language demands interpretation, as quantum physics clearly shows. The breakdown of language in physics points to the need to generalize; perhaps theology could learn something by analogy from this fact. Thus the goal of this essay is to look at contemporary quantum theory and to derive from it a lesson for theology.

Heller’s starting point is the fact that the main distinguishing feature of quantum mechanics is its noncommutativity; he seeks to show the degree of generalization already present in quantum theory by using the recently discovered noncommutative geometry. It not only clearly shows the generalizing mechanisms underlying the present theory, but it also points towards further possible generalizations. Heller explores the possibility that at its fundamental level, physics is modeled by noncommutative geometry. Quite independently of whether this hypothesis will prove true, he claims that we can learn a lesson from it. Heller analyzes a few concepts, such as causality, probability and chance, which are of great importance for philosophy and theology when they are transferred from their usual context to the environment of the “noncommutative world.” The main characteristic of this world is its a-temporality and a-spatiality. It turns out, for instance, that in this a-temporal world authentic dynamics (albeit in a generalized sense) is possible.
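The noncommutativity Heller identifies as the distinguishing feature of quantum mechanics can be stated in the canonical commutation relation, a standard textbook formula rather than one drawn from the essay itself:

```latex
% Position and momentum operators fail to commute:
[\hat{x}, \hat{p}] \;=\; \hat{x}\hat{p} - \hat{p}\hat{x} \;=\; i\hbar\,\hat{I} \;\neq\; 0
% whereas classical observables always commute: xp - px = 0.
```

A commutative algebra of observables recovers classical physics; the noncommutative algebras Heller invokes generalize geometry itself by taking such nonvanishing commutators as fundamental.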

Heller does not claim that concepts elaborated in noncommutative geometry can be used directly in theology. Instead he tries to draw consequences for theological discourse from the fact that even in physics some concepts undergo such drastic evolution that they distance themselves from our everyday linguistic intuition. He begins with an algebraic formulation of quantum mechanics based on general C*-algebra; this formulation allows one to recover the more limited formulation in terms of Hilbert spaces. C*-algebras that are relevant to quantum mechanics are noncommutative algebras, and it is this noncommutativity which is responsible, according to Heller, for all the peculiarities of quantum theory. Algebraic formulation also leads to the possibility of geometrizing quantum mechanics. The so-called “noncommutative spaces” are totally global in character; no local concept can be given any meaning. This in turn could lead to the unification of quantum mechanics and general relativity. The idea is that fundamental physics is based on a noncommutative geometry that is nonlocal; only at a higher level does the distinction between spatio-temporal geometry and physical dynamical processes arise. Even at this fundamental level, there can be an authentic, though generalized, dynamics. But here the distinction between singular and nonsingular is lost, undermining such ideas as the beginning of the universe and the concept of the individual. Instead, and unlike previous approaches in physics and philosophy, singularities are a part of our macroscopic perspective, but their distinctive character is meaningless at the fundamental level. Equally, nonlocal phenomena, such as those which the EPR experiment points to, are explained within the noncommutative approach.

In his closing sections, Heller shows how important theological concepts, such as causality, are reshaped by the noncommutative framework and its properties of timelessness and nonlocality. Causality becomes a “dynamical nexus” rather than a temporal ordering of cause and effect, a combination of a-temporal and nonlocal behavior that is fertile ground for thinking theologically about God as Creator.

McMullin, Ernan. “Formalism and Ontology in Early Astronomy.”

According to Ernan McMullin, there are several crucial challenges to the relations between quantum mechanics and theology. First, the discussion of both quantum mechanics and theology often relies on a realist interpretation, but some of the most energetic critics of realism are philosophers of quantum mechanics. Second, in dealing with mechanics in general, and with quantum mechanics in particular, the move from the mathematical formalism to its ontological interpretation is highly problematic. The fact that quantum formalism yields two different ontologies (those of Bohr and Bohm) leads to further problems: how can this be so, and how is one to decide between them? Finally, one is faced with the “troubling strangeness” of quantum ontology.

In this essay, McMullin limits his concern to the history of the relation between formalism and ontology in astronomy in order to show that similar difficulties arose throughout this history. First he clarifies terms. Mathematical formalism tells us nothing about the world until it is interpreted in terms of measurable quantities, thus becoming a physical formalism such as quantum mechanics. Both the Copenhagen interpretation and Bohm’s interpretation are then second-level interpretations between the physical formalism and ontology, and it is here that issues arise which divide scientific realists and nonrealists.

With this in place, McMullin turns to the early history of astronomy and finds similar difficulties in moving from formalism to ontology. He starts with Greek astronomy, where the celestial regularities invited explanation in terms of an underlying physical structure. Aristotle, for example, regarded the mathematical formalism of concentric spheres as implying a causal explanation based on the ontology of a single interlocking system of spheres. But Apollonius and Hipparchus showed that a complex mathematical model based on the eccentric and the epicycle could explain the data equally well, and this posed something similar to the problem we find in quantum physics: how can two radically different ontologies explain the same phenomena? Perhaps there is no genuine connection between formalism and ontology; if there is, the criterion of “saving the appearances” is clearly not enough in itself to reveal it.

McMullin next examines the Copernican revolution from geocentrism to heliocentrism. Although both formalisms could equally well save the appearances, Copernicus’s system offered important advantages. It eliminated unwanted elements of Ptolemy’s formalism and it explained phenomena that were mere coincidences in the Ptolemaic system. In addition, Copernicus could specify the order of the planets outwards from the sun, their distances from the Earth, their periods of revolution, and their retrograde motion. All of this disclosed such a degree of harmony that, for Copernicus, it proved the reality of the earth’s motion and pointed to God as its creator. McMullin closes this section by challenging Kuhn’s assessments of the epistemic merits of Copernicus’s system.

Kepler, we are next told, strongly supported Copernicus’s realist arguments for the motion of the earth, both for religious reasons and because Copernicus offered such a convincing explanation of the phenomena. His extensive analysis of Tycho Brahe’s observations of Mars led Kepler to explain its orbit as elliptical and to suggest a physics that could make such an orbit possible. Kepler’s “theory of gravity” was analogous to magnetism and gave the Sun causal primacy in determining planetary motion. But it was only with Newton that we at last have a case “sufficient to warrant reasonable belief” in the heliocentric system. What then of ontology? Certainly Newton’s system employed terms like ‘force’ and ‘attraction’, but with his rejection of action-at-a-distance, the challenge of finding an ontology to match the formalism remained. As McMullin points out, this situation is obviously similar to the current problems in interpreting quantum mechanics.

McMullin closes by drawing a philosophical moral from this history of mechanical systems: moving from a valid formalism to an underlying ontology has always been a “contentious matter. Mechanical agency has all along proved to be an uncommonly elusive quarry.” On the other hand, this history has forced us to expand our intuitive notions of agency and, given the challenges raised by quantum mechanics, it seems clear that further expansion is necessary. He also offers us a theological moral from his historical account: if our explanation of motion at its most basic level is more complicated and mysterious than earlier generations imagined, how much more complex and mysterious must divine agency be, and how hesitant ought we to be in discussing it.

Polkinghorne, John. “Physical Process, Quantum Events, and Divine Agency.”

John Polkinghorne begins by stressing that, although quantum theory has been “spectacularly successful” in its predictive power, we do not fully understand it. The central difficulty is the “measurement problem”: how do determinate macroscopic states (i.e., particular results) come about when a measurement is made on apparently indeterministic quantum states? Viewing this as a “collapse of the wavepacket” only renames the problem, since such a collapse contradicts the dynamical (Schrödinger) equation under which the wavepacket evolves smoothly. Niels Bohr spoke dualistically about classical and quantum worlds that had to intermesh, but this does not really work in principle since there is only a single physical reality in which even the classical apparatus is made of quantum constituents. Polkinghorne also finds unsatisfactory a statistical interpretation of quantum mechanics which refrains from speaking about individual quantum processes, including consistent-histories approaches. He then outlines various groups of proposals for interpreting quantum theory which seem more promising.

The first group starts with quantum theory as it is and attempts to resolve the measurement problem by including “decoherence.” According to the superposition principle, exclusive classical states (e.g., “here” or “there”) are admitted together as a viable physical quantum state (e.g., “here” and “there”). Superposition gives rise to “interference” effects suggesting the wavelike aspect of quantum states. Why, then, don’t we see superposition and interference in our everyday experience? Some have proposed that decoherence, which involves interactions between the quantum process and its radiative environment, helps solve the problem by rapidly minimizing all but one state and by canceling interference effects. The problem is that decoherence does not tell us why any particular outcome, and not one of the other possibilities, actually occurred.
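The interference effects mentioned above can be made concrete with a standard Born-rule calculation (a textbook sketch, not part of Polkinghorne’s essay). For an equal superposition of “here” and “there”:

```latex
\psi = \tfrac{1}{\sqrt{2}}\bigl(\psi_{\text{here}} + \psi_{\text{there}}\bigr), \qquad
|\psi|^{2} = \tfrac{1}{2}|\psi_{\text{here}}|^{2}
           + \tfrac{1}{2}|\psi_{\text{there}}|^{2}
           + \operatorname{Re}\bigl(\psi_{\text{here}}^{*}\,\psi_{\text{there}}\bigr)
```

The cross-term is the interference; decoherence suppresses it through interaction with the environment, which is why it explains the disappearance of interference in everyday experience yet, as Polkinghorne notes, cannot explain why one particular outcome occurs.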

The second group, “hoped-for physics,” believes that the interaction with large systems brings about the collapse of the wavefunction. In Polkinghorne’s opinion, the irreversibility of the behavior of macroscopic systems may provide a clue to how this happens, but it has not done so yet. A third group seeks what Polkinghorne calls “unknown new physics,” where, for example, the amount of matter involved in the interaction determines whether the collapse occurs, or where quantum gravity plays a crucial role. David Bohm’s work represents a fourth approach: “hidden new physics.” For Bohm, there is no collapse of the wavepacket, but Bohm’s approach offers no predictive advantages over conventional quantum physics. According to Polkinghorne, the choice between Bohm and Bohr has to be made on the basis of extra-scientific criteria, including metaphysical principles. Theologians who believe that divine action requires indeterministic physical processes have every right to prefer Bohr’s conventional approach, as long as they do not claim that science alone supports their choice. The final group appeals to “unknown but ‘nearby’ metaphysics” in reaching out to additional factors in nature to solve the problem. These include the role of consciousness of the observer and the many-worlds and many-minds strategies, which accept the formalism of quantum mechanics but actualize all states in the quantum superposition.

All of these proposals seem to assign a special role (or “preferred basis”) to spatial position in their formulation of and solution to the measurement problem. If so, this would imply a change in the way we think about physics, which, since its inception, has treated all dynamical variables equally. They also limit their focus to laboratory measurements, and they may not be extendable to the wider context of natural physical processes. Polkinghorne then suggests that the standard classical account of macroscopic processes may need to be reconsidered. Complex classical systems point to the presence of a “pattern-forming causality of an holistic kind (‘active information’).” Perhaps the equations of classical physics are “downward emergent” approximations of a more complex account of macroscopic physics as well as “upward emergent” from quantum physics. Still, the unresolved complexities of quantum chaology pose a challenge to such an approach. Even the meaning of the term “quantum event” cannot be reduced either to occasional measurements or to the general unfolding of the wavefunction governed by the Schrödinger equation.

Finally, Polkinghorne turns to the theology of divine and human agency. Though such discourse is autonomous in many ways, its metaphysical backing should still take account of science, particularly where quantum mechanics and chaos theory suggest ontological openness. But many unresolved problems beset such attempts. Quantum theory may not be helpful because of the limited and episodic character of measurement events in which indeterminacy seems to hold. Moreover, although some quantum processes, such as gene mutation, may lead to macroscopic consequences, they do not seem to generate a basis for the “flexible actions of agents.” Chaos theory provides a more “flowing character” for agency. Of course, chaos theory is normally framed within a deterministic, Newtonian, context, but it could be given a wider framework. The real problem is how to combine chaos theory with quantum mechanics, and in the process, solve issues in quantum chaology. Polkinghorne believes that the best hope for future progress will lie in an increased understanding of the nature and implications of quantum chaology.

Redhead, Michael. “The Tangled Story of Nonlocality in Quantum Mechanics.”

Michael Redhead’s essay is based on the assumption that nonlocality as instantaneous causal action-at-a-distance is to be avoided since it violates “the spirit of special relativity.” He therefore undertakes a meticulous examination of a variety of proofs of nonlocality in the quantum mechanical treatment of many-particle entangled states, seeking to detect and assess their assumptions.

Redhead starts with the assumption that relativity is more than a phenomenological invariance principle; instead it is grounded in the causal structure of spacetime. Specifically, Redhead claims that relativity entails the Philosophically Grounded Invariance Principle (PIP), which asserts that causal influences cannot operate outside the light-cone. If, alternatively, relativity entailed the First Signal Principle (FSP), it would disallow faster-than-light signals, where “signals” are controllable causal processes. David Bohm’s interpretation of quantum mechanics is both deterministic and, to many scholars, consistent with relativity, since it does not allow superluminal signaling, although it does allow superluminal causal processes. But since Redhead believes that relativity entails PIP, and since PIP is stronger than FSP, Redhead claims that Bohm’s interpretation violates relativity. He also objects to it on theological grounds, since its determinism does not allow “room” for incompatibilist divine action. He therefore turns to indeterministic approaches involving nonlocality in both nonrelativistic and relativistic quantum mechanics.

He begins with John Bell’s analysis of the (nonrelativistic) EPR argument, which distinguished two meanings of nonlocality: a) action-at-a-distance between individual particles, and b) nonseparability, in which at least some properties cannot be attached to individual particles. Bell’s argument, in turn, rested on assumptions involving joint probability distributions and determinism, both of which Redhead explores in detail. He then discusses algebraic proofs of nonlocality that seek to demonstrate that local hidden-variable theories are self-contradictory.

Next, Redhead turns to the search for a relativistic EPR argument. First, he reviews the problems encountered in seeking to translate the nonrelativistic EPR argument into a relativistic context, paying particular attention to the need to reformulate the “reality criterion” (e.g., every element of physical reality must have a counterpart in the theory). A relativistic EPR argument must employ a relativistic wavefunction and must not depend on the existence of absolute time-ordering for space-like events. Redhead describes in detail one proposal for a relativistic reality criterion by Ghirardi and Grassi and its reliance on the truth of certain classes of counterfactual statements. Ghirardi and Grassi’s argument involves a distinction between what Redhead describes as OM-Loc, that the outcome of a measurement cannot be influenced by performing nonlocal measurements, and ER-Loc, that elements of reality cannot be created by performing nonlocal measurements. Ghirardi and Grassi claim to show that relativity and quantum mechanics are in “peaceful coexistence,” but to do so they must also claim that violating ER-Loc is more serious than violating OM-Loc. Redhead disagrees, but offers a further assumption which he calls the “Principle of Local Counterfactual Definiteness (PLCD).” With this he shows that Ghirardi and Grassi’s relativistic reformulation of the EPR argument is less general than they suggest; it is in fact limited to deterministic systems.

In his concluding section Redhead first argues that nonlocality seems unavoidable for any reconstruction of quantum mechanics which is both realist, i.e., in which all observables have sharp values at all times, and deterministic. We either turn to a stochastic hidden-variable framework or seek to understand correlations in terms of what Shimony describes as “passion-at-a-distance.” In the anti-realist option pursued by Ghirardi and Grassi, Redhead challenges the claim that the existence of action-at-a-distance is not a valid deduction from the EPR argument, but he then rescues the claim by the additional assumption of determinism. He regards his results as closing further gaps in the peaceful coexistence argument, but the “mysterious harmony” of quantum correlations remains “spooky” even if it does not involve causal dependence. For the anti-realist, the role of measurement is to actualize potentialities. But when quantum mechanics is applied to cosmology, where there is nothing “outside” the universe to serve as a measuring device, the realist option may be preferred, and with it the notion of nonseparability.

Redhead’s essay thus gives arguments for invoking either indeterminism or holistic nonseparability. The author sees these as having important theological implications: indeterminism is important for theories of divine action on particular occasions, while holism is an anti-reductionist thesis “which shows how every element of the universe has for its ground of being the totality of the whole, which pantheists would want to identify with God.”

Russell, Robert John. “Divine Action and Quantum Mechanics: A Fresh Assessment.”

Robert John Russell develops and extends the thesis that, if one interprets quantum mechanics philosophically as pointing to ontological indeterminism, then one can construct a robust bottom-up, noninterventionist approach to objective, mediated, direct divine action. In this approach, God’s indirect acts at the macroscopic level, understood as both general and special providence, arise in part from God’s direct action at the quantum level. God sustains the deterministic time-development of elementary processes governed by the Schrödinger equation, and God also brings about irreversible interactions that are not described by the Schrödinger equation (i.e., “measurements”) and that can lead to specific macroscopic effects. Thus divine action ultimately results in the regularities of our everyday world, which we attribute to general providence and describe by the laws of physics, and in specific macroscopic events which we view as acts of special providence.

Russell begins with clarifications and comments on methodology. His thesis does not explain how God acts or constitute an argument that God acts, but merely shows the coherence of his theory of divine action with natural science. It is neither an epistemic nor an ontological “God of the gaps” argument. It does not reduce God to a natural cause; instead, God’s action is hidden from science. It does not propose that God alters the wavefunction between measurements or other such views. It opts for a bottom-up approach, since this seems the best way to discuss God’s action during the billions of years from the early universe to the evolution of primitive organisms on Earth; when sentient life is considered, bottom-up and top-down approaches should be combined. Finally, he responds to two questions. First, why should we take quantum mechanics seriously if it will one day be replaced? His response is that our alternative, classical physics, is wrong as a fundamental theory, and its depiction of nature as a closed causal system has already been thoroughly explored theologically. Second, how can we use quantum mechanics theologically if it can be given multiple philosophical interpretations? His response is that every scientific theory is open to multiple interpretations and that this poses a problem for all theological engagements with science. The key is that constructive theology can take a “what if” strategy, exploring the implications of one particular interpretation to its fullest without incurring the foundationalist problems of natural or physico-theology. Here he explicitly works within the Copenhagen indeterminist approach.

Next Russell turns to the measurement problem from the perspective of the Copenhagen interpretation. He distinguishes between the time development of the wavefunction governed by the deterministic Schrödinger equation, and the irreversible interactions between a quantum system and other systems to which the Schrödinger equation does not apply. Such interactions are routinely called “measurements,” but he claims their scope is much wider than usually acknowledged. It includes: micro-macro (e.g., the absorption of a photon by the retina), micro-meso (e.g., the capture of an electron by an interstellar dust particle), and irreversible micro-micro (e.g., proton-proton scattering in the presence of heavy nuclei) interactions, though it does not include reversible micro-micro interactions (e.g., proton-proton scattering in free space). The phrase “the collapse of the wavefunction” is used loosely to suggest “what happens” during a measurement, where the inapplicability of the Schrödinger equation and thus intrinsic unpredictability is taken as pointing to ontological indeterminism. The term “quantum event” can be defined as referring to this collection of ideas related to irreversible interactions.

Turning to theological issues, Russell first argues that quantum statistics (Bose-Einstein and Fermi-Dirac), as well as the Schrödinger time evolution and irreversible interactions, together lead to the classical world we interpret via general providence. He weighs arguments for viewing special divine action as either “ubiquitous” or “episodic” and concludes that “pervasive” is more helpful. He proposes that the spatial and temporal characteristics of the wavefunction and its collapse in an irreversible interaction point to divine action as both global and local. Finally he discusses scientific and theological challenges raised by special relativity, suggesting that we need a richer theological conceptuality of “time and eternity.”

He then discusses four crucial theological issues. First, does God act providentially in all quantum events, or only in some? Russell prefers the first option, though there are advantages and disadvantages to both. Second, if divine action and mind/brain top-down causality are both operative in acts of human volition, how do we avoid what Russell calls “somatic overdetermination”? Russell suggests that God acts in all quantum events until the rise of life and consciousness, after which God limits God’s action, leaving room for top-down, mind/brain causality. Third, why doesn’t God act to minimize suffering, disease, death, and extinction in nature? Russell proposes that we can give a more persuasive response to theodicy if we move from creation theology (and thus providence) to a trinitarian theology of redemption, particularly as developed by Wolfhart Pannenberg. This, in turn, leads to Russell’s fourth issue, which he sees as the crucial challenge to the theology-and-science discussion today: the meaning and intelligibility of the resurrection and eschatology in light of physics and cosmology.

Russell’s essay includes an appendix on philosophical problems in quantum mechanics, including a proposed “architecture of philosophical issues,” a discussion of Bell’s theorem, and a comparison of nonlocality and (in)determinism in Bohm and Bohr’s interpretations of quantum mechanics.

Shimony, Abner. “The Reality of the Quantum World.”

In this paper, Abner Shimony describes two essential concepts in quantum mechanics. The first is the quantum state or wavefunction, which specifies all the quantities of a physical system “to the extent that it is possible to do so.” This caveat is crucial since, according to the Heisenberg uncertainty principle, not all such quantities have simultaneously definite values. The wavefunction does, however, give the probability of each possible outcome of every experiment that can be performed on the system. The second is the superposition principle, according to which new quantum states can be formed by superposing any two allowable states of the system. From these two basic ideas Shimony delineates three crucial features which distinguish quantum physics from our ordinary experience of the world: objective indefiniteness, objective chance, and objective probability. Thus quantum quantities before measurement are objectively indefinite, their definite but unpredictable value after measurement implies objective chance, and the probability of finding that value after measurement is objective. Shimony then describes in detail a fourth counterintuitive property: nonlocality. Here two particles which once formed a single system and have been widely separated show an uncanny correlation between their properties, challenging the relativistic concept of locality (i.e., that effects cannot propagate faster than light).
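The two concepts Shimony begins with can be written compactly in standard notation (textbook formulas, not quotations from the essay). Superposition with Born-rule probabilities, and the Heisenberg uncertainty relation:

```latex
% Superposing two allowable states; |alpha|^2 and |beta|^2 are outcome probabilities:
\lvert \psi \rangle = \alpha \lvert a \rangle + \beta \lvert b \rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1
% Position and momentum cannot both be sharply defined:
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

The first line captures why quantum quantities are objectively indefinite before measurement yet objectively probable; the second expresses why not all quantities can have simultaneously definite values.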

How are we to handle these remarkable features? One way would be to reject the premise that the wavefunction gives a complete specification of the quantum state. Instead, there might be as-yet unknown, or “hidden,” variables at work that explain these strange features. In 1964, however, John S. Bell proved that the predictions of local hidden-variables models are incompatible with the predictions of quantum mechanics. Crucial experiments, such as those by Clauser and later by Aspect (and proposed in part by Shimony), vindicated quantum mechanics at the expense of local hidden-variables theories. Does this mean that quantum mechanics involves unacceptable kinds of nonlocal action-at-a-distance? Not according to Shimony, who points out that quantum correlations between separated parts of a system do not allow one to send information faster than light. Thus Shimony suggests that we think in terms of “passion-at-a- distance” instead of instantaneous action-at-a-distance.
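Bell’s result, summarized above, is usually expressed through the CHSH correlation inequality, named for Clauser, Horne, Shimony, and Holt (a standard form, not spelled out in this summary):

```latex
% Any local hidden-variable model, for detector settings a, a', b, b', obeys:
S \;=\; \bigl|\, E(a,b) - E(a,b') + E(a',b) + E(a',b') \,\bigr| \;\leq\; 2
% Quantum mechanics predicts, for suitable settings on an entangled pair:
S \;=\; 2\sqrt{2} \;\approx\; 2.83
```

The experiments of Clauser and Aspect measured correlations exceeding the bound of 2, which is why they vindicated quantum mechanics at the expense of local hidden-variable theories.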

Shimony then describes other experiments which reveal further elements of quantum strangeness. Delayed-choice experiments underscore the difficulties in interpreting how and when quantum properties become definite in the experimental context. Schrödinger-cat type experiments raise the possibility of quantum indefiniteness in the macroscopic world. Here something like an “irreversible act of amplification” is involved, but for Shimony, we may need to discover new physical principles if a full account is to be achieved. Finally, neutron interferometry and the Aharonov-Bohm effect underscore additional highly nonclassical features of the quantum world. In sum, these features of quantum systems raise profound philosophical issues for our understanding of the physical world.

Stoeger, William R. “Epistemological and Ontological Issues Arising from Quantum Theory.”

As a prelude to the problem of divine action and quantum physics, William R. Stoeger explores the epistemological and ontological implications of quantum physics. Clearly a discussion of divine action in nature requires our confidence that scientific theories actually represent processes and features of the world that are, in some ways, independent of how we know them. Using these theories, we may then be able to constrain our description of divine action in important ways. But to gain confidence in our theories, we must first sift through their interpretations, assessing them in terms of their adequacy and fruitfulness; only those that survive this process will warrant further consideration. What then constitutes a “canon of adequacy” for our assessment?

Stoeger responds first by noting that there are several levels of interpretation involved: i) basic interpretation at the level of the physical theory itself (e.g., the probabilistic interpretation of the square of the wavefunction); ii) consistency and coherence within the theory, and iii) with other physical theories (e.g., with special relativity); and iv) epistemological and ontological interpretations by which quantum theory may give us knowledge of the underlying reality. Clearly levels (i)–(iii) constrain level (iv) without determining it completely. Using levels (iii) and (iv), Stoeger argues that we can exclude both hidden-variable and other strongly deterministic interpretations of quantum theory. Next, he proposes a “principle of parsimony” in which we minimize our assumptions about what reality is like, allowing the results of quantum physics to “speak for themselves” even if the result is counterintuitive and puzzling. That our interactions with the quantum level are “recalcitrant and resistant” suggests that we are dealing with aspects of the world independent of our measurement of it. We may not have any direct knowledge of the underlying states which produce the phenomena we measure, but our experimental and theoretical knowledge places significant constraints on the properties these underlying entities can have. This assumption is warranted since the models we construct successfully predict and explain other phenomena. Stoeger relies on Ernan McMullin’s emphasis on retroduction to support these points. Still, he acknowledges that there may be many significant features of quantum reality that completely escape our detection. Some of these may never be knowable even in principle, while “reality for us” may have features that are not functions of the actual underlying features of the world.
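Stoeger’s level (i) example, the probabilistic interpretation of the square of the wavefunction, is what physicists call the Born rule; in its standard form (again, the conventional statement rather than one drawn from the paper):

```latex
% Born rule: the squared modulus of the wavefunction gives the probability
% density for finding the particle at position x
P(x) = |\psi(x)|^2,
\qquad \int_{-\infty}^{\infty} |\psi(x)|^2 \, dx = 1
```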
Stoeger’s essential metaphysical presuppositions, then, are the principle of sufficient reason (what we observe in some way points to an underlying cause) and the principle of relationality (the reality with which we interact is a part of a network of relations with processes and objects at other levels of the world).

Stoeger then briefly describes several key features of quantum physics, including nonseparability, quantization, objective uncertainty, complementarity, objective chance, correspondence, entanglement, measurement, and decoherence. Returning to his criteria for choosing an interpretation, Stoeger notes that “the family of Copenhagen-like interpretations” (including the consistent-histories approach) involves most of these features and is “by far the most satisfactory” interpretation, compared with hidden-variable and many-worlds interpretations. Thus our indirect knowledge of reality is “weakly objective”: an independent reality exists and is manifest to us through our interactions with it, but we cannot assess our knowledge of it from these observations. Regarding the question of the epistemic and ontological status of the laws of nature, Stoeger sees these laws as but imperfect and incomplete descriptions of those that obtain in nature. Moreover, these laws are descriptive, not prescriptive.

Several implications for divine action follow from this. God’s universal creative action is realized behind the “veil” of natural laws, and it appears in the form of these laws. Isolated cases may seem to violate these laws, and God’s action may occur at the level of consciousness and personal relationships. Special divine action may involve top-down influences on matter and thus transcend science. Divine action, as acts of love and care, may be taken to be interventions only if we assume that the laws hold absolutely under all circumstances. The key problem, particularly compared with that of human agency, is our lack of understanding of how an immaterial God can act on the material world.

Tracy, Thomas F. “Creation, Providence, and Quantum Chance.”

Uncertainty regarding the meaning of “the acts of God” pervades modern theology, according to Thomas Tracy. Critical historical and literary techniques have deepened the problem of interpreting biblical texts and the connection they make between story and history, while the natural sciences have changed the intellectual context of interpretation by offering an account of nature without appeal to transcendent causes. Scientific methods do not rule out divine action, and scientific findings are not inconsistent with it. Ironically, however, theologians from deists to liberals such as Schleiermacher, Bultmann, and Kaufman have worked with a closed causal picture of the world that they feel is authorized by science. They have taken this to be incompatible with divine action in the world, leaving either a God who only sets the world’s initial conditions or a God whose actions violate the laws of nature. But contemporary natural science does not necessarily lead to a deterministic metaphysics. Tracy cites two possible responses. First, a theologically sufficient account of God’s particular actions in history might actually be developed that still limits God to being the creator of history as a whole. Second, God can be said to act in particular cases without intervention in history if one can defend an indeterministic interpretation of natural causes. It is here that quantum physics might be relevant.

Though Tracy’s focus is on the second response, he starts with an extended treatment of the first one since he does not want to underestimate its resources and since he explicitly assumes it as background for the second response. Here God’s fundamental action is the free intentional act of creating the world, which continuously gives being to the created world in its entirety but which cannot be understood by analogy with human agency. Moreover, God gives to created things active and passive causal powers, so that God’s action is direct in causing their existence, but indirect in acting through them and their powers to produce results in the world. Thus even though God acts uniformly in all events, we can affirm God’s objectively special action in two ways: particular events may reveal God’s overall purposes, and they may play a special causal role in shaping history. It is interesting to note that, in identifying this second way, Tracy is making an important addition to the typology developed in previous CTNS/VO publications and republished above, where only the first way, called “subjectively special action,” was discussed.

If, however, the structures of nature are on some level(s) indeterministic, God can act to determine the outcomes of natural processes without disrupting their intrinsic causal properties. Here God could be thought of as acting in all such chance events or in just some of them, though the latter generates conceptual puzzles. Moreover, the extent of ontological chance in nature will influence the extent of God’s action in nature. Indeterminism also plays a role in “incompatibilist” accounts of free human action. Here again God could be thought of as acting in all human acts, as John Calvin and Aquinas seemed to imply, or as empowering people to make their own choices. Both options raise further issues, including the problem of evil and the ultimate redemption of the world. Indeed, faith in God’s redemptive action in history provides “a compelling theological reason” to argue for a noninterventionist account of divine action and thus an indeterministic interpretation of nature.

A number of challenges, however, face any attempt to use quantum physics for such an account. First, quantum physics can be interpreted in a variety of ways, including the Copenhagen interpretation, Bohmian nonlocal hidden-variable determinism, many-worlds determinism, and so on. While it is legitimate, even unavoidable, to prefer one of these on theological grounds, we should stress that others are available and that their theological use in each case is tentative and provisional. A second challenge is the measurement problem found in some of these interpretations. Does this overly limit the occasions of divine action, or is “measurement” more universal in nature than some interpreters suggest? And how do the worlds of quantum processes and observable objects relate? A third challenge is to show that indeterministic transitions associated with measurement can produce a difference in the course of the everyday world. Laboratory equipment, of course, involves precisely this sort of “amplification,” but so do natural processes, such as vision and genetic mutation. In conclusion, Tracy stresses the primary importance of God’s creating and sustaining the world, and within this, God’s indirect action through created causes and, possibly, God’s direct noninterventionist action at points of underdetermination in natural processes.