
Friday, April 25, 2008

Latest news and views from The Mindful Hack ...

I must write an index next week, which may (= will) limit posting. I posted ten items today, and link them here for your convenience:

Things we know but cannot prove: Another nail in the coffin of materialism.

The fours be with you! (You will be "fours"ed to cooperate with this words/numbers game. Hey, it's Friday night!)

Altruism: Why it can't really exist but why it does anyway

Evolutionary psychology: "Eliot Spitzer is a kludgebrain!" psychologist opines (but so are we all)

Mind and medicine: The placebo effect - Did your doctor just prescribe you a quarter teaspoon of coloured sugar? Maybe ...

Materialism: When the store is on fire, hold a fire sale:
Excerpt: So this is the latest pseudo-explanation of the soul? I could do better myself! How about this: Minds that are accustomed to think in terms of a future have difficulty grasping the idea that there is no future after death.

Way simpler, to be sure, but materialists wouldn't buy it because I forgot to drag in the Paleolithic cave guys telling stories around the fireside - the staple of evolutionary psychology.

Fitna: A thoughtful Muslim's response - the predicted riots largely didn't happen, but where to go from here?
Excerpt: And while we are here: Dial-a-mob/rent-a-riot behaviour is NOT copyright to Middle Eastern Muslims. I ran into the same thing among the American Ivy League elite in May 2005, when the New York Times bungled a story I broke on my other blog, The Post-Darwinist, claiming that a film about to be shown at the Smithsonian was "anti-evolution." It wasn't; it did not even address the subject. But zillions of Darwinbots, as I called them, behaved exactly as if it had. It's a good thing that no one gives them sharp objects to play with.

Rupert Sheldrake's guide to New Atheism (which makes it sound like New Coke, really)

Can a transplanted heart lead to transplanted thoughts? Well, maybe, but the mechanism might be fairly conventional.

Why science without God destroys itself: Because the alternative idea of a multiverse is a step into magic, that's why

By the way, regarding the fact that I no longer accept or display comments, a word of explanation, originally published at the Post-Darwinist (Sunday, April 20):

Some readers have written to ask why I disabled the Combox. Two reasons:

1. Time pressure. Interest in both the intelligent design controversy and the mind vs. materialism controversy has exploded. I cannot manage all my current desks nor train people fast enough without cutting back on all non-essential activities.

2. Canada's current "human rights" regime has created significant difficulties for bloggers with comboxes for which they are responsible. I am one of the people campaigning for reform, but I don't know when or whether reform will happen. If you want to know more about that kind of thing, go here or here.

If reform happens and I have the time, I will reenable the combox.

Since then one person has written to me, disputing that bloggers like me face serious dangers in Canada. I wish that person were correct, but he is not. I can only recommend checking out the Web sites linked above, and also "global content provider" Mark Steyn's recent column in historic Maclean's Magazine (Steyn himself has been charged and so has Maclean's.)

Things we know but cannot prove - another nail in the coffin of materialism

Recently, I highlighted a talk that Prof. Robert Marks, distinguished professor of electrical and computer engineering at Baylor University, gave to Baylor's American Scientific Affiliation branch on things we know but cannot prove (and it doesn't matter how big computers get):
This is because of a new startling mathematical idea from algorithmic information theory (AIT): There exist things that are true that cannot be derived from fundamental principles. Some things are true simply because they are true.

Many claim God cannot be proved. (Although I'll show you Gödel's short mathematical proof of God's existence). There are some things we know exist that we can prove we will never know.
I asked Marks how it went, and he wrote back to say,
This is mind-bending stuff. Stephen Hawking, for example, is becoming agnostic in his belief there is a single theory that describes all of physics. There look to be things that are true simply because they are true. They cannot be derived from first principles. And there exist things, like Chaitin’s astonishing number between zero and one, that we can prove we will never know. The foundations of algorithmic information theory have been around since the 1930s, but scientists and mathematicians are only recently appreciating its significance.

Algorithmic information theory and string theory make the science fiction I read as a boy seem boring.

I watched Marks's PowerPoint online and highly recommend it.
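A side note for readers curious about the "astonishing number" Marks mentions: Chaitin's Ω is the halting probability of a prefix-free universal computer. Here is a minimal LaTeX sketch of the standard definition (the subscript U simply names whichever universal machine is chosen, a detail Marks's summary does not go into):

```latex
% Chaitin's halting probability: the sum, over every program p that halts
% when run on the prefix-free universal machine U, of 2^(-length of p).
% Kraft's inequality keeps the total between zero and one.
\Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}
```

Ω is a perfectly well-defined real number, yet it is not computable, and any fixed formal system can pin down only finitely many of its bits - which is the sense in which, as Marks says, we can prove we will never know it.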

One thing Marks relates is that computer pioneer Alan Turing lost his faith and became assertively materialist after a boy he loved died of bovine tuberculosis. It wasn't anything he had learned about computers after all. It was - as so often - a personal issue.

Some of his observations (but they're much less fun without the graphics):
"There are things that are true and known to exist that will never be proven nor computed."

"We are at an undisputed edge of naturalism in computing and math. There is no TOE. Does science have a TOE? If so, will we ever know we are at the edge?"

Bob, some people won't know that it even is an edge, because they are reductionist materialists. They will just assume that their brains are cobbled together kludges from the evolutionary past. Possibly, they hope they can get a microchip implanted in their brains that will either cause them to understand or at least create the illusion of understanding. (See the story about Gary Marcus and the kluges, below.)

By the way, Marks starts his presentation with an interesting arithmetic word game. Try The Four's Be With You! with colleagues on your coffee break.


The fours be with you! ... and double cream, half sugar, please

Here's the number word game called The Four's Be With You, from Prof. Bob Marks's presentation on things computers will never do:

Spell a number. (Say, t-w-o.)

Count the letters. (3)

Spell that number. (t-h-r-e-e)

Count the letters of that number. (5)

Spell that number and count its letters. (f-i-v-e = 4)

Prof. Marks says you will always end at four (four letters).

Huh? I tried it a few more times:

Twenty 6
six 3
three 5
five 4

thirty-six 9 [I suppose I should not count hyphens]
nine 4

Okay, I am trying this one more time. I have work to do:

one hundred thirty eight [I suppose I should leave out the "and"] 21
twenty one 9
nine 4

This seems like a good icebreaker, while we wait for late arrivals at a meeting. But you have to wonder about people who figure this kind of stuff out. Like what weren't they doing? (Stocking the shelves? Mending fence? Answering the phone at the Complaints desk? Might explain a lot ... )
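
If you want to check the claim mechanically, here is a minimal Python sketch of the game (the number speller is my own quick hack for 0 to 999, not anything from Marks's presentation). The reason every chain lands on four: "four" is the only English number word whose letter count equals its value, and from five upward the spelling is always shorter than the number itself, so the counts keep shrinking until they hit that fixed point (one, two, and three bounce up briefly, then fall in too).

```python
# A quick sketch of "The Four's Be With You": spell a number, count the
# letters (ignoring spaces and hyphens), and repeat until the count stops
# changing. Hand-rolled speller covers 0-999, with no "and", as in the post.

UNITS = ["zero", "one", "two", "three", "four", "five", "six", "seven",
         "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
         "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
TENS = ["", "", "twenty", "thirty", "forty", "fifty", "sixty", "seventy",
        "eighty", "ninety"]

def spell(n):
    """Spell an integer from 0 to 999 in English."""
    if n < 20:
        return UNITS[n]
    if n < 100:
        return TENS[n // 10] + ("-" + UNITS[n % 10] if n % 10 else "")
    return UNITS[n // 100] + " hundred" + (" " + spell(n % 100) if n % 100 else "")

def letter_count(words):
    """Count letters only; spaces and hyphens don't count."""
    return sum(ch.isalpha() for ch in words)

def fours_chain(n):
    """Spell, count, and repeat until the chain settles at 4."""
    chain = [n]
    while n != 4:
        n = letter_count(spell(n))
        chain.append(n)
    return chain

if __name__ == "__main__":
    for start in (2, 20, 36, 138):
        print(fours_chain(start))   # e.g. [36, 9, 4]
```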

Altruism: Why it can't really exist but why it does anyway

"Charity, do-gooding, philanthropy it's all just selfishness masquerading as virtue," notes Jim Holt in the New York Times Magazine (March 9, 2008), citing what he calls the cynical view:
In modern times, the theory that each of us, despite occasional appearances of self-sacrificial nobility, is ultimately and invariably looking out for No. 1 got a big boost from Darwin's theory of evolution. By the logic of natural selection, any tendency to act selflessly ought to be snuffed out in the struggle to survive and propagate. So if someone seems to be behaving as an altruist - say, by giving away a fortune to relieve the sufferings of others - that person is really following the selfish dictates of his own genes. The evolutionary psychologist Randolph Nesse confessed that he slept badly for many nights after absorbing this supposed discovery, which he called "one of the most disturbing in the history of science."

"Supposed" discovery hits the notion square on the head - and, I hope, puts it out of its misery.

The view Holt describes is not, strictly speaking, a "cynical" view. Evolutionary psychology has, as a central project, the idea of showing that all human behaviour is analogous to the behaviour of primate apes. That is the reason - indeed the only reason - why genuine altruism is considered a problem. Absent that project, genuine altruism would be regarded as merely one facet of human behaviour, and could be interpreted in a variety of ways. But it would not need to be explained away.

Anyway, Holt reviews the supposed reasons that people "really" help others, but finds them unconvincing. He cites an interesting experiment at the University of Oregon:
Nineteen students were given $100 each and told that they could anonymously donate a portion of this money to charity. The students who, on average, donated the most showed heightened activity in the pleasure centers of their brain as they gave up the money. Their generosity was accompanied by a neural "warm glow."

[ ... ]

But can an objective reason, by itself, motivate selfless generosity? In the Oregon brain-scanning experiment, curiously enough, two of the students who were the most liberal in their charitable giving were "outliers" who seemed to get no neural reward for their generosity. They did not benefit from the warm-glow effect. Yet they were outstandingly altruistic anyway.

Yes, because some people are just like that. It may be that no apes are like that. But so?


Evolutionary psychology: "Eliot Spitzer is a kludgebrain!" psychologist opines (but so are we all)

Some idea of what awaits if we read psychologist Gary Marcus's Kluge: The Haphazard Construction of the Human Mind (Houghton Mifflin) can be divined from his comments at the Huffington Post on the recent resignation of New York governor Eliot Spitzer, caught in a prostitution scandal:
This is not just a case of a man being led about by his hormones, to the exclusion of the rest of his brain, but something more complicated: a case in which an extraordinarily intelligent man used all of his rational capacities to form a track-covering plan -- yet seemingly focused none of his cognitive wherewithal on evaluating whether that plan was worth pursuing in the first place.

[ ... ]

Why does this happen so often? The answer, in a nutshell, is this: evolution blew it. When our fancy new deliberative reasoning systems evolved, evolution, which lacks foresight, took what amounts to the lazy way out, crudely grafting the new capabilities onto the older ancestral systems, with nary a thought as to how the two would work together. The ancestral mate seeking systems that led Client 9 [Spitzer] by the nose thus still receive extremely high priority, whether or not their actions are in the interests of our minds as a whole.

I've heard many unconvincing explanations of the age-old conflict between what we want to do and what we ought to do, but this is so far the least convincing.

Any convincing explanation must take into account Spitzer's reputation as a ruthless foe of corruption. A reputation often becomes a sort of shell - lots of hollow space inside. Evolution didn't fail Spitzer; he just found his shell too heavy after a while. It's better to walk humbly ... away from trouble (but Spitzer doesn't need anyone to tell him that now).

Gary Marcus also asks in "Total Recall" in the New York Times Magazine (April 13, 2008) how much we would pay to have a memory chip implanted in our brains to double our short term memory. But, in his view,
... techniques like that can only take us so far. They can make memories more accessible but not necessarily more reliable, and the improvements are most likely to be only incremental. Making our memories both more accessible and more reliable would require something else, perhaps a system modeled on Google, which combines cue-driven promptings similar to human memory with the location-addressability of computers.

However difficult the practicalities, there's no reason in principle why a future generation of neural prostheticists couldn't pick up where nature left off, incorporating Google-like master maps into neural implants. This in turn would allow us to search our own memories - not just those on the Web - with something like the efficiency and reliability of a computer search engine.
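
To make the quoted idea a little more concrete, here is a toy Python sketch - entirely my own illustration, not anything Marcus specifies - of "cue-driven prompting plus location-addressability": each memory sits at a fixed address, and a separate index maps cue words to every address that mentions them, so a cue retrieves all matching memories rather than only the most context-similar one.

```python
# Toy "Google-like" memory index: memories live at fixed addresses
# (location-addressability) and an index maps cue words to the addresses of
# every memory containing them (cue-driven prompting). A sketch of the idea
# only, not a model of the brain or of any real proposal.
from collections import defaultdict

class IndexedMemory:
    def __init__(self):
        self.store = {}                  # address -> memory text
        self.index = defaultdict(set)    # cue word -> addresses
        self.next_address = 0

    def remember(self, text):
        """Store a memory at a fresh address and index every word in it."""
        address = self.next_address
        self.next_address += 1
        self.store[address] = text
        for word in text.lower().split():
            self.index[word].add(address)
        return address

    def recall(self, cue):
        """Return every stored memory containing the cue word."""
        return [self.store[a] for a in sorted(self.index[cue.lower()])]

if __name__ == "__main__":
    m = IndexedMemory()
    m.remember("spouse washed the dishes on Tuesday")
    m.remember("spouse forgot the dishes on Friday")
    print(m.recall("dishes"))   # both records come back, hit and miss alike
```

Unlike biological recall, the lookup here is not biased by the current context: every record that matches the cue comes back.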

The Next Big Thing will probably be a project for erasing the memories of things we would rather forget, so we no longer recognize our connection to our real past.


Mind and medicine: Did your doctor just prescribe you a quarter teaspoon of coloured sugar?

One of the most misunderstood functions of our minds is their role in organizing our bodies' efforts to overcome illness. In The Spiritual Brain: A neuroscientist's case for the existence of the soul, Montreal neuroscientist Mario Beauregard and I looked in some detail at the medical and neuroscience evidence for the placebo effect - the effect of simply believing that you have received a powerful medication (whether or not you have).

Drug studies show that many people get better just because they have received a sugar pill that they are told is a powerful new drug. (It is an authentic example of your mind acting on your brain and body.)

Recently, Time Magazine ran a short article by Laura Blue asking "Is Your Doctor Prescribing Placebos?" And guess what - many are. Almost half of physicians surveyed in a recent study admitted as much:
Among the doctors who prescribed them, one in five said they outright lied to patients by claiming a placebo was medication. But more commonly, the physicians came up with creative ways to explain, saying the substance might help but wouldn't hurt, or that "this may help you but I'm not sure how it works."

A lot of unnecessary angst is generated around the ethics of using placebos, some of it captured in Blue's article. Some say it is wrong for the doctor to deceive the patient, but the underlying problem is that neither doctor nor patient readily accepts the role the patient's mind plays in kickstarting the healing process. Sometimes, the kickstart requires the "oval indigo pill just released by MegaPharma's top doctors ... " (with the same chemical content as the sugar bowl at home).

Maybe the oval indigo pill drama wouldn't be necessary if we accepted the role our minds play in our health?


Materialism: When the store is on fire ... hold a fire sale!

Prestigious science journal Nature, founded in the 19th century by Darwin's materialist associates, seems to want to go down fighting. In a recent Futures feature in the April 9, 2008 edition, Nature writer Neale Morrison offers "All Over, Rover," a science fiction scenario in which materialists prove that there is no soul. (Citation: Nature 452, 780 (10 April 2008) | doi:10.1038/452780a)

And just when it is so obviously not happening.

Similarly, in "Biased brains, messy memories," Sandra Aamodt reviews Kluge: The Haphazard Construction of the Human Mind by Gary Marcus (Houghton Mifflin) and A Portrait of the Brain by Adam Zeman (Yale), both asserting materialist theories as if there wwere any reason to believe t hey are correct.

Marcus (Kluge), for example, concludes,
that evolution has left us with something of a mess. In an argument reminiscent of David Linden's The Accidental Mind, Marcus makes his case by describing cognitive difficulties, including false beliefs, linguistic ambiguity, impulsiveness and mental illness.
The blame, he asserts, rests with our imperfect memory, "arguably the mind's original sin". Perhaps we would reason more effectively if the brain could store and retrieve data as accurately and as simply as a computer. Instead we must contend with a limited system. Brains locate memories by matching them to the current context rather than having unbiased access to all of our experiences. This contextual dependence makes it hard during an argument, for example, to recall how often our spouse does the housework, because thinking of one failure inclines our brains to remember similar situations rather than contrary examples.

You mean, self-interest has nothing to do with such lapses? In which case, Dawkins's famed selfish gene (another theory that was supposed to explain everything) must be sleeping on the job.

Similarly, Adam Zeman outlines
how brains that are predisposed to tell stories and that attribute actions to agents rather than chance might lead us to believe in an immortal soul. His own view is that this is "no more than a wonderful fiction". (Marcus makes the same point less gently.) Zeman struggles with science's failure to find an emotionally satisfying replacement story, conceding that such questions may be more in the realm of art than science.

So this is the latest pseudo-explanation of the soul? I could do better myself! How about this: Minds that are accustomed to think in terms of a future have difficulty grasping the idea that there is no future after death.

Way simpler, to be sure, but materialists wouldn't buy it because I forgot to drag in the Paleolithic cave guys telling stories around the fireside - the staple of evolutionary psychology.

Sorry guys, Cave Thug just didn't fit my screenplay. Anyway, I don't believe the materialist theory because I think our minds' intuition is correct. Mario Beauregard and I set out reasons for our view in The Spiritual Brain: A neuroscientist's case for the existence of the soul.


Fitna: Thoughtful Muslim reacts to the challenges

Recently, my Muslim journalist friend Mustafa Akyol (Turkey) saw Fitna, which addresses the worldview of Islamic political extremists. The film is causing angst in the world's nanny states about possible outrage among dial-a-mobs, but I am glad to say that rioting has largely fizzled.

I really appreciated Mustafa's thoughtful comments in Turkish Daily News, of which I have excerpted a few. Arguing against legal action over the film - or overreaction generally - he says,
The film actually does not lie or cheat. Such violent or angry Muslims do exist, and so do the belligerent passages in the Koran. What the film does is to cherry-pick them. There are also many messages of tolerance, compassion, and peace in the Koran. Using the same method of purposeful selection, one could also make a movie titled “Islamic Agape,” which would include the scenes of smiling Muslims and benevolent verses.

Moreover, one can use “Fitna”s selective method to propagate against most other religions – such as, say, Judaism. Actually if you focus on the radical groups among the Jewish settlers in Israel, you can find a very similar language of hatred, and even acts of terrorism such as the mosque massacre perpetrated by Baruch Goldstein in Hebron in 1994. It is also remarkable that such fringe Jewish fundamentalists, like the followers of the late radical Rabbi Meir David Kahane, use passages from the Hebrew Bible in order to justify, and even amplify, their fervor.

Actually certain parts of the Old Testament, and most notably the Book of Joshua, would overshadow any sura (chapter) of the Koran in terms of militancy. But the overwhelming majority of the world’s Jews know that the Book of Joshua, which tells the war of the Israelites against the pagan Canaanites, is a historical record which does not address today’s realities. Similarly, when they read Koran’s chapters about Prophet Muhammad’s war with pagan Arabs, most Muslims regard them as historical anecdotes. But a worrying number of Muslims, such as the ones that “Fitna” has captured, think differently.

Why are worrying numbers of Muslims listening to radicals?
What makes them believe in a scripture-driven militancy is the same thing that influences radical Jewish settlers: They are in a sociopolitical context which radicalizes them. They believe that their values, identities and very lives of their children are in danger – and they conclude they are fighting the same existential war that Joshua or Muhammad fought centuries ago.

Mustafa thinks that stabilization and modernization of Muslim societies will discredit Muslim radicals. Historically, these forces have usually worked that way. For example, Karl Marx was convinced that British workers would be the first to embrace his communism. In fact, they never did. Britain was both a stable enough country that dialogue was possible and a modern enough country that life was improving for most people anyway. So workers settled for unions and shorter working hours (and pubs, telly, and National Health, of course). Communism was forced on populations that couldn't really choose, in the aftermath of World War II - and thrown off almost worldwide in 1989.

People who think the world is in turmoil now are usually not old enough to remember the 1940s. (I don't remember, but I did grow up in the shadow of World War II, and heard the stories, far into the night, of people who did remember pretty clearly.)

As a traditional Catholic Christian, I identify with Mustafa's approach. For example, I often hear attacks, insults, and put-downs of the Catholic Church - which happens to be the oldest and possibly the largest voluntary association on Earth. So at any given time, detractors always find something to trash somewhere. On the off chance that they should come up short in the present, they can always look to the wealth of past misdeeds or prophesy our future doom. And so? So what exactly?

And while we are here: Dial-a-mob/rent-a-riot behaviour is NOT copyright to Middle Eastern Muslims. I ran into the same thing among the American Ivy League elite in May 2005, when the New York Times bungled a story I broke on my other blog, The Post-Darwinist, claiming that a film about to be shown at the Smithsonian was "anti-evolution." It wasn't; it did not even address the subject. But zillions of Darwinbots, as I called them, behaved exactly as if it had. It's a good thing that no one gives them sharp objects to play with.


Investigating atheism? Start here

Rupert Sheldrake, whose books I have just ordered from the library, has put up a site called Investigating Atheism. He seems to be even-handed, though I gather he thinks that the "New Atheism" is as much a crock as I do (it is the "New Coke" of atheism):
the 'new atheists' have had a mixed reception, not only among the religious (as is to be expected) but also among fellow atheists and agnostics, who have often accused them of oversimplifying the issues.

The purpose of this site is to set these contemporary 'God Wars' in their historical context, and to offer a range of perspectives (from all sides) on the chief issues raised by the 'new atheists'. We hope this will encourage more informed opinion about the issues, discourage oversimplification of the debate, and deepen the interest in the subject.

Sheldrake provides many useful references and links. Note especially the New Atheist goals, where we learn, for example, that

Belief in God and evolution are not compatible.

Religion tends to subvert science.

Atheism is not discredited by the 'atheist tyrannies' of Hitler and Stalin.

Religious education of children is 'child abuse'.

Hardly a recipe for social peace.


Can a transplanted heart lead to transplanted thoughts?

A friend wrote to ask about an interesting problem: whether our physical hearts can influence our thoughts. He had in mind an article in a not-quite-reliable source that claimed that a heart transplant patient's thinking had changed to resemble that of her heart donor.

Of course, I immediately thought of an old movie whose name I can't remember (if you can, please e-mail me at oleary@sympatico.ca). A wealthy, tiresome, and desperately selfish woman discovers that she is going blind. Her doctor arranges a corneal transplant, telling her that the donor was a much-loved priest. The operation is a success, but much more remarkable is her transformation into a caring, sensitive soul. "She sees the world with new eyes!" everyone says.

Later, the doctor reveals the truth: In reality, her new corneas came from a condemned murderer. Her transformation, however, came from within, because she believed it possible.

But then, of course, there's The Eye (2002) in which a corneal transplant recipient whose donor was a murder victim suffers increasingly evil visions ... plus a host of schlocksters that would be urban legends if they hadn't hit the screen first. (Now you have to pay to see the tales instead of hearing them free around the coffee urn.)

Anyway, I wrote back to my friend, saying,
The question I would like to know the answer to is this: Did the transplant patient change in order to reflect the actual donor or what she was told about the donor?

I suppose that most doctors put a good "face" on the donor, to avoid the proposed recipient turning the organ down. The donor might be described as "a devoted father of four and a loyal team member" rather than "a cocaine addict shot during a gunfight over a bad drug deal." The thing to see here is that these descriptions might apply to the same person, seen from different angles - the funeral oration vs. the police record, for example.

Also, heart transplantation is very major surgery, in which the patient is technically dead (except for medical equipment) during the procedure. Afterward, most patients must lead a more cautious and restricted life - or die despite the transplant. So the effect of the ordeal and its aftermath on the patient must also be factored in.

My friend wrote back to say that in the case recounted, the recipient knew nothing about her donor and had only investigated the question after feeling that she had changed her thinking. To which I replied, yes, but perhaps people speak "nothing but good" of the dead in these cases ...

Anyway, I'm not a hyperskeptic. I just think that these factors must be considered. That is what makes responsible social science such a tricky business.

As Mario and I discovered while working on The Spiritual Brain, this kind of what-are-we-really-measuring question came up when doctors were assessing life changes following near death experiences (where one experiences oneself as dead and looks back on one's life).

Some researchers argued that the mere fact of having a heart attack would cause patients to become more concerned about relationships and less concerned about success. However, investigation showed that that was not the case. Near death experiences were far more commonly associated with a change in attitude to life than mere close encounters with death were.


Why science without God destroys itself

Recently, a commenter asked me if I would post my recent ChristianWeek column, "Why science without God destroys itself," and here it is (with a couple of links):

Why science without God destroys itself

by Denyse O’Leary

In recent columns, I have looked at the big promotion campaign for materialist atheism, based in part on the claim that atheism is supported by science. The claim is without substance, so far as I can see. On the contrary, the fine tuning of the universe in which we live testifies so strongly to a divine Mind that great scientists who conspicuously lacked orthodox piety have been compelled to admit it. Yes, many deplore the obvious inference, and try to find a genuine loophole (Arthur Eddington’s term). But admit the problem they must, for they can hardly deny it.

Besides, as University of Waterloo chair of physics Robb Mann recently pointed out to my adult night school class at the University of Toronto, discoveries about the nature of our universe made during the last few decades have only confirmed and strengthened the case for the Creator. Not only that, but the most promising candidate for emptying the universe of God - string theory - is in crisis, if not in ruins. And its companion theory - cosmic inflation - if it were to succeed, would demolish science.

We’ve hardly space here to unpack a complex debate, enlivened by public squabbles between prominent scientists. But here are a couple of key points:

One good candidate to replace God was string theory. Briefly, it pictures the elementary particles of our universe as similar to notes resonating on a guitar string. These strings - if they exist - originated in weird, exotic events that occurred before the Big Bang. And how does that dispense with God? Because string theorists hope to show that there was never really any beginning to the universe, such as the Big Bang. Think about that: It means that, even if God exists, God is superfluous to the existence of the universe. You can still believe in God if you want, but God is not part of the explanation of how the universe came to exist. The universe originated in perhaps unknowable prior events. However, the mathematics underlying string theory has come under severe criticism recently, for generating all possible values (like the results you get when you try dividing by zero).

Many scientists have been working hard to find evidence for an infinite number of universes (a multiverse). Their bottom line argument: Our universe happens to look fine-tuned or designed. But infinitely many others look different. So our universe could be the way it is randomly. The movie What the Bleep Do We Know? introduces one version of the multiverse.

Cosmic inflation theory is an increasingly popular multiverse theory. It holds that our observed universe (which is currently expanding only slowly) is a tiny part of a much more rapidly expanding structure. That structure's existence and its different rates of inflation (rather than God) explain both the Big Bang and the fine tuning of our universe. In fact, our universe is simply one of an arbitrarily large number of tiny parts that happen to expand more slowly than the whole. The other parts are infinite additional universes. Welcome to the multiverse!

Many arguments against the multiverse involve complex physics reasoning, but Dr. Mann offered my students one that is much more basic: In a multiverse, everything that we can self-consistently imagine - objects appearing by magic, for example - can actually happen. We can no longer rule out events as impossible. Remember, “impossible” means “not possible according to the laws of this universe.” But there are no longer any limits to that. You don’t believe in magic? But how do you know that another universe, where magic exists, has not overlapped ours and thus created magic? Getting rid of God by arguing for a multiverse - far from advancing science - destroys science!

The older philosophers knew this, I think. That is why even thinkers who imagined that they had “seen through” popular piety still believed in the “God of the philosophers”. Their remote God’s self-consistent reality and unvarying laws underwrite the possibility that we can come to understand our universe. A multiverse is incomprehensible by definition.

And - will this become a trend? - just this month, philosopher Antony Flew, who abandoned a half century of atheism on account of design in the universe, published There IS a God (Harper One 2007). I'll have more to say about Flew’s book in my next column.


Journalist Denyse O’Leary (http://mindfulhack.blogspot.com/) is the author of By Design or by Chance? (Augsburg Fortress 2004), an overview of the intelligent design controversy and co-author, with Montreal neuroscientist Mario Beauregard, of The Spiritual Brain: A neuroscientist's case for the existence of the soul (Harper 2007).
