
Friday, December 19, 2008

I get mail: More on the awesome totally world-famous mind reading machine ...

Recently, I have been reading Plagues of the Mind by Bruce Thornton, a book about the epidemic of false knowledge that surrounds us these days, about which I will say more shortly.

False knowledge is what we know that ain't really so. And often common sense will help us see why it can't be so. Here's one example: We are told that the human and chimp genomes are 99.5% similar.

You heard that? Now forget it. Here is a more realistic summary, with an explanation, offering a figure in the 70 percent range - a figure you can believe is reasonable.

I use this example because no one has much difficulty telling a human from a chimpanzee, and if the 99.5% folk were right, telling them apart should be difficult. Otherwise, the only conclusion I can draw is that the genome is not as important a source of information as we once supposed.

This, by the way, has nothing whatever to do with controversies over common ancestry of humans and chimps. In theory, we have a common ancestor with silverfish too, but if you hear that your genome is 97.5% similar to that of a silverfish, you should suppose that some pretty important information is stored somewhere other than the genome.

Incidentally, if it were true that the human and chimpanzee genomes were 99.5 percent similar, that would leave genetic determinism dead in the water. Just sayin' is all ...

In short, common sense is our best defense against epidemics of false knowledge blowing through the pop science media.

That said, in response to my skepticism about new mind reading techniques, someone wrote to assure me that, at the current pace of progress, there is no doubt we will soon use machines to read minds.

At first I was overwhelmed with a sense of "Where have I heard this before?" - which a more literary person would call "deja vu." Then I remembered where. Lots of places, actually. So I replied as follows:

Nice to hear from you,

I predict it is going to go the way of this one: “In ten years, computers will think like people.” Remember that? Now forget it.

Also there was the fellow who said that in ten years you could pull a CD of your genome map out of your pocket and say, “There, that’s me!” [I can't find this fellow online. Possibly, he has a day job now.]

Then there was Carl Sagan, confidently expecting the autobiography of a chimpanzee (Dragons of Eden), due to success in teaching chimps to think like people.

So far as I can make out, the way it usually goes is this: There are big advances quickly. But then we run up smack against a wall …

We can develop computer software to win chess games. But only the programmer knows that the software won; the software itself does not. And then what?

The genome map? Genomes change over a person’s life, apparently, and acquired genes get passed on so … (and that’s only a little bit of the problem with genetic determinism … )

And after they taught the chimps to ask for sweets, the chimps never wrote anything except: gimme sweet – and similar astonishing thoughts.

The early big advances slowly grind to a halt before a vast sea of complexities that are not readily reducible – and then popular science mags move on to the Next Big Thing.

In my view, this “reading minds” business is just another instance of what one author – about whom I will be posting shortly – has called “the epidemic of false knowledge.”

False knowledge is like alchemy – it attempts impossible reductions.

The good news is that out of alchemy was born chemistry. Chemistry can explain for sure why we cannot turn lead into gold by chemical means - and a host of other things besides, things that have mostly made our world a better place.

In the same way, I suspect that the false knowledge that grows up around “in ten years, we’ll be able to read minds” will help us to a better understanding of the real relationship between minds, brains, and bodies – which in turn will be good for human physical and mental health.

Note: I am indebted to philosophy and computer science professor Angus Menuge for his useful - and very thorough - discussion, in Agents Under Fire, of reductions in science that work and reductions that don't. The subject is much more complex than pop science pundits suppose.
