Monday, March 09, 2009

Mind: Yet another effort to explain to materialists why minds are not like computers

In "Why minds are not like computers" (The New Atlantis, Winter 2009), Ari N. Schulman explains:
Hmmm. What can go wrong with that reminds me of "The Computer's First Christmas Card," which famously winds up with "Merry Chrysanthemum." As a result,
Thus it is only partially correct to say that a computer performs arithmetic calculations. As a physical object, the computer does no such thing—no more than a ball performs physics calculations when you drop it. It is only when we consider the computer through the symbolic system of arithmetic, and the way we have encoded it in the computer, that we can say it performs arithmetic.
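Schulman's point can be made concrete in code: the very same physical bit pattern "is" different things depending on which symbolic system we read it through. A minimal Python sketch (the hex constant is just an arbitrary illustration, not anything from the article):

```python
import struct

# Four raw bytes -- as a physical object, the machine just holds these bits.
bits = struct.pack(">I", 0x40490FDB)

# Read through two different symbolic systems:
as_int = struct.unpack(">I", bits)[0]    # interpreted as an unsigned integer
as_float = struct.unpack(">f", bits)[0]  # interpreted as an IEEE 754 float

print(as_int)    # 1078530011
print(as_float)  # approximately 3.1415927
```

Nothing in the bytes themselves says whether the machine "computed" an integer or an approximation of pi; that depends entirely on the encoding convention we bring to them.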
Okay, but what about the famed "Turing test" that would show that computers can think?
... they indicate crucial errors in AI researchers’ understanding of both computers and minds. Suppose that the mind is in fact a computer program. Would it then be possible to conclude that what’s inside the mind is irrelevant, as is supposed by some interpretations of the Turing Test? If we have some computer program whose behavior can be completely described as if it were a black box, such a description does not mean that the box is empty, so to speak. The program must still contain some internal structures and properties. They may not be necessary for understanding the program’s external behavior, but they still exist.

So even if we possessed a correct account of human mental processes in purely input-output terms (which we do not), such an external description by definition could not describe first-person experience. The Turing Test is not a definition of thinking, but an admission of ignorance—an admission that it is impossible to ever empirically verify the consciousness of any being but yourself. It is only ever possible to gain some level of confidence that another being is thinking or intelligent. So we are stuck measuring correlates of thinking and intelligence, and the Turing Test provides a standard for measuring one type of correlate.

Similarly, although social interaction requires communication in the form of such “input-output” as speech and hearing, it also requires two or more agents who experience that interaction: A teddy bear may provide a child with the same comfort and companionship—and elicit from the child the same responses—that another human being would, but we cannot say that it loves.
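The black-box observation is easy to demonstrate: two programs can be indistinguishable from the outside yet have nothing in common inside. A toy Python illustration of my own (not from Schulman's article):

```python
# Two "black boxes" with identical input-output behavior on a shared domain.

def square_by_formula(n: int) -> int:
    # One box computes the answer each time.
    return n * n

# The other box just remembers precomputed answers in a lookup table.
TABLE = {n: n * n for n in range(100)}

def square_by_table(n: int) -> int:
    return TABLE[n]

# Externally indistinguishable on the shared domain...
print(all(square_by_formula(n) == square_by_table(n) for n in range(100)))  # True
# ...yet internally nothing alike: one calculates, the other merely recalls.
```

A purely behavioral description cannot tell these two apart, which is exactly why matching behavior says so little about what is going on inside.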
Schulman, whose comments I commend to all, is assistant editor of The New Atlantis.

Personally, I think that the artificial intelligence project is dead, but no one has got round to raising the funds for a decent burial.

See also:

Getting computers to pretend to converse is an extremely hard computational problem

Artificial intelligence: Conversing with computers or with their programmers?

Computers: Most engineers must know that they are not robots

Artificial intelligence: A look at things that neither we nor computers can discover

Can a conscious mind be built out of software?

Also: Mind vs. meat vs. computers - the differences

Let the machine read your mind (We offer an installment plan!)

Mind-computer blend: Who believes in this?

Artificial intelligence: Making the whole universe intelligent?

Brain cells release information more widely than previously thought.

(Note: Thanks much to all kind readers who are still here. I had hoped to be here more often too. And thanks to kind PayPal donors. As media move online, it is best to support the news you want to see. - Denyse)
