Sunday, December 31, 2006

God in the Machine: What Robots Teach Us About Humanity and God (Book Review)


Anne Foerst says we must grant personhood to humanoid robots. Is this an act of worship — or sheer hubris?

God in the Machine: What Robots Teach Us About Humanity and God.
Anne Foerst.
New York: Dutton, 2004. 196 pages.
$24.95 paperback.

A 1976 El Camino was my first car, one that I drove all through high school and through my military service. I loved that car. Yet, when I entered it into a summer demolition derby, no one accused me of abuse or neglect, much less torture or murder, when my sputtering, flaming machine finally gave up the ghost.

If Anne Foerst had been there, she might have leveled those accusations at me.

In God in the Machine, Foerst maintains that strong bonds can develop between humans and their cars, computers, and other devices. Unfortunately, her focus on these bonds taints the bigger message of what robots can teach us about ourselves and about our relationship to God.

Foerst brings an interesting background to the discussion of robotics and religion: she apparently liked to sneak back and forth between studying Paul Tillich’s systematic theology at Harvard Divinity School and discussing the development of Cog, a famed robot at the Massachusetts Institute of Technology robotics lab. In doing so, she discovered two incongruous cultures. At the divinity school, she says, people were anti-technology and thought her quest to combine theology and artificial intelligence unnecessary; at MIT, people were suspicious of theologians. This experience helped Foerst develop the reconciling view of technology and religion that she presents in God in the Machine.

The strongest section of the book deals with the Golem tradition in Jewish writings of the 13th and 16th centuries. Golems are helpful servants that can get out of hand if the intention of their human creator is not pure and worshipful of God. The Golem tradition teaches us that we are created creators, and that these artifices, like us, enter into a system of sin and ambiguity.

The book also has weaknesses, ones that show the kind of category mistakes theologians can make when confronting advances in cognitive science and AI.

Take, as an example, the issues of bonding and community, pivotal concepts used throughout the book. A poignant statement of her stance comes on the last page, where Foerst writes: “As we are communal and bond with nonhuman entities, these narratives will necessarily include some nonhuman critters.” Although she never explicitly defines what version of bonding is being invoked, she does think it depends more on emotional settings than on abstract human-like qualities.

The problem here is that people can emotionally bond with all sorts of entities that strain the notion of what can be considered part of the community and what cannot. For instance, people became so dependent on an early AI program called Eliza that its author, famed MIT researcher Joseph Weizenbaum, eventually ended its use. Even Weizenbaum’s secretary would ask him to leave the room until she had finished sharing her intimate personal matters with the machine. Weizenbaum was rightly concerned by all this, and even more so when people accused him of violating their privacy after he considered recording all interactions with Eliza.

If there is one lesson to be taken from Foerst’s book, it is that we tend to humanize the things we bond with. When Foerst and other like-minded theologians can finally distinguish between what is human (or, more properly, a “person”) and what is not, they will be able to answer the skeptics’ new mantra for AI’s relationship to religion: All that glitters is not God’s.


*This review appeared in the May 2006 issue of Science and Theology News.

**image: Peter Menzel Photography.



