On Norbert Wiener’s God & Golem, Inc.

God & Golem, Inc.

While reading Annalee Newitz's intriguing blog post on io9 about the history of the word cyber, I came across the name Norbert Wiener (not Weiner — get it straight, you Englishers), who had introduced the term Cybernetics as “the study of control and communication in machines and living beings”. His works include the book God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion, and that title immediately caught my eye. Studies of the interaction between science, technology, and religion always interest me a lot, as do Golems and Jewish folklore, so Wiener had sold it to me easily.

G&G is something of a long essay rather than a fleshed-out book. In it, Wiener explores some moral and religious aspects of technological advancement, particularly those related to his own field of cybernetics. Among his main subjects are the question of life and creation, the tension between creation and self-replication, and thus the hierarchy of God-Man-Machine. The possibility of self-reproducing machines — for which Wiener provides some arcane evidence — puts strain on such a hierarchy, and on traditional conceptions of life and creation.

I rather liked the extensive sub-essay on (artificial) intelligence as an aspect of living beings. Wiener discusses at length the case of computers/software learning to play games. At the time of writing (the early sixties), programs had already become quite good at games like checkers and tic-tac-toe, and he correctly predicts the advent of expert chess computers. With the possibilities of machine learning in mind — he returns to aspects of this later in the book — Wiener cautions against dogmatic thinking, both in religion and science, about conceptions of life. In particular, I think the main thrust of his argument is that we should be careful not to overemphasise and essentialise the hierarchical categorisation mentioned above: God-Man/Animal-Machine.

The part about games takes a peculiar turn when Wiener ties it to the concepts of omnipotence and omniscience, which he refutes earlier in the book, both from a religious and a scientific standpoint. Taking the concept of game rather broadly (including the meaning struggle, contest), he discusses the cases of God and the Devil playing for the possession of human souls, and the struggle for the throne of heaven after Satan's rebellion. Presumably, he argues, the Devil wouldn't play if he didn't have a shot at winning. I doubt whether this is a strong theological argument at all, but it is important for Wiener's later discussion of machines playing rather serious games and their single-minded pursuit of victory.

A more important parallel that Wiener draws between technology and (religious) morals concerns the concept of simony/sorcery. What he means is that some people learn to control powers that are beyond the comprehension of most other people — see my earlier discussion of magic/technology. Wiener argues that, religious or not, those who wield such powers have a moral duty not to abuse them for “vain and selfish purposes”.

Wiener sees these temptations both in the West and in Communist countries, and in particular he points to the tendency, and the possibility, for people to shift responsibility for their actions onto subordinates or superiors — or onto machines and systems. Starting with relatively benign examples of replacing (manual and mental) labourers with mechanised workers, he later turns to responsibility in times of war. He mentions Eichmann as one example of this mindset of justifying actions by denying responsibility, and extends the line (in true Cold War spirit) to computers involved in, or even making, decisions in nuclear warfare. Who is to blame for a nuclear destruction of the world if no human actually pushed a button? In other words: Wiener's point is that technology does not absolve us of the responsibility to consider the moral implications of our actions, including the creation and use of technology itself.

G&G sometimes feels like a book that is only semi-coherent. Among the topics discussed that I haven't mentioned here are the science of prosthetics, the folklore of wish-granting, and machine evolution. In the end, I do get the feeling that there is a common thread running through all these topics, but it is not easy for me to pinpoint it after reading the book once, and casually. Wiener only allots himself around 100 small, spaciously typeset pages to cover all these issues, so perhaps he could have been more elaborate in drawing all his strings together. As it stands, the book deserves further study when I feel like it sometime in the future.

Another thing that deserves further words from me is Golems. I love those. Maybe I'll get around to it next year.