Inside the mind of Alan Turing, the genius behind “The Imitation Game”
The upcoming film – and a BBC interview from 1951 – explore the enigmatic thinking of the computer pioneer and the relationship between human and machine
By S. Barry Cooper, PhD Posted on 13 November 2014
New — Letter to the Editor in The Washington Post
The Feb. 21 front-page article "Hollywood short-circuits Turing's greatest feat" pointed out many misleading statements in the film about Alan Turing and his accomplishments.
There were many distortions to the real story, but I think we should all accept Hollywood's practice of stretching the truth for entertainment value. The greater issue at heart in science biopics like "The Imitation Game" and "The Theory of Everything" about Stephen Hawking is that they shine a light on the little-known heroes of science, without whom we would not have the healthy, technologically advanced, vibrant society we enjoy today. ...
The Imitation Game, a movie about computer pioneer and World War II codebreaker Alan Turing, will be released in the UK tomorrow and in the US on November 28. Based on the biography Alan Turing: The Enigma by Andrew Hodges, with a screenplay by Graham Moore, the movie stars Benedict Cumberbatch (a distant cousin of Turing), Keira Knightley and Matthew Goode.
This is a commentary by Professor S Barry Cooper, lead editor of Alan Turing: His Work and Impact (Elsevier, 2013).
Professor Cooper's essay is followed by a book chapter featuring the script from a BBC interview with Turing from 1951.
- Dr. Brad Fenwick, Senior VP for Global Strategic Alliances at Elsevier: Letter to the Editor in the Washington Post: "Give Hollywood a little leeway"
- Interview with Prof. Cooper in Dazed: "Alan Turing expert dissects The Imitation Game"
- Prof. Cooper's blog post in The Guardian: "The Imitation Game: how Benedict Cumberbatch brought Turing to life"
- Prof. Cooper's comments in The Wall Street Journal: "Why We Needn't Fear the Machines — A Basic Truth: Computers Can't Be Replacements for Humans"
- Column by Olivier Dumon in the Huffington Post: "Innovations in Science: Passing the 'Turing Test'"
What is a computing machine?
For Alan Turing, it meant knowing how a bit of the world really worked — in comprehensive detail. In his famous 1936 paper "On Computable Numbers," he introduced what we now term the "Turing machine," based on what "human computers," following instructions, did in the days before we had real computers.
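The "human computer" Turing abstracted — a person mechanically following a fixed table of instructions, reading and writing symbols — can be sketched as a short program. The machine below is a purely illustrative toy (the rule table and function name are my own, not anything from Turing's paper or the film); it flips the bits of a binary string to show how a finite rule table acting on a tape carries out a computation.

```python
# A minimal sketch of a Turing machine: a finite control reads and writes
# symbols on an unbounded tape, following a fixed table of rules — just as
# Turing's 1936 "human computer" followed written instructions.

def run_turing_machine(rules, tape, state="start", halt="halt", max_steps=10_000):
    """Run until the halt state. `rules` maps (state, symbol) to
    (symbol_to_write, head_move, next_state); head_move is -1, 0 or +1."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, "_")  # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example rule table: flip every bit left to right, halt at the first blank.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run_turing_machine(flip, "1011"))  # -> 0100
```

The point of the abstraction is that the control table is finite and dumb; all the apparent intelligence lies in how long chains of such trivial steps compose — the theme Newman returns to later in the broadcast with his mosaic analogy.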
Turing's machine was a simulation ― an imitation ― a logical vision of a future physical computing machine and what it could do. Imitation and reality, control and what lay beyond control: a wild and dangerous interface to which Alan Turing brought genius and amazingly prescient insight. His courage and foresight ― he was a pioneer of computing at the limits of what we could scientifically know ― still guide our explorations of what it is to be a thinking human, and what it is to play with imitations.
Once, the American logician Anil Nerode commented to me, "You would not be a mathematician if you did not like pain." Like was the wrong word, of course, but I knew what he meant. The central point of that amazing paper by the 24-year-old Turing was its conclusion: almost everything is not computable by a Turing machine. And the rest of Turing's life was spent looking for human routes into the incomputable, mathematically and personally ― searching for his lost friend Christopher Morcom ― and stretching the human partnership with the machine to its limits.
Morten Tyldum's new film The Imitation Game is full of the subtleties of the relationship between human and machine; like a mathematical proof, it seeks out the real. It is scriptwriter Graham Moore's coaxing out of Turing's secret knowledge, the enigmatic thinking explored by Andrew Hodges' quite literary and very special biography of the great man.
I have not yet asked Andrew whether he has seen the film, or how he likes it. He earlier went public with his criticism of the movie for making John Cairncross, exposed as a spy for the Russians in 1951, a member of Turing's Enigma team, when everyone knows he worked with Colossus in a completely different part of Bletchley Park. It's no good just retorting that the film is "not a documentary." Why would the scriptwriter do such a thing? Interviewed at the Toronto International Film Festival by CraveOnline, Graham Moore described his commitment and attention to detail during the long gestation of what is, to me, a marvelous script:
In some sense I've been working on this script since I was a teenager. I heard of Alan Turing when I was a teenager in Chicago. I was a huge computer nerd. I went to space camp. I went to computer camp. I even went to programming camp. Once I heard his story, I wanted to write about him. He had a very tragic life story. In 2010 I had a meeting with producers who had optioned Andrew Hodges' wonderful book, Alan Turing, which was the first published biography of Turing and the most seminal and inclusive. But since Andrew's book has been published, there's been a half dozen other biographies on him. There was a play called "Breaking the Code" that was great and a number of other novels and biographies. So there became a lot to draw from, but Andrew's book was key.
Let's focus on just one of Moore's historical reconstructions. Leaving aside the transposition of Cairncross into Hut 8, and the manipulation involved in the use of the decrypted information, let us look briefly at the movie's account of the end of Turing's life. This connects with another aspect: the narrow treatment of Turing's scientific achievements, on which some more literal-minded reviewers have commented negatively. So (spoiler alert) why give Turing's Manchester house a huge machine called Christopher, with its associations with the Bombe, when we know very well ― as Graham Moore certainly did ― that at that time Turing was heavily involved in formalising the mathematics of the emergence of patterns in nature?
What Moore is doing is giving us the right triggers for understanding the unity of Turing's approach to "computing the world," while bringing out Turing's personal isolation: the obsession with the machine called Christopher is indicative of the lifelong search for computation beyond the limits, a search appearing in quite different guises in the realities of our book Alan Turing: His Work and Impact. And Moore's machine leitmotif intensifies the irony of Turing's sad demise following the great adventure and achievements at Bletchley Park.
Moore's approach is extremely subtle. One review I read was well informed about Turing's work, yet quite adrift about the way in which the film complements what is in our book. The film cannot, and should not, attempt to engage with the specifics of Turing's thinking on computability.
I'd have loved five minutes near the end of the movie with Turing explaining his ideas on morphogenesis. But what the film does is draw out a coherence of thinking which is actually the engine of cutting-edge science today, framed by still-"arcane" conceptual frameworks, in a quite beautiful and dramatically valid way. Alan Turing: His Work and Impact is an Aladdin's Cave of Turingesque wonders, with an overpowering richness of content and range of thought. In The Imitation Game, Graham Moore is essentially turning what's in Turing's troubled mind into embodied dramatic narrative.
Moore spent years on that script, with huge devotion to the mind of Alan Turing. The criticisms are at the level of those editors of Bruckner's symphonies who tried to edit out the "strange silences" inserted into the flow of the music. Creative people at the top of their game do know what they are doing. When they fail, it is rarely because of simple decisions about methodology.
[pullquote align="right"]In The Imitation Game, Graham Moore is essentially turning what's in Turing's troubled mind into embodied dramatic narrative.[/pullquote]
But just as a mathematical proof is both beautiful and painful for the mathematician, with its imperfect mapping of mental complexities, so this extraordinary film confronts those familiar with the history with an imitation that takes seemingly reckless risks with the historical formalities, while bringing us to an appreciation of the semantics and psychology of Turing and his quest ― quite remarkable and unforgettable in its intensity. To be honest, it took me half my first viewing to understand what Moore's agenda and methodology were, and a second viewing to traverse the levels of meaning more comfortably.
Interestingly, much of this deadly serious imitation game is present in the transcript of a long-lost BBC radio broadcast, kindly contributed to our book Alan Turing: His Work and Impact and featured below. What is so special about Turing's presence in the conversation of these four very interesting thinkers is the sense of the mediating mathematics. It is potentially the sort of mediation that Newtonian mathematics performed in turning Robert Hooke's intuition that gravity follows an inverse square law into a scientific revolution which changed our world for 300 years ― and which underpins the sense of a level of hidden knowledge permeating the film. One can but wonder at the level of interchange, over 60 years ago.
Of course, both Turing and Max Newman shared links with Cambridge and Manchester, and Newman was a hugely important influence on, and protector of, Alan Turing throughout his adult life. Both were mathematicians. Richard Braithwaite ― the longest-lived of the participants, born in 1900 and 90 when he died ― was a distinguished Cambridge philosopher with a special interest in science and religion. And Geoffrey Jefferson was a neurologist, a Fellow of the Royal Society like Turing and Newman, who, living and working in Manchester, was very interested in the Manchester Mark 1 computer for which Turing wrote the first programming manual.
Just as Sir Isaac Newton used mathematics to push back the bounds of our understanding of the movements of the planets and other heavenly bodies, Alan Turing was already creating the mathematics to take us into the strangely embodied, and even more strangely disembodied, informational world we live in today. He sought to map out mathematically the wilds where logic meets information. And The Imitation Game is the machine which bears us to the meaning intrinsic to, yet beyond, the mundanities of historical incident.
Mathematics is the gatekeeper to the informational mysteries. We have to thank Alan Turing for attempting to guide us through the gate, as in this small excerpt introduced by Jack Copeland, Professor of Philosophy at the University of Canterbury in New Zealand and author of various books about Turing. The conversation is reproduced in full, along with Copeland's commentary, in the nearly 1,000 pages of our book Alan Turing: His Work and Impact.
Watch the movie trailer
Alan Turing on the BBC, 1951
Turing's lecture "Can Digital Computers Think?" was broadcast on BBC Radio on 15 May 1951 and again on 3 July. It was virtually unknown until 1999, when Copeland included it in a small collection of unpublished work by Turing ("A Lecture and Two Radio Broadcasts on Machine Intelligence by Alan Turing," in Machine Intelligence 15) and again in his book The Essential Turing (Oxford University Press, 2004).
The previously published interview text, reproduced here, is from Turing's own typescript and incorporates corrections made in his hand. It was published in our book in the chapter "Can Digital Computers Think?: Intelligent Machinery: A Heretical Theory: Can Automatic Calculating Machines Be Said To Think?"
In that chapter, Copeland writes:
In this broadcast Turing's overarching aim was to defend his view that "it is not altogether unreasonable to describe digital computers as brains." The broadcast contains a bouquet of fascinating arguments, and includes discussions of the Church–Turing thesis and of free will. There is … a priceless analogy that likens the attempt to program a computer to act like a brain to trying to write a treatise about family life on Mars – and moreover with insufficient paper.
The broadcast makes manifest Turing's real attitude to talk of machines thinking. In his paper "Computing Machinery and Intelligence" (1950), he famously said that the question "Can machines think?" is "too meaningless to deserve discussion," but in this broadcast he made liberal use of phrases such as "programming a machine to think" and "the attempt to make a thinking machine." In one passage he said: "our main problem [is] how to programme a machine to imitate the brain, or as we might say more briefly, if less accurately, to think."
In the following excerpt from the BBC interview, Alan Turing answers Richard Braithwaite, as they discuss what is special about human thinking and its embodiment:
Can Automatic Calculating Machines Be Said to Think?
Braithwaite: But could a machine really do this? How would it do it?
Turing: I've certainly left a great deal to the imagination. If I had given a longer explanation I might have made it seem more certain that what I was describing was feasible, but you would probably feel rather uneasy about it all, and you'd probably exclaim impatiently, 'Well, yes, I see that a machine could do all that, but I wouldn't call it thinking.' As soon as one can see the cause and effect working themselves out in the brain, one regards it as not being thinking, but a sort of unimaginative donkey-work. From this point of view one might be tempted to define thinking as consisting of 'those mental processes that we don't understand.' If this is right then to make a thinking machine is to make one which does interesting things without our really understanding quite how it is done.
Jefferson: If you mean that we don't know the wiring in men, as it were, that is quite true.
Turing: No, that isn't at all what I mean. We know the wiring of our machine, but it already happens there in a limited sort of way. Sometimes a computing machine does do something rather weird that we hadn't expected. In principle one could have predicted it, but in practice it's usually too much trouble. Obviously if one were to predict everything a computer was going to do one might just as well do without it.
Newman: It is quite true that people are disappointed when they discover what the big computing machines actually do, which is just to add and multiply, and use the results to decide what further additions and multiplications to do. 'That's not thinking', is the natural comment, but this is rather begging the question. If you go into one of the ancient churches in Ravenna you see some most beautiful pictures round the walls, but if you peer at them through binoculars you might say, 'Why, they aren't really pictures at all, but just a lot of little coloured stones with cement in between.' The machine's processes are mosaics of very simple standard parts, but the designs can be of great complexity, and it is not obvious where the limit is to the patterns of thought they could imitate.
Braithwaite: But how many stones are there in your mosaic? Jefferson, is there a sufficient multiplicity of the cells in the brain for them to behave like a computing machine?
Jefferson: Yes, there are thousands, tens of thousands more cells in the brain than there are in a computing machine, because the present machine contains - how many did you say?
Turing: Half a million digits. I think we can assume that is the equivalent of half a million nerve cells.
Braithwaite: If the brain works like a computing machine then the present computing machine cannot do all the things the brain does. Agreed; but if a computing machine were made that could do all the things the brain does, wouldn't it require more digits than there is room for in the brain?
Jefferson: Well, I don't know. Suppose that it is right to equate digits in a machine with nerve cells in a brain. There are various estimates, somewhere between ten thousand million and fifteen thousand million cells are supposed to be there. Nobody knows for certain, you see.
It is a colossal number. You would need 20,000 or more of your machines to equate digits with nerve cells. But it is not, surely, just a question of size. There would be too much logic in your huge machine. It wouldn't be really like a human output of thought. To make it more like, a lot of the machine parts would have to be designed quite differently to give greater flexibility and more diverse possibilities of use. It's a very tall order indeed.
Turing: It really is the size that matters in this case. It is the amount of information that can be stored up. If you think of something very complicated that you want one of these machines to do, you may find the particular machine you have got won't do, but if any machine can do it at all, then it can be done by your first computer, simply increased in its storage capacity.
Newman: Wouldn't that take a prohibitively long time?
Turing: It would certainly be shockingly slow, but it could start on easy things, such as lumping together rain, hail, snow and sleet, under the word 'precipitation.' Perhaps it might do more difficult things later on if it was learning all the time how to improve its methods.
Download the full chapter with the BBC interview
The actual interview begins on page 17
Elsevier Connect Contributor
S. Barry Cooper (@SBarryCooper) is Professor of Mathematical Logic at the University of Leeds in the UK. A graduate of the University of Oxford, his research follows that of Dr. Alan Turing in its focus on the nature of mental and physical computation.
Professor Cooper is the lead editor of Alan Turing: His Work and Impact (Elsevier, 2013), which won the prestigious RR Hawkins Award from the Association of American Publishers, as well as the 2014 PROSE Awards for Mathematics and Best in Physical Sciences & Mathematics, also from the AAP. Dr. Cooper's other books include Computability Theory, New Computational Paradigms, and Computability in Context. He is a leading advocate of multidisciplinary research at the interface between what is known to be computable, and theoretical and practical incomputability.