The writer and director of the brilliant AI film, Ex Machina, talks to Michael O'Sullivan.
Novelist and screenwriter Alex Garland's name has appeared on many movies. Ever since his novel The Beach was adapted into a film in 2000, he's written the script for two more Danny Boyle-directed films - 28 Days Later and Sunshine - as well as the adaptation of Kazuo Ishiguro's Never Let Me Go and the comic book Dredd.
But now Garland has turned director himself with Ex Machina, a smart and sexy sci-fi thriller about a computer geek (Domhnall Gleeson) who is recruited by a reclusive tech entrepreneur (Oscar Isaac) to test the artificial intelligence of a rebellious female robot named Ava (Alicia Vikander).
Ex Machina is the latest (and best) in a recent string of films that grapple with robots and artificial intelligence (AI). We picked his brain about the roots of this seeming cinematic obsession.
What does our enduring fascination with robots and artificial intelligence say about us?
The truth is, I don't know. With that caveat, I have been thinking about this for a few years, and I can try to make an educated guess. It certainly looks like there's something in the zeitgeist about it. If there had been a seismic breakthrough in artificial intelligence research, say, three years ago - because that's roughly the cycle of film-making - then you could understand it. But there hasn't been a breakthrough, so my instinct is to look somewhere else.
Where?
I personally look at the fact that there are these enormous tech companies that have power that seems to grow exponentially. There's something disproportionate about the incredible rapidity of the way they stake a claim on the world. There's also a sort of adjunct quality, which is that we access these tech companies via cellphones and computers and tablets, and yet we don't really understand how they work. Yet conversely, these things seem to understand quite a lot about us.
It's actually the tech company, but it can seem to be the machine, because it will anticipate the thing that we're trying to type into the search engine. It understands something about our shopping habits, and these are the things that make us feel slightly uneasy.
On top of that, we've known, even predating Edward Snowden's revelations, that largely what these companies were doing was storing massive amounts of information. It gets called "big data", but it's also quite small data. It's very specific and tailored to an individual. On an unconscious level, and also on a reasonable level, it makes us uncomfortable. I actually feel that these narratives come more out of that than anything specific to do with artificial intelligence.
Isn't our discomfort with technology contradicted, to some degree, by our insatiable appetite for it?
Without question, yeah.
Transcendence and Chappie each feature a dying character who seeks a kind of immortality by transferring his consciousness into a machine. Are these movies a form of artistic wish fulfillment?
I know for a fact that for some of the people who are actively involved in dropping enormous amounts of money into AI research, that is explicitly and openly their motivation. That is, to upload themselves in order to live longer in another form.
Why does that fantasy hold so much appeal?
Because we're mortal. Even religious people who believe in an afterlife will have a sense that something very fundamental about them is not going to continue. My approach to it was not to look at the individual extending his own lifespan, but more to see the creation of AI as a parental act. So the AI will have its own life that will extend beyond, where the "child" goes off and does its own thing, and the parent unfortunately is left behind.
Isaac Asimov famously articulated three laws of robotics, the first of which states that a robot "may not injure a human being or, through inaction, allow a human being to come to harm". Yet these laws are routinely violated in most contemporary robot movies, including your own.
Those Asimov laws have always felt to me like a real problem, because they preclude free will. You could debate whether humans have free will, but we certainly think we have it. We act as if we have it.
While maybe, in reality, we're living in The Matrix?
Absolutely. I could always understand the logic, but they're not actually laws. There is no science fiction court that's going to prosecute me because I've failed to observe them. I think they're problematic anyway. If you were able to go to a computer and you said, "I'm going to switch you off" and the computer said, "I don't want you to switch me off" and if you had reason to believe that this wasn't just an automatic statement - that the computer had some kind of emotional internal life - at that point you've got an ethical problem. I suspect that if you had a sentient machine, you'd have to start giving it pretty much what we currently call human rights.
Ex Machina wrestles with themes that many robot movies don't even seem to be aware of.
I avoided all these other films because I didn't want to get intimidated or frustrated by them. My intention was to tell a story that is effectively on the side of the machine. It was not a moralising, cautionary tale about not messing with God's work. The rules we make about each other really relate fundamentally to our minds. That's why we can cut down a tree but not murder a human. As to the film, yeah, it attempted to run straight on at that stuff. It's an ideas movie, I guess.
What: Ex Machina by Alex Garland
When: Screening Wednesday July 22 and Saturday July 25