A.I. and Film

In a new video essay, contributor Sean Maymon discusses A.I. in front of and behind the camera.

Transcript

Artificial intelligence has always had a place in cinema. From 2001: A Space Odyssey to Star Wars, Metropolis to The Terminator, the idea of A.I. has seeped into our culture and become the subject matter of some of the greatest sci-fi films. But what happens when A.I. becomes more than fiction? What happens when it becomes a part of how we make films?

Let’s start by taking a look at the beginnings of artificial intelligence in cinema, which first appeared in front of the camera. The earliest depiction of A.I. was in Fritz Lang’s 1927 film Metropolis and, to be frank, this technology didn’t really start out on the right foot. In Metropolis, a scientist creates a robot called a Maschinenmensch, or “machine-human,” that takes on the image of a woman named Maria and leads the workers of Metropolis to destroy and flood their own city. So from this very early date, long before the technology was anywhere near real, cinema was already depicting A.I. in a negative light.

This same fear of humankind manufacturing its own destruction can be seen in every era of film: HAL 9000 in 2001: A Space Odyssey, Gort in The Day the Earth Stood Still, the Terminator in The Terminator. Today, shows like Black Mirror and films like Morgan and even The Avengers continue this tradition as the true realization of this technology looms closer on the horizon. And while films like WALL-E or Star Wars portray A.I. in a far more positive light, many still fear the implications of this technology. So what happens when A.I. starts to move from in front of the camera to behind the scenes?

[Clip: HAL 9000]

This clip is from the first-ever screenplay written by artificial intelligence, the 2016 short film Sunspring. The A.I., named Benjamin, was created by director Oscar Sharp and A.I. researcher Ross Goodwin. The script Benjamin produced is obviously very rudimentary, and the film was produced to serve a comic purpose rather than a dramatic one, but it’s fascinating nonetheless. Benjamin functions very much like the feature on iPhones that predicts which word you’ll use next in a text message. It was “fed” hundreds of science fiction screenplays and then made to write its own script, predicting one word after another based on the recurring patterns it recognized in those screenplays. The challenge for the director and actors was then to interpret this script and make it a reality, which was not an easy task. Stage directions like “He was standing in the stars and sitting on the floor” aren’t very easy to interpret. Despite the comic nature of this project, it raises some really interesting questions.
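To make that idea concrete, here is a minimal, hypothetical sketch of next-word prediction in Python. Benjamin itself was a recurrent neural network trained on whole screenplays; this toy uses a far simpler word-pair (Markov) model, and the tiny corpus, seed word, and function names are illustrative assumptions, not anything from the actual project.

    import random
    from collections import Counter, defaultdict

    def train_bigram_model(text):
        """Count which word tends to follow which in the training text."""
        words = text.split()
        model = defaultdict(Counter)
        for current_word, next_word in zip(words, words[1:]):
            model[current_word][next_word] += 1
        return model

    def generate(model, seed, length=20):
        """Write by repeatedly predicting a plausible next word."""
        word, output = seed, [seed]
        for _ in range(length):
            followers = model.get(word)
            if not followers:
                break  # this word was never followed by anything in the corpus
            # Sample in proportion to how often each word followed this one.
            candidates, counts = zip(*followers.items())
            word = random.choices(candidates, weights=counts)[0]
            output.append(word)
        return " ".join(output)

    # Hypothetical usage: Benjamin's corpus was hundreds of sci-fi screenplays.
    corpus = "he was standing in the stars and he was sitting on the floor"
    print(generate(train_bigram_model(corpus), seed="he"))

Feed a model like this enough screenplays and it will echo their recurring patterns, which is exactly why Benjamin’s output sounds like science fiction without quite making sense.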

Ross Goodwin makes a great point. Maybe A.I. can serve as a collaborator or a source of inspiration in the creative process rather than taking it over completely. Maybe the cycle of generation and interpretation he mentions is the best way we can push forward and innovate in the world of cinema and elsewhere.

A.I.’s involvement in the filmmaking process doesn’t stop at pre-production, though. The capabilities of artificial intelligence may soon play more and more of a role in what we see on the screen. This clip isn’t from a low-quality version of Blade Runner. It’s not even from the original film footage. The clip you’re seeing has been generated by an A.I. that used a process called ‘deep learning’ to recognize and encode individual frames of the film. This footage comes from Terence Broad, who taught the A.I. to distinguish frames from Blade Runner from frames that weren’t, and then to reconstruct 200-pixel reductions of each frame. Having learned these skills, the A.I. eventually learned to order the frames, from start to finish, exactly as in the movie. So essentially this A.I. is creating a representation of what it ‘sees’ rather than rehashing images from the actual film. The result was close enough to the original Blade Runner footage to cause a copyright dispute between Broad and Warner Brothers, which ended in Broad’s favor.

[Clip: Terence Broad’s Blade Runner reconstruction]
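For readers curious about the mechanics, here is a heavily simplified sketch of a convolutional autoencoder, the family of models behind Broad’s project, written in Python with PyTorch. Broad’s actual system (a variational autoencoder trained with a learned similarity measure) was far more sophisticated; the network sizes, 64-pixel frame resolution, and random stand-in data below are illustrative assumptions.

    import torch
    import torch.nn as nn

    class FrameAutoencoder(nn.Module):
        """Compress a frame to a small code, then reconstruct it from that code.
        The reconstruction is the network's own representation of what it 'sees'."""
        def __init__(self, latent_dim=200):
            super().__init__()
            # Encoder: squeeze a 3x64x64 frame down to latent_dim numbers.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1),   # -> 32 x 32 x 32
                nn.ReLU(),
                nn.Conv2d(32, 64, 4, stride=2, padding=1),  # -> 64 x 16 x 16
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, latent_dim),
            )
            # Decoder: expand the code back into a full frame.
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 64 * 16 * 16),
                nn.ReLU(),
                nn.Unflatten(1, (64, 16, 16)),
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # -> 32 x 32 x 32
                nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # -> 3 x 64 x 64
                nn.Sigmoid(),
            )

        def forward(self, frame):
            return self.decoder(self.encoder(frame))

    # One training step: push the reconstruction toward the original frames.
    model = FrameAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    frames = torch.rand(8, 3, 64, 64)  # random stand-in for a batch of film frames
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(frames), frames)
    loss.backward()
    optimizer.step()

Because every frame has to pass through that tiny bottleneck code, the output can only ever be the network’s impression of the film, which is what gives Broad’s footage its dreamlike, smeared quality.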

Other similar techniques are already in use as well. This clip is a demonstration of Neural Style Transfer, in which a neural network, like the one in the previous example, ‘learns’ an image and then transfers its qualities onto another image in order to imitate the source image’s style. This technique is fairly rudimentary and involves quite a bit of human intervention, but it’s already being used in film. Kristen Stewart recently directed a film called Come Swim in which Neural Style Transfer was used in the opening and closing shots. She also co-authored a paper on the technology.
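As a rough illustration, here is a condensed sketch of the optimization loop behind classic Neural Style Transfer (in the style of Gatys et al.), again in Python with PyTorch. It assumes torchvision’s pretrained VGG-19; the layer choices and loss weights are common defaults, not the specific setup used on Come Swim.

    import torch
    import torch.nn.functional as F
    from torchvision import models

    # Pretrained VGG-19 serves as a fixed feature extractor (never trained here).
    vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()
    for p in vgg.parameters():
        p.requires_grad_(False)

    CONTENT_LAYER = 21                 # a mid-network layer: captures scene structure
    STYLE_LAYERS = [0, 5, 10, 19, 28]  # several layers: capture texture and color

    def features(image):
        """Collect activations at the layers of interest."""
        feats, x = {}, image
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i == CONTENT_LAYER or i in STYLE_LAYERS:
                feats[i] = x
        return feats

    def gram(feat):
        """Gram matrix: which feature channels fire together, i.e. 'style'."""
        b, c, h, w = feat.shape
        f = feat.view(b, c, h * w)
        return f @ f.transpose(1, 2) / (c * h * w)

    def style_transfer(content_img, style_img, steps=300):
        """Optimize a copy of the content image to take on the style image's look."""
        target = content_img.clone().requires_grad_(True)
        opt = torch.optim.Adam([target], lr=0.02)
        c_feats, s_feats = features(content_img), features(style_img)
        for _ in range(steps):
            t_feats = features(target)
            content_loss = F.mse_loss(t_feats[CONTENT_LAYER], c_feats[CONTENT_LAYER])
            style_loss = sum(F.mse_loss(gram(t_feats[i]), gram(s_feats[i]))
                             for i in STYLE_LAYERS)
            loss = content_loss + 1e5 * style_loss  # the weighting is a judgment call
            opt.zero_grad()
            loss.backward()
            opt.step()
        return target.detach()

    # Usage: both inputs are 1x3xHxW tensors, ideally normalized with ImageNet stats.
    # stylized = style_transfer(content_tensor, style_tensor)

The human intervention mentioned above lives in those judgment calls: which layers to match, how heavily to weight style against content, and how long to optimize.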

The world of film is already being affected by A.I. technologies. All of the aforementioned examples raise fascinating questions about their uses and implications. So where do we go from here?

In a world and an industry that are changing so rapidly and unpredictably, it’s hard to know what’s coming next. Recent films like Rogue One and shows like Westworld have already demonstrated the ability to de-age actors or portray ones who have passed away, and IBM trained Watson to edit a trailer for the film Morgan. The concept of A.I. playing a role in cinema and elsewhere is becoming more real and more controversial by the day. In the end the debate, at least in relation to the film industry, comes down to a few questions: Can computers be creative? Can they make art? This sounds like the plot of one of the science fiction films mentioned earlier, but it’s quickly becoming a reality, and these are questions that need to be debated and explored. Maybe we should look to one of film’s great auteurs to sum this up.


Author: Sean Maymon is a senior at Columbia College Chicago, where he is studying Cinema Arts and Sciences. This fall he is the Media Archivist Intern at Facets.
