
Filmed and painted

For the very first time, video clips restyled by computers to match the style of Henri Matisse or Pablo Picasso have been rendered without a hitch

Vincent van Gogh has recently entered the video-making arena. A real film scene shows a couple on the Freiburg marketplace near the cathedral. They both let their eyes wander. The red-brown cathedral looms against the deep-blue sky. In the background the Alte Wache can be seen, with market stands in front of it. Suddenly the colors and contours change. The cathedral glows yellow, blue, orange. The sky is decorated with colorful dabs. Did van Gogh film a cartoon? That is at least what the sequence looks like. “It runs without jerking and flickering. Very impressive,” emphasizes computer scientist Thomas Brox, professor of pattern recognition and image processing at the University of Freiburg. Until now, films would flicker after a so-called style transfer, the process by which computers overlay art styles such as those of van Gogh, Henri Matisse or Pablo Picasso onto film sequences. The 23-year-old computer science student Manuel Ruder has now eliminated the problem that plagued earlier attempts: with his process, edited films play back jitter-free.

Media frenzy and top marks

Praise has come from all kinds of places, including the magazine Wired, the US publication Technology Review and the British Daily Mail. Nearly 80,000 people have viewed the video clip by the Brox working group. The image-analysis expert is thrilled: “It is unusual for a student project to get such resonance in the media, but it is also of the highest academic quality.” For that reason, Ruder recently received an invitation to the German Conference on Pattern Recognition. “The reviewers gave his project top marks,” says Brox. Even the film industry has asked to learn more about Ruder’s process.

“Style transfer is nothing new,” the student explains. Computer scientists from Tübingen had already transferred art styles onto individual images in 2015. Brox and Ruder thought it would be neat to extend the idea to videos – and they weren’t the first to think so. Other working groups had already attempted it. But in their videos the color surfaces flicker; the visuals don’t flow, but rather hobble along. The computer had styled each picture individually, so it could not see that the pictures were parts of a series. The style transfer came out slightly differently every time, and the videos jerked. Using Ruder’s process, the animated film Ice Age flows seamlessly, softly and true to color in a cave-painting style. Darth Vader from Star Wars appears like a creation of the German expressionist Heinrich Schlief and yet still moves around.

"We always have the computer look at the past," explains Ruder. The machine builds a global image statistic, discovers relationships and understands: this is a movie! Ruder has also introduced a control that “punishes” deviations between two consecutive images. The edges of moving objects and occluded surfaces are excluded. “We give the transfer algorithm more freedom in these problematic areas.” A random element lets it improvise a bit more generously there. Maximum freedom prevails during panning, when new surfaces appear at the edge of the picture. The computer is allowed to invent them, but they have to match the rest of the image. Ruder developed a multipass algorithm so that nothing flutters.
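The penalty Ruder describes can be sketched roughly as a per-pixel temporal loss: the current stylized frame is compared with the previous stylized frame, and pixels at occlusions and motion boundaries are masked out so the algorithm keeps its freedom there. This is a simplified illustration only, not the actual implementation; the function name, array shapes, and the assumption that the previous frame has already been warped along the optical flow are all ours:

```python
import numpy as np

def temporal_consistency_loss(frame, warped_prev, weights):
    """Penalize deviations between two consecutive stylized frames.

    frame       -- current stylized frame, shape (H, W, C)
    warped_prev -- previous stylized frame, assumed pre-warped
                   along the optical flow into the current view
    weights     -- per-pixel mask in [0, 1]: 0 at occlusions and
                   at the edges of moving objects (full freedom),
                   1 where the frames should agree
    """
    diff = frame - warped_prev
    # Weighted mean squared difference; masked pixels contribute nothing.
    return np.sum(weights[..., None] * diff ** 2) / frame.size
```

With the mask set to zero over problematic regions, the "punishment" vanishes exactly where the article says the algorithm is given more freedom.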

And thus everything is in flow with Miss Marple in the style of van Gogh’s “Starry Night” or with Jungle Book à la Uffe Boesen. Nevertheless, the technology has its limitations, says Brox. Angela Merkel’s New Year’s address as a Kandinsky imitation would be possible, but Angela Duck à la Disney would not. “The more expressionistic and artistic the style is, the better style transfer works,” says the 39-year-old professor. It becomes more difficult with graphic styles such as those of Keith Haring or comic books. In addition, fine details get lost, Brox notes with regret. “Faces rarely look better after style transfer.”

One hour of computing time per minute of film

After a style transfer, porn films would be out of focus in more ways than one – a certain website operator couldn’t have known that when he inquired. Brox and Ruder were far more honored when an LA film studio used their style code for a music video made for the Cannes Lions Festival. A documentary filmmaker wanted to render an interview with an artist in her own artistic style. “Unfortunately, her face didn’t get enough of it and the background got too much,” says Ruder. The student wants to optimize the process for his master’s thesis, which is due next semester. One goal is to speed it up: the computer recomputes each image 200 to 500 times, which adds up to about an hour of computing per minute of film. That is too slow for full-length films and apps.
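The quoted figures allow a rough back-of-the-envelope estimate. The article does not state a frame rate, so the 25 frames per second below is purely our assumption:

```python
def seconds_per_frame(fps=25, compute_seconds=3600.0):
    """Rough estimate of computing time per frame.

    fps             -- assumed frame rate (not given in the article)
    compute_seconds -- ~1 hour of computing per minute of film,
                       as stated in the article
    """
    frames_per_minute = fps * 60
    return compute_seconds / frames_per_minute
```

At an assumed 25 fps, one minute of film is 1,500 frames, so an hour of computing works out to about 2.4 seconds per frame – and since each frame is recomputed 200 to 500 times, only milliseconds per pass.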

“The algorithm should recognize large movements better and handle faces in particular,” says Brox. He and Ruder wouldn’t be sad if the Photoshop developer Adobe or the computer-animation pros from Pixar gave them a call. The computer scientists are looking for ideas that would appeal to the folks in Hollywood, where comparable effects still require a ton of manual labor and budgets in the millions. Maybe it will happen quickly: Ruder needed only two months from conception to implementation of his idea. “But I was able to rely on several pre-existing components.” To connect and adapt them, however, he had to write his own source code and enter new territory – something Thomas Brox and Manuel Ruder will continue to do. They want to bring individual images to cinematic life, repeating a historic step of film history by computer: teaching pictures to move.

Jürgen Schickinger / article from uni'leben 03'2016