While 4D movies are slowly making their way into a select number of theaters equipped to show them, a trio of researchers from the University of Toronto in Ontario, Canada, is working on a way to make 4D movies more accessible.
Yuhao Zhou, Makarand Tapaswi, and Sanja Fidler are paving the way for automatic 4D cinema, a concept that would make it possible to add 4D effects into already completed movies, reports Phys.org.
“Right now, all these effects are created from the first phase of production. We’d like to automate this kind of process for movies that were not originally created for 4D cinemas,” says Zhou, who is a fourth-year undergraduate in the university’s Edward S. Rogers Sr. Department of Electrical & Computer Engineering.
The three have even written a paper on the topic, published by the University of Toronto Department of Computer Science, along with other materials that illustrate how 4D effects can be automated through an AI-based method.
Presented at the Computer Vision and Pattern Recognition conference held in Salt Lake City, Utah, on June 19-21, the paper details the creation of a database of 4D effects, such as physical interactions, water splashing, light, and shaking, put together with the help of a neural network designed by the three researchers.
If you’re curious how this would work, Zhou explains the whole thing.
“Usually with 3D movies, film-goers wear glasses and sit in a chair. With automatic 4D cinema, the neural network would process 2D and 3D movie information, feed it into the chair, and simulate the effects.”
While moving physical interactions into 4D may still be a thing of the future — at least until technology develops pressure sensors that simulate touch, notes Tapaswi — the other types of sensations could be easily translated into 4D using an AI algorithm that classifies and detects these effects.
The neural network developed by the trio uses machine learning to first extract the desired features for the effects, including movement and audio, from short clips. After that, it analyzes a combination of visual and acoustic information to predict what the 4D effects are and where they would go in a longer video clip.
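The two-stage idea described above can be sketched in miniature. The researchers' actual system is a neural network trained on their 4D-effects database; the toy pipeline below only illustrates the shape of the process: extract simple motion and audio features from short segments, then fuse them to predict an effect label for each segment of a longer video. All function names, thresholds, and effect labels here (motion_energy, audio_loudness, "shaking", etc.) are hypothetical stand-ins, not the paper's method.

```python
# Illustrative sketch only: the real system is a trained neural network.
# This toy version mimics its two stages as described in the article:
# (1) extract visual-motion and audio features from short clips,
# (2) combine them to predict a 4D effect label per video segment.
# All names and thresholds below are invented for illustration.

from typing import List, Tuple

def motion_energy(frames: List[List[float]]) -> float:
    """Mean absolute frame-to-frame pixel difference (a crude motion cue)."""
    if len(frames) < 2:
        return 0.0
    diffs = [
        sum(abs(a - b) for a, b in zip(f1, f2)) / len(f1)
        for f1, f2 in zip(frames, frames[1:])
    ]
    return sum(diffs) / len(diffs)

def audio_loudness(samples: List[float]) -> float:
    """Root-mean-square amplitude of the segment's audio track."""
    if not samples:
        return 0.0
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def predict_effect(frames: List[List[float]], samples: List[float]) -> str:
    """Fuse the visual and acoustic cues into one hypothetical 4D effect label."""
    m, a = motion_energy(frames), audio_loudness(samples)
    if m > 0.5 and a > 0.5:
        return "shaking"      # strong motion plus loud audio
    if a > 0.5:
        return "rumble"       # loud but visually static
    if m > 0.5:
        return "light-flash"  # visual change with quiet audio
    return "none"

def annotate(video: List[Tuple[list, list]]) -> List[str]:
    """Label each short (frames, audio) segment of a longer video."""
    return [predict_effect(frames, samples) for frames, samples in video]
```

For example, feeding `annotate` a calm segment (near-identical frames, quiet audio) and an action segment (large frame differences, loud audio) would yield `"none"` for the first and `"shaking"` for the second; a learned model replaces these hand-set thresholds with classifiers trained on labeled 4D-effect data.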
To illustrate what automated 4D cinema means, the team used excerpts from several famous movies, including Avatar, Now You See Me 2, Thor: The Dark World, and Iron Man 3, highlighting the different types of 4D effects that could be added to enhance the cinematic experience.
“The camera is your input,” adds Tapaswi, who is a postdoctoral fellow of computer science at the university. “But in this case, you want to experience not only what the camera sees, but also one of the characters — relive how the characters felt shaking and so on.”
The method is still being refined, and exciting new features could make their way into the algorithm as well.
“We want to have a feature where you can just flip a switch and experience what characters are feeling,” discloses Zhou.