A New Era of Video Art with AI
- Chaeyoon Lee
- Mar 28, 2024
- 4 min read
The development of technology has always affected the arts. These days, generative AI has pushed its boundaries outward and entered the realm of video art.
The two most developed video-generating AI services are Runway and Sora. Almost all AI experts agree that OpenAI's Sora is far more advanced technologically. Artists, however, say that Runway is more artist-friendly.
"Game" on." It is a short post that Tribune's CEO Cristóbal Valenzuela uploaded right after the SORA show. Tribune's CEO is Cristóbal Valenzuela, who graduated from the Tisch School of the Arts. He was fascinated by movies and video arts and also studied programming. He is an entrepreneur standing at the intersection of technology and art.
The main motivation for founding the company was a study on computational creativity. He believed a new era was coming for all artists, and he met the two friends who are now Runway's co-founders. Looking back on the company's beginnings, Valenzuela says, "Originally, I had no intention of starting a company. However, while writing my paper about AI-based creative platforms, I realized that my research was much more influential than I thought. Visual effects people, filmmakers, creators, artists, and designers responded by saying, 'This is interesting; I want to try it.'"
Because the actual "artist" created the service, the biggest strength of Runway is that it is "artist-oriented." The tool made by the artist was different. The CEO and the technology developers deeply understand the main inconvenience of the previous services, which did not contain generative AIs. They also worked hard to provide experiences that were as comfortable and delicate as the original video editing tools using AI-based software Runway. Runway accurately gave the users "the control as a creator" and designed the most intuitive UI for creators.
Not only did Runway concentrate on technology; it also thought carefully about marketing. Runway's genuine excellence lies in its precise understanding of its target. The company set artists as its primary customers and put great effort into marketing that would attract them, because it knows exactly what artists love and pay attention to. For example, Runway opened the AI Film Festival (AIFF) and cooperated with major companies such as Coca-Cola, YouTube, and NVIDIA, with the goal of drawing artists who use AI into the Runway ecosystem. In the festival's words, "The works on display give us a glimpse into a new creative era. This becomes possible when tomorrow's tools empower today's outstanding creative talent."
There is also a short film contest, Gen:48. Participants have only 48 hours, including planning and editing, to make a video of one to four minutes, and 75% of it must be made with Runway. Greenskull AI, winner of the previous year's first prize, said, "We need to use these AI tools to create new works of art. It can help and improve human work. For people like me who are creative in some ways but not in others, it makes impossible tasks possible."
Runway also publishes a stylish technology magazine, Telescope, whose goal is to offer a lens for reading the currents of this new era. Cristóbal Valenzuela said, "To be an artist, you have to be able to look to the future while still being informed by the past. Today, we stand in a new era where AI enhances creativity. That is why having your telescope has become more important than ever. So we prepared a telescope for you. I hope this helps you imagine where we are all headed."
What kind of future will video-generating AI bring us? The market still has far to go. Although AI-generated video is not yet at a commercially viable level, it already shortens the editing process significantly and improves the quality of roughly captured footage. The Oscar-winning movie Everything Everywhere All at Once used Runway technology for its CG: artists first moved the stones with sticks and ropes, then used Runway to erase those tools from the footage. A job that would typically take half a day was completed in minutes. Evan Halleck, a visual effects artist on the film, said, "It picks things out better than the human eye. Rotoscoping has been a prolonged and painful process. It was great to be able to automate this."
Many AI and video art experts expect video-generating AI to create a world where anyone can make a blockbuster movie. Even the very first step, the planning stage, normally demands a great deal of work from visual effects experts. With AI, however, the writer's or director's ideas can be drawn out in more detail and turned into a storyboard immediately. One day, an entire movie could be made with Runway. Cristóbal Valenzuela described that future: "There will be a time when anyone can make blockbuster movies that only a few people could make before. I call it Hollywood 2.0."
Going further, there could be a future in which people immerse themselves in a world simply by "streaming" videos created to suit them. This is the General World Model (GWM). A GWM is all about creating a "world": the characters in it hold conversations, gain experiences, and fall in love, and the AI simply captures their lives. The idea is to build an AI-powered world by learning from language, video, images, and audio. That is also the future Runway dreams of.