Wild AI Video Workflow with Viggle, Kaiber, Leonardo, Midjourney, Gen-2, and MORE!
TLDR: In this video, Tim shares an innovative AI filmmaking workflow spanning pre-production through short-film generation. Inspired by the 2016 film Rogue One, Tim explores AI tools to create a hybrid storyboard-animatic animation. He demonstrates the process using references from Gladiator and John Carter of Mars, and discusses how Viggle, Midjourney, and Kaiber are used to generate and refine the AI content. Despite some challenges with camera movement and character consistency, the workflow offers filmmakers a promising and efficient method for pre-production and short-film creation.
Takeaways
- 🎬 The speaker shares an AI filmmaking workflow with potential applications from pre-production to short film generation.
- 🚀 Inspiration comes from the 2016 film Rogue One, specifically an interview with editor Colin Goudie about creating a feature-length story reel without a screenplay.
- 🌟 The workflow aims to create a hybrid storyboard animatic animation using AI tools, going beyond traditional methods.
- 📏 The process starts with clipping reference footage and using Viggle's 2.0 update for initial video generation.
- 👾 Midjourney is utilized to create a model for the main character, emphasizing the importance of a full-body image for accurate results.
- 🎥 Viggle's limitations are noted, particularly with camera movement, which affects the smoothness of the final output.
- 🧠 Leonardo is introduced to improve on Viggle's output, using image-to-image references with a lowered image strength.
- 🔄 Kaiber's Motion 3.0 feature is highlighted for refining the character animation and providing a unique AI video generation experience.
- 🌁 Backgrounds are enhanced with Gen-2 and Kaiber to match the character's style and create a cohesive visual effect.
- 🎞 Video editing software like Premiere is used to composite the character and background, with tips on chroma keying and character enhancement.
- 🎶 Audio elements, such as crowd chanting, are generated with AI tools like AudioGen, and the music is composed in Ableton for a complete cinematic experience.
Q & A
What is the main topic of the video?
-The main topic of the video is an AI filmmaking workflow that covers stages from pre-production to generating short films, using a combination of different AI tools.
What film inspired the creation of this workflow?
-The 2016 film Rogue One, directed by Gareth Edwards, inspired the workflow due to its historical significance as the first major film to feature a fully deep-faked character.
What was unique about the production of Rogue One?
-A unique aspect of Rogue One's production was the use of a feature-length story reel created before the script was finished, using clips from hundreds of movies to plan the dialogue and timing of the film.
How does the video creator plan to apply the Rogue One story breakdown technique using AI tools?
-The video creator plans to apply the Rogue One story breakdown technique by creating a hybrid storyboard animatic animation using current AI tools, taking the idea a step further to generate or augment content.
What AI tools were used in the creation of the example scene?
-The AI tools used in the example scene include Viggle, Midjourney, Leonardo, and Kaiber, each handling a different aspect such as character generation, background creation, and video editing.
What challenges were encountered when using Viggle for the AI-generated video?
-Challenges encountered when using Viggle included issues with camera movement, stuttery footage, and maintaining consistency in the character's appearance.
How was the character for the AI-generated video designed?
-The character was designed using Midjourney with a 16:9 format, ensuring full body shots from head to feet, and prompts that specified the desired appearance and attire.
What was the process for creating the background of the AI-generated scene?
-The background was created by taking a screenshot of an initial frame from the reference video, using Gen-2 to add movement, and then running it through Kaiber for stylization to match the character's background.
How were the character and background combined in the final video?
-The character and background were combined in a video editor, with the character on the top layer and the background beneath it, using a chroma keyer to integrate them seamlessly.
What tools were used to add sound and music to the AI-generated video?
-AudioGen was used to generate crowd chanting for the arena scene, and Ableton was used to create a quick 20-second cue for the soundtrack.
What was the outcome of using AI tools for the entire filmmaking process?
-The outcome was a short film that, while not perfect, demonstrated the potential of AI tools in the filmmaking process, particularly for pre-production and short films.
Outlines
🎬 AI Filmmaking Workflow Introduction
The speaker introduces an AI-based filmmaking workflow that has potential throughout the entire production process, from pre-production to generating short films. The workflow is a combination of various tools and techniques, inspired by the 2016 film Rogue One. The speaker aims to share their learnings and experiences, hoping to save time for others interested in trying out this workflow. The discussion begins with the inspiration behind the workflow and the speaker's personal spin on it.
🌟 Utilizing AI for Hybrid Storyboarding and Animation
The speaker discusses the process of using AI to create a hybrid storyboard-animatic animation. They reference the famous scene from Gladiator and how it was recreated with AI, incorporating elements from different movies and Warhammer 40K. The workflow involves using reference footage, AI tools like Viggle for initial video generation, and refining the output with additional AI models. The speaker shares their experiences with Viggle's 2.0 update, its capabilities, and its limitations, such as difficulties with camera movement and fine-tuning the character's appearance.
🛠️ Enhancing AI Generated Content with Additional Tools
The speaker explains how they enhanced the AI-generated content with additional tools like Kaiber and Leonardo. They detail the process of refining the character animation using image references and prompts, and how they overcame shaky footage with Kaiber's Motion 3.0 feature. The speaker also discusses creating a dynamic background with Gen-2 and incorporating it into the final video, emphasizing the importance of a cohesive look and the use of video editing tools for compositing and additional adjustments.
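The compositing described above hinges on chroma keying: pixels close to a key color are treated as transparent so the background layer shows through. The video does this with Premiere's built-in keyer, but the core idea can be sketched in a few lines of NumPy (the function name and tolerance value below are illustrative, not from the video):

```python
import numpy as np

def chroma_key_composite(fg, bg, key=(0, 255, 0), tol=60):
    """Composite a foreground frame over a background by making
    pixels near the key color (pure green by default) transparent.

    fg, bg: HxWx3 uint8 arrays of the same shape.
    tol: per-channel distance below which a pixel counts as key color.
    """
    diff = fg.astype(np.int16) - np.array(key, dtype=np.int16)
    dist = np.abs(diff).max(axis=-1)   # Chebyshev distance to the key color
    mask = dist < tol                  # True where the pixel is "green screen"
    out = fg.copy()
    out[mask] = bg[mask]               # let the background show through
    return out

# Tiny synthetic example: top row is green screen, bottom row is "character".
fg = np.array([[[0, 255, 0], [0, 255, 0]],
               [[200, 50, 50], [200, 50, 50]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 30, dtype=np.uint8)
result = chroma_key_composite(fg, bg)
```

A real keyer like Premiere's Ultra Key adds soft-edge matting and spill suppression; this hard binary mask is only the simplest version of the idea.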
🎭 Post-Production and Audio Integration for AI Films
In the final paragraph, the speaker covers the post-production process, focusing on audio integration. They describe using a free site, AudioGen, for crowd chanting and another source, Typecast, for dialogue generation. The speaker also discusses their experience with ElevenLabs and finding an alternative model that worked better for their needs. They share their approach to creating a soundtrack using Ableton and loops, concluding that while the method may not suit full feature films, it is useful for short films and pre-production work.
Keywords
💡AI filmmaking
💡Kitbash
💡Rogue One
💡Storyboard animatic animation
💡Viggle AI
💡Midjourney
💡Kaiber AI
💡Chroma key
💡Leonardo AI
💡Audio generation
Highlights
An AI filmmaking workflow is presented, covering pre-production through generating short films.
The inspiration comes from the 2016 film Rogue One, the first major film to feature a fully deep-faked character.
Editor Colin Goudie's 2017 interview discussed creating a feature-length story reel before the script was finished, using clips from hundreds of movies.
The idea is to create a hybrid storyboard animatic animation using AI tools.
The 'Are you not entertained' scene from Gladiator is used as an example, with elements from John Carter of Mars and Warhammer 40K.
The process involves clipping reference footage, using Viggle 2.0 for the initial AI video generation, and refining the character design with Midjourney.
Viggle 2.0 is noted for its challenges with camera movement, leading to stuttery results.
Leonardo is used to enhance character images with specific prompts and image-to-image references.
Kaiber is introduced as a unique AI video generator, particularly with its new Motion 3.0 feature.
The output from Viggle is further processed in Kaiber for additional stylization and consistency.
Backgrounds are given movement and life with Gen-2, then integrated with Kaiber for a cohesive look.
Video editing is done in Premiere, with chroma key removal and color correction to blend character and background.
The method's effectiveness for full feature films is debatable, but it shows promise for short films and pre-production.
The workflow is considered more useful and productive than simply compiling existing films for pre-production.
The presenter, Tim, is planning more workflow videos for those interested in this AI filmmaking technique.