Wild AI Video Workflow with Viggle, Kaiber, Leonardo, Midjourney, Gen-2, and MORE!

Theoretically Media
2 Apr 2024 · 11:58

TLDR: In this video, Tim shares an AI filmmaking workflow that spans pre-production through generating short films. Inspired by the making of the 2016 film Rogue One, he explores using AI tools to create a hybrid storyboard animatic animation. He demonstrates the process with references from Gladiator and John Carter of Mars, and walks through using Viggle, Midjourney, and Kaiber to generate and refine the AI content. Despite some challenges with camera movement and character consistency, the workflow offers filmmakers a unique and efficient approach to pre-production and short film creation.

Takeaways

  • 🎬 The speaker shares an AI filmmaking workflow with potential applications from pre-production to short film generation.
  • 🚀 Inspiration comes from the 2016 film Rogue One, specifically an interview with editor Colin Goudie about creating a feature-length story reel without a screenplay.
  • 🌟 The workflow aims to create a hybrid storyboard animatic animation using AI tools, going beyond traditional methods.
  • 📏 The process starts with clipping reference footage and using Viggle's 2.0 update for initial video generation.
  • 👾 Midjourney is utilized to create a model for the main character, emphasizing the importance of a full-body image for accurate results.
  • 🎥 Viggle's limitations are noted, particularly with camera movement, which affects the smoothness of the final output.
  • 🧠 Leonardo is introduced to improve on Viggle's output, using image-to-image references with lowered image strength.
  • 🔄 Kaiber's Motion 3.0 feature is highlighted for refining the character animation and providing a unique AI video generation experience.
  • 🌁 Backgrounds are enhanced with Gen-2 and Kaiber to match the character's style and create a cohesive visual effect.
  • 🎞 Video editing software like Premiere is used for compositing character and background, with tips on chroma keying and character enhancement.
  • 🎶 Audio elements, such as crowd chanting, are generated with AI tools like AudioGen, and music is composed in Ableton for a complete cinematic experience.

Q & A

  • What is the main topic of the video?

    -The main topic of the video is an AI filmmaking workflow that covers various stages from pre-production to generating short films, using a combination of different AI tools.

  • What film inspired the creation of this workflow?

    -The 2016 film Rogue One, directed by Gareth Edwards, inspired the creation of this workflow due to its historical significance as the first major film with a fully deepfaked character.

  • What was unique about the production of Rogue One?

    -A unique aspect of Rogue One's production was the use of a feature-length story reel created before the script was finished, using clips from hundreds of movies to plan the dialogue and timing of the film.

  • How does the video creator plan to apply the Rogue One story breakdown technique using AI tools?

    -The video creator plans to apply the Rogue One story breakdown technique by creating a hybrid storyboard animatic animation using current AI tools, taking the idea a step further to generate or augment content.

  • What AI tools were used in the creation of the example scene?

    -The AI tools used in the creation of the example scene include Viggle, Midjourney, Leonardo, and Kaiber, each utilized for different aspects such as character generation, background creation, and video editing.

  • What challenges were encountered when using Viggle for the AI-generated video?

    -Challenges encountered when using Viggle included issues with camera movement, stuttery footage, and maintaining consistency in the character's appearance.

  • How was the character for the AI-generated video designed?

    -The character was designed using Midjourney with a 16:9 format, ensuring full body shots from head to feet, and prompts that specified the desired appearance and attire.

  • What was the process for creating the background of the AI-generated scene?

    -The background was created by taking a screenshot of an initial frame from the reference video, using Gen-2 to add movement, and then running it through Kaiber for stylization so it matches the character's look.

  • How were the character and background combined in the final video?

    -The character and background were combined in a video editor, with the character on the top layer and the background beneath it, using a chroma key remover to integrate them seamlessly.

  • What tools were used to add sound and music to the AI-generated video?

    -AudioGen was used to generate crowd chanting for the arena scene, and Ableton was used to create a quick 20-second cue for the soundtrack.

  • What was the outcome of using AI tools for the entire film-making process?

    -The outcome was a short film that, while not perfect, demonstrated the potential of using AI tools in the film-making process, particularly for pre-production and creating short films.

Outlines

00:00

🎬 AI Filmmaking Workflow Introduction

The speaker introduces an AI-based filmmaking workflow that has potential throughout the entire production process, from pre-production to generating short films. The workflow is a combination of various tools and techniques, inspired by the 2016 film Rogue One. The speaker aims to share their learnings and experiences, hoping to save time for others interested in trying out this workflow. The discussion begins with the inspiration behind the workflow and the speaker's personal spin on it.

05:00

🌟 Utilizing AI for Hybrid Storyboarding and Animation

The speaker discusses the process of using AI to create a hybrid storyboard animatic animation. They reference the famous scene from Gladiator and how it was recreated with AI, incorporating elements from different movies and Warhammer 40K. The workflow involves using reference footage, AI tools like Viggle for initial video generation, and refining the output with additional AI models. The speaker shares their experiences with Viggle's 2.0 update, its capabilities, and its limitations, such as difficulties with camera movement and fine-tuning the character's appearance.

10:00

🛠️ Enhancing AI Generated Content with Additional Tools

The speaker explains how they enhanced the AI-generated content using additional tools like Kaiber and Leonardo. They detail refining the character animation with image references and prompts, and how they overcame shaky footage by using Kaiber's Motion 3.0 feature. The speaker also discusses creating a dynamic background with Gen-2 and incorporating it into the final video, emphasizing the importance of a cohesive look and the use of video editing tools for compositing and additional adjustments.

🎭 Post-Production and Audio Integration for AI Films

In the final paragraph, the speaker talks about the post-production process, focusing on audio integration. They describe using a free site, AudioGen, for crowd chanting and another source, Typecast, for dialogue generation. The speaker also discusses their experience with ElevenLabs and finding an alternative model that worked better for their needs. They share their approach to creating a soundtrack using Ableton and loops, concluding that while the method may not be perfect for full feature films, it is useful for short films and pre-production work.

Keywords

💡AI filmmaking

AI filmmaking refers to the process of creating films or videos using artificial intelligence technologies at various stages, such as scripting, editing, and visual effects. In the video's context, the creator shares an innovative workflow that integrates AI from pre-production to generating short films, showcasing how AI can augment or fully generate visual content. This approach is highlighted as promising, with examples from various collaborators demonstrating its potential in enhancing creativity and efficiency in filmmaking.

💡Kitbash

Kitbash, in filmmaking and model building, involves taking parts from multiple sources and combining them to create something new. The video describes the AI filmmaking workflow as a bit of a 'kitbash,' implying that various AI tools and techniques are combined in novel ways to produce short films. This method allows for experimentation and creativity, pulling from an assortment of AI technologies to achieve desired outcomes.

💡Rogue One

Referenced as an inspiration for the discussed workflow, 'Rogue One' is a 2016 film by Gareth Edwards known for its innovative use of CGI and deepfake technology to recreate characters. The video highlights the film's historical significance in pioneering these techniques, which serves as a jumping-off point for exploring how current AI tools could similarly revolutionize content creation, especially in repurposing existing materials for new productions.

💡Storyboard animatic animation

This term combines elements of storyboarding and animatics with animation, suggesting a hybrid approach to visual storytelling that utilizes preliminary visuals (often rough and schematic) to plan out scenes and actions. The creator aims to take AI tools a step further by creating these hybrid visuals, enhancing the traditional storyboard and animatic process with AI-generated animation to better visualize the final film.

💡Viggle AI

Viggle AI is the tool used to generate the initial character animation from the clipped reference footage. With its 2.0 update, it represents one of the AI technologies that enable creators to manipulate and generate video content, showcasing how such tools can facilitate the creation of AI-augmented or wholly AI-generated film segments. The tool's ability to integrate dance moves into the film is highlighted as an example of its versatility.

💡Midjourney

Midjourney is used in the workflow to create a model for the main character. It exemplifies an AI image generator capable of producing detailed and contextually relevant visuals from textual prompts. The video illustrates how Midjourney can contribute significantly to the pre-visualization phase of filmmaking by generating characters and scenes that match the filmmaker's vision, emphasizing the importance of detailed prompts.

💡Kaiber AI

Kaiber AI is noted for its unique video generation capabilities, particularly with the introduction of its Motion 3.0 feature. In the workflow, Kaiber is used to refine and stabilize AI-generated footage, showcasing its role in enhancing the quality and consistency of AI-generated visual content. Its ability to add motion and emotion to characters illustrates the advanced level of AI's involvement in the creative process.

💡Chroma key

Chroma keying is a post-production technique used to composite two images or video streams together based on color hues (chroma range). The workflow involves using a green screen background for easy removal and layering of the AI-generated character over another background. This highlights the importance of chroma keying in integrating AI-generated elements seamlessly into live-action or other backgrounds, crucial for the final compositing stage.
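
For readers who would rather script this step than use Premiere's keyer, here is a minimal sketch of green-screen keying and compositing with OpenCV and NumPy. The filenames and the HSV bounds for "green" are assumptions to tune per shot; this illustrates the technique, not the exact effect used in the video.

```python
import cv2
import numpy as np

# Hypothetical inputs: one frame of the character rendered on a green
# background and a stylized background plate at the same resolution.
char = cv2.imread("character_frame.png")       # BGR green-screen frame
plate = cv2.imread("stylized_background.png")  # BGR background plate

# Mark "green enough" pixels in HSV space; the bounds are rough guesses
# and should be tuned to the actual shade of green in the footage.
hsv = cv2.cvtColor(char, cv2.COLOR_BGR2HSV)
green = cv2.inRange(hsv, np.array([40, 80, 80]), np.array([85, 255, 255]))

# Invert and soften the matte so edges blend instead of cutting hard.
matte = cv2.GaussianBlur(255 - green, (5, 5), 0) / 255.0
matte = matte[..., None]  # (H, W, 1) so it broadcasts across color channels

# Composite: character where the matte is opaque, background elsewhere.
comp = (char * matte + plate * (1.0 - matte)).astype(np.uint8)
cv2.imwrite("composite_frame.png", comp)
```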

💡Leonardo AI

Mentioned as part of the solution for addressing issues with generated content, Leonardo AI is used for image-to-image translation, helping to refine and correct visual elements. This tool exemplifies the use of AI for fine-tuning and adjusting generated imagery to better fit the narrative or visual aesthetic of the film, showcasing AI's role in the iterative creative process.
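
Leonardo's web UI exposes this as an init image plus a strength slider, so no code is involved in the video. As a rough open-source analogue (an assumption, not Leonardo's actual API), the same idea looks like this with Hugging Face diffusers, where the `strength` argument controls how much of the input frame survives; model, filenames, and prompt are illustrative placeholders.

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Open-source stand-in for Leonardo's image-to-image mode.
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Hypothetical input: a rough frame exported from the Viggle pass.
init = Image.open("viggle_frame.png").convert("RGB").resize((768, 432))

# Low strength (~0.3-0.4) keeps the pose and framing of the input frame
# while the prompt cleans up armor, face, and lighting details.
result = pipe(
    prompt="armored gladiator in a sunlit stone arena, cinematic lighting",
    image=init,
    strength=0.35,
    guidance_scale=7.5,
).images[0]
result.save("refined_frame.png")
```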

💡Audio generation

The process of creating sound effects, dialogue, and music using AI technologies. In the workflow, audio generation tools like AudioGen and ElevenLabs are utilized to produce crowd chanting and dialogue, demonstrating AI's capacity not only to generate visual content but also to enhance or create a film's auditory elements. This highlights the comprehensive potential of AI in filmmaking, covering both the visual and auditory aspects of production.
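
The video itself uses a free AudioGen site and voice-generation front ends, so no code is required. For anyone who prefers a scripted route, a minimal sketch with Meta's open-source AudioCraft library (an assumed equivalent, with an illustrative prompt) could look like this:

```python
from audiocraft.models import AudioGen
from audiocraft.data.audio import audio_write

# Assumed programmatic equivalent of the free AudioGen site used in the video.
model = AudioGen.get_pretrained("facebook/audiogen-medium")
model.set_generation_params(duration=10)  # seconds of audio per prompt

# One clip is generated per description in the batch.
wav = model.generate(["large crowd chanting and cheering in a stone arena"])

for idx, clip in enumerate(wav):
    # Writes chant_0.wav (and so on) with loudness normalization.
    audio_write(f"chant_{idx}", clip.cpu(), model.sample_rate, strategy="loudness")
```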

Highlights

An AI filmmaking workflow is presented, covering pre-production through generating short films.

The inspiration comes from the 2016 film Rogue One, described as the first major film to feature a fully deepfaked character.

Editor Colin Goudie's 2017 interview discussed creating a feature-length story reel before the script was finished, using clips from hundreds of movies.

The idea is to create a hybrid storyboard animatic animation using AI tools.

The 'Are you not entertained' scene from Gladiator is used as an example, with elements from John Carter of Mars and Warhammer 40K.

The process involves clipping reference footage, creating a character model in Midjourney, and using Viggle 2.0 for the initial AI video generation.
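
Clipping the reference footage happens in an ordinary editor in the video; if you would rather script it, a minimal MoviePy (1.x API) sketch with placeholder filename and timestamps:

```python
from moviepy.editor import VideoFileClip

# Placeholder filename and timestamps: trim the reference scene down to the
# single beat you want to restage before handing it to Viggle.
clip = VideoFileClip("gladiator_reference.mp4").subclip(62, 70)
clip.write_videofile("reference_clip.mp4", codec="libx264", audio=False)
```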

Viggle 2.0 is noted for its challenges with camera movement, which lead to stuttery results.

Leonardo is used to enhance character images with specific prompts and image-to-image references.

Kaiber is introduced as a unique AI video generator, particularly with its new Motion 3.0 feature.

The output from Viggle is further processed in Kaiber for additional stylization and consistency.

Backgrounds are created with movement and life using Gen-2 and then integrated with Kaiber for a cohesive look.
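
The background pass starts from a still of the reference clip (the video simply takes a screenshot); here is a tiny OpenCV sketch of that same step, with placeholder filenames:

```python
import cv2

# Grab the first frame of the reference clip as a still image to feed
# into Gen-2 for motion; filenames are placeholders.
cap = cv2.VideoCapture("reference_clip.mp4")
ok, frame = cap.read()
if ok:
    cv2.imwrite("background_still.png", frame)
cap.release()
```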

Video editing is done in Premiere, with chroma key removal and color correction to blend character and background.

The method's effectiveness for full feature films is debatable, but it shows promise for short films and pre-production.

The workflow is considered more useful and productive than simply compiling existing films for pre-production.

The presenter, Tim, is planning more workflow videos for those interested in this AI filmmaking technique.