Civitai AI Video & Animation // Motion Brush Img2Vid Workflow! w/ Tyler

Civitai
18 Apr 2024 · 64:46

TLDR: In this engaging live stream, Tyler from Civitai AI Video & Animation shares a workflow that uses a motion brush in ComfyUI with AnimateDiff to animate specific parts of images. The session includes a step-by-step guide to bringing images to life with this technique, from initial image selection to the final animation output. Tyler emphasizes the importance of choosing the right motion layers and provides tips for fine-tuning the animation to achieve the desired effects. He also notes the workflow's low VRAM usage, which makes it accessible to users with lower-end hardware. The stream is interactive, with audience members submitting images to be animated, and Tyler showcases the process of painting the key parts for animation. The workflow, originally created by VK, is credited and shared with permission, highlighting the collaborative spirit of the AI art community. Tyler also mentions upcoming guest creator streams, including one with Noah Miller, covering the evolution of AI in animation and the challenges creators face in the field.

Takeaways

  • 🎨 Tyler introduces a workflow for animating images using a motion brush in ComfyUI with AnimateDiff, allowing specific parts of images to come to life.
  • 🖼️ The starting image for the animation is shown, with the final animated version demonstrating the dripping effect achieved through the workflow.
  • 📉 Tyler mentions that they will work only at low resolution for speed, but upscaling can clean up artifacts and blurriness.
  • 🌟 A successful output is shared where clouds, hair, and reflective parts of an outfit were animated to give the impression of wind blowing.
  • 🔍 The workflow can be finicky, requiring iteration and finding the right motion layers for good results.
  • 🎓 The workflow was created by VK, who has given permission for Tyler to share it with the audience.
  • 👾 The workflow is low VRAM friendly, making it accessible for users with lower-end graphics cards.
  • 📱 Tyler discusses the use of different checkpoints for anime-based animations and the importance of selecting the right one.
  • 🎥 The animation frame count is set to 60 frames, providing a good duration for the generated animations.
  • 🖥️ Tyler shares his personal experience with different models and why he has moved away from the Hello 2D model.
  • 📸 Audience members are encouraged to submit their images for animation, and several images are selected and animated during the stream.
  • 🌟 The final outputs of the animations are showcased, demonstrating the workflow's potential for creating dynamic and engaging content.

Q & A

  • What is the main topic of the video?

    -The main topic of the video is demonstrating a workflow that uses a motion brush in ComfyUI with AnimateDiff to animate specific parts of images.

  • Who is the host of the stream?

    -The host of the stream is Tyler.

  • What is the purpose of using the motion brush in the workflow?

    -The purpose of using the motion brush is to bring specific parts of images to life, creating an animated effect.

  • What is the significance of the ControlNet in the workflow?

    -The ControlNet plays a crucial role in smoothing out animations and tempering saturation, which helps achieve more natural, less fragmented motion effects.

  • What is the role of the motion LoRA in the workflow?

    -The motion LoRA controls the type of motion applied to the animated parts of the image, shaping how the movement looks and feels in the final output.

  • Why is the 'VK motion brush' workflow considered low VRAM friendly?

    -The 'VK motion brush' workflow is considered low VRAM friendly because it allows for the creation of animations with relatively low graphics processing unit (GPU) memory usage, making it suitable for users with lower-end hardware.

  • What is the importance of the 'frame rate' setting in the workflow?

    -The 'frame rate' setting determines how many frames per second the animation will have. It affects the smoothness and speed of the animation, with higher frame rates generally resulting in smoother motion.

  • What is the purpose of the 'Interpolation' node in the workflow?

    -The 'Interpolation' node is used to smooth out the transitions between frames, creating a more fluid and less choppy animation.

  • What does the 'grow mask with blur' node do in the workflow?

    -The 'grow mask with blur' node expands the mask beyond where it was painted and blurs it, creating a more natural fall-off in the motion so the animation looks less sharp and fragmented.

  • Why is it recommended to use a high-quality image in the workflow?

    -High-quality images result in better output because they contain more detail, which allows the motion brush to create more accurate and smooth animations. Low-resolution images can lead to distortion and artifacts in the final animation.

  • What is the significance of the 'upscaling' process mentioned in the video?

    -Upscaling is a process that increases the resolution of an image or animation. It is significant because it can help clean up artifacts and blurriness from the output, resulting in a clearer and more polished final product.

  • How does the choice of the 'checkpoint' affect the animation?

    -The choice of the 'checkpoint' in the workflow determines the style of the animation. Different checkpoints, like 'Boton LCM' and 'Every Journey LCM', are optimized for different types of animations and can significantly affect the final look and feel of the motion.

Outlines

00:00

🎨 Introduction to the AI Video and Animation Stream

Tyler, the host, welcomes viewers to the AI video and animation stream. He expresses excitement about sharing a new workflow involving a motion brush in ComfyUI with AnimateDiff to animate parts of images. He invites viewers from Discord to submit images for animation, mentions a previous stream with Spencer, and shows an example of an animated dripping head image.

05:02

📈 Workflow Details and Community Contributions

Tyler explains the workflow, emphasizing the importance of having the correct CLIP Vision and IP Adapter models. He discusses the use of the IP Adapter Advanced node, the LoRA loader, and ControlNet for smoother animations. He also provides a link to download the ControlNet and mentions using two different checkpoints for anime-style animations. The frame count and EMA settings are covered, along with the process of adding images to the workflow.

10:03

🖼️ Image Selection and Anime Diffusion Preferences

The host talks about his reasons for switching from the popular Hello 2D model to keep content fresh. He demonstrates how to use the mask editor to select parts of an image for animation, showing how to paint the eyes, eyelids, and other features for more pronounced animation effects. He also mentions the importance of choosing the right motion layers and shares his plans to upload the workflow after getting permission.

15:04

🎭 Adjusting Animation and Masking Techniques

Tyler demonstrates how to adjust the mask for animation using nodes like 'grow mask with blur' to create a smooth fall-off effect. He also discusses the possibility of inverting the mask for reverse animation and tweaking values to control brightness in masked and unmasked areas. The paragraph ends with a preview of the animated character's eyes, which received positive feedback from the audience.
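The 'grow mask with blur' step can be illustrated outside ComfyUI. The sketch below is a minimal numpy approximation of what such a node does (the function name `grow_mask_with_blur` and its `expand`/`blur` parameters are my own, not the node's actual interface): dilate the painted mask outward, then soften its edge so motion fades out gradually. Subtracting the result from 1 gives the inverted mask Tyler mentions for reverse animation.

```python
import numpy as np

def grow_mask_with_blur(mask, expand=2, blur=1):
    """Rough stand-in for ComfyUI's 'grow mask with blur' node:
    expand the painted mask, then blur its edge for a soft fall-off.
    mask: 2-D float array in [0, 1]."""
    grown = mask.copy()
    # Naive dilation: element-wise max over a (2*expand+1)^2 window,
    # done by shifting the mask and accumulating the maximum.
    # (np.roll wraps at the border, which is fine for interior masks.)
    for dy in range(-expand, expand + 1):
        for dx in range(-expand, expand + 1):
            shifted = np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
            grown = np.maximum(grown, shifted)
    # Cheap blur: repeated 3x3 box filter over an edge-padded array.
    for _ in range(blur):
        p = np.pad(grown, 1, mode="edge")
        grown = sum(
            p[1 + dy : p.shape[0] - 1 + dy, 1 + dx : p.shape[1] - 1 + dx]
            for dy in (-1, 0, 1)
            for dx in (-1, 0, 1)
        ) / 9.0
    return np.clip(grown, 0.0, 1.0)

# Inverting the mask animates everything *except* the painted area:
# inverted = 1.0 - grow_mask_with_blur(mask)
```

The soft gradient at the mask edge is what prevents the sharp, fragmented motion boundaries Tyler warns about.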

20:05

🤖 Low VRAM Workflow Efficiency

The host talks about the efficiency of the workflow, which is friendly for users with low VRAM. He shows the VRAM usage while generating 60 frames with interpolation and compares it to the usage without interpolation. Tyler also shares his Civitai profile for reference and moves on to the next image, demonstrating the process of painting and animating different elements like fire, smoke, and hair.

25:07

🎨 Painting and Animating Diverse Images

Tyler discusses the process of painting various elements in the images for animation, such as the hand with an eye and other anime-style features. He experiments with different motion layers and checkpoints to achieve the desired animation effects. The audience is shown examples of the animations, including a blinking character and dripping effects, which provoke laughter and excitement.

30:08

🌊 Experimenting with Motion and Animation

The host shares his excitement about experimenting with different motion layers and prompts to drive the animation. He emphasizes the importance of using high-quality images to avoid distortion. Tyler also discusses the contribution of his friend Palm, who trained a motion layer used in the stream, and mentions Palm's upcoming appearance to showcase a special project.

35:08

📚 Finalizing the Workflow and Upcoming Streams

Tyler wraps up the stream by showing how to finalize and export the workflow. He mentions his equipment preferences, specifically the XP-Pen Pro 24 for pen input. The host also previews the next day's guest, Noah Miller, and outlines the topics they will discuss, including the evolution of AI in animation and their experiences as AI video creators. He thanks the viewers for their participation and teases the next month's lineup of guest creators.

40:09

🌟 Showcasing the Workflow and Community Engagement

The final paragraph focuses on making the workflow available to the viewers, with Tyler providing a link to download it. He encourages the community to use the workflow to create content and share their work on Instagram with a specific hashtag. Tyler also mentions an Instagram post featuring a song made in Udio and invites the community to engage with it. He concludes the stream with a reminder of the next day's schedule and expresses gratitude to the viewers and the workflow creator, VK.

Keywords

💡Motion Brush

Motion Brush is a tool used within the video to animate specific parts of images. It is central to the workflow shared by Tyler, allowing users to bring selected areas of their images to life with movement. In the script, Tyler demonstrates how to use the Motion Brush with different 'motion layers' to create various animations, such as dripping effects or blinking eyes.

💡AnimateDiff

AnimateDiff is a motion module for Stable Diffusion that turns still-image generation into animation by injecting learned motion into the diffusion process. In the video, Tyler pairs AnimateDiff with the Motion Brush so that only the painted regions of the image receive that motion, and he swaps in different motion layers to change the character of the movement.

💡ComfyUI

ComfyUI is a node-based graphical interface for building Stable Diffusion pipelines. Tyler runs the Motion Brush workflow inside ComfyUI, wiring together nodes such as the checkpoint loader, IP Adapter, and ControlNet, which makes each stage of the animation process easy to inspect and modify.

💡IP Adapter

The IP Adapter (image prompt adapter) conditions the generation on a reference image, so the animation keeps the look of the source picture. Tyler specifies using the 'IP Adapter Advanced' node and the 'IP Adapter Plus 1.5' model in the workflow.

💡ControlNet

ControlNet is a tool that helps in smoothing out animations and managing saturation. Tyler mentions using the 'ControlNet animate diff' within the workflow to enhance the quality of the animations. It is set to a strength of 0.5, indicating that it plays a significant role in refining the animated output.

💡Checkpoints

Checkpoints in the video refer to specific models or stages in the animation process that are used to achieve different effects. Tyler discusses using 'Boton LCM' and 'Every Journey LCM' checkpoints for their respective benefits in animating anime-style images and maintaining a cartoonish look.

💡VRAM

VRAM, or Video RAM, is the memory used by graphics cards to store image data for rendering. Tyler talks about the workflow being 'low VRAM friendly,' meaning it doesn't require a lot of graphics memory, making it accessible for users with lower-end graphics cards.

💡Interpolation

Interpolation is a method used to create smooth transitions between frames in an animation. Tyler mentions using a frame rate of 15, which is then doubled through interpolation to achieve 30 frames per second, resulting in a smoother animation.
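As a sketch of the idea (not the actual interpolation node, which typically uses a motion-estimation model rather than a plain cross-fade), doubling 15 fps footage toward 30 fps can be approximated by inserting a blended midpoint frame between each consecutive pair; the function name and 50/50 blending scheme here are illustrative assumptions:

```python
import numpy as np

def interpolate_midpoints(frames):
    """Double the frame count by inserting a cross-faded midpoint
    between each consecutive pair of frames. Real interpolation
    nodes estimate motion; this only shows the frame-count math."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2.0)  # naive 50/50 blend of neighbours
    out.append(frames[-1])
    return out

# 60 frames rendered for 15 fps playback last 4 seconds; after one
# pass the clip has 119 frames, played back at 30 fps for roughly
# the same duration but with smoother transitions.
```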

💡Mask Editor

The Mask Editor is a tool used to selectively edit parts of an image. In the video, Tyler uses the Mask Editor to paint over areas of the image that should be animated, such as the eyes or hair, allowing for precise control over which parts of the image receive the animation effect.
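Conceptually, the mask editor just records which pixels the brush touched. A hypothetical rasteriser (`paint_mask` and its parameters are invented for illustration, not ComfyUI's API) might look like:

```python
import numpy as np

def paint_mask(shape, strokes, radius=3):
    """Rasterise brush strokes into a binary mask, the way a mask
    editor marks which pixels should receive motion.
    strokes: iterable of (row, col) brush-tip positions."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape, dtype=float)
    for r, c in strokes:
        # Stamp a filled circle of the brush radius at each position.
        mask[(yy - r) ** 2 + (xx - c) ** 2 <= radius ** 2] = 1.0
    return mask
```

In Tyler's workflow the painted mask is then expanded and blurred before it drives the motion, so hard brush edges never reach the sampler directly.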

💡Upscaling

Upscaling is the process of increasing the resolution of an image or video. Tyler discusses using an upscaler in the workflow to improve the quality of the animated output by reducing artifacts and blurriness.

💡VK

VK is the username of the creator of the workflow that Tyler is sharing. VK is credited with creating the workflow and giving Tyler permission to share it with the audience. VK's work is influential in the community, and Tyler encourages viewers to follow VK on Instagram for more content.

Highlights

Tyler shares a new workflow for animating images using a motion brush in ComfyUI with AnimateDiff.

The process can bring specific parts of images to life, such as making them appear as if blowing in the wind.

The workflow is low VRAM friendly, making it accessible for users with lower-end graphics cards.

Tyler received permission from VK, the creator of the workflow, to share it with the community.

The starting image is transformed into a dripping effect using the workflow.

Upscaling the output can clean up artifacts and blurriness from the generated animations.

A ControlNet called 'control GIF AnimateDiff' is crucial for smoothing out the animations.

Two checkpoints, Boton LCM and Every Journey LCM, are used for anime-based animations.

The motion brush allows users to select which parts of the image to animate by painting over them.

Different motion layers can be used to achieve various animation effects.

The 'grow mask with blur' node helps create a smooth fall-off in the motion areas.

The entire workflow generates 60 frames while only using approximately 9 GB of VRAM.

The workflow is demonstrated on various images, including a character with a dripping head, a girl in front of a flaming house, and a monster.

The 'wave pulse' motion layer, trained by Palm, is used to animate a surfing image.

The XP-Pen Pro 24 is recommended for users who prefer pen input for their graphics work.

The workflow will be uploaded to Civitai for community use after the stream.

Tyler encourages viewers to experiment with different parts of the workflow to create unique animations.