

Sort of a disclaimer: don't dive headfirst into a nightly build if you're planning to use it for your current project that's already past its deadline - you'll have a bad day. This is not production-ready, user-friendly software :D

ATTENTION! A 16GB VRAM NVIDIA GPU is required for a local run, or Colab Pro with a high-RAM environment. You may be able to run it with less VRAM, but I haven't tested that and cannot confirm how low you'll have to go with your settings, so please join the Discord and ask around before subscribing.
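
If you're not sure what your card reports, here's a minimal sketch using PyTorch (which the notebook already depends on) to print the total VRAM of the first CUDA device:

```python
import torch

# Minimal sketch: report the total VRAM of the first CUDA device.
# Assumes a PyTorch build with CUDA support, which WarpFusion needs anyway.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM")
else:
    print("No CUDA device detected")
```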

Happy Easter (with a delay) :D

This release features LoRA support (see the parsing sketch after the changelog), a face ControlNet, and some other quality-of-life tweaks.

Changelog:

  • add lora support
  • add loras schedule parsing from prompts
  • add custom path for loras
  • add custom path for embeddings
  • add torch built-in raft implementation
  • fix threads error on video export
  • disable guidance for lora
  • add compile option for raft @ torch v2 (a100 only)
  • force torch downgrade for T4 GPU on colab
  • add faces controlnet from https://huggingface.co/CrucibleAI/ControlNetMediaPipeFace
  • make gui not reset on run cell (there is still a javascript delay before input is saved)
  • add custom download folder for controlnets

upd:

  • fix face controlnet download url
  • fix controlnet depth_init for cond_video with no preprocess
  • inherit AGPL license from AUTOMATIC1111
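
For the LoRA schedule parsing, here's a minimal sketch of what pulling per-frame LoRA weights out of a keyframed prompt can look like. It assumes AUTOMATIC1111-style <lora:name:weight> tags and a simple {frame: prompt} dict; WarpFusion's actual syntax and parser may differ, so check the GUI docs or Discord for the real format:

```python
import re

# Minimal sketch of extracting LoRA tags from a keyframed prompt schedule.
# The <lora:name:weight> tag syntax and the {frame: prompt} shape are
# assumptions for illustration only.
LORA_TAG = re.compile(r"<lora:([^:>]+):([0-9.]+)>")

def parse_lora_schedule(prompt_schedule):
    """Return ({frame: {lora_name: weight}}, {frame: prompt_without_tags})."""
    loras, clean = {}, {}
    for frame, prompt in prompt_schedule.items():
        loras[frame] = {name: float(w) for name, w in LORA_TAG.findall(prompt)}
        clean[frame] = LORA_TAG.sub("", prompt).strip()
    return loras, clean

# Hypothetical schedule: different LoRAs kick in at different frames.
schedule = {0: "a misty forest <lora:inkstyle:0.8>",
            100: "a neon city at night <lora:neonpunk:0.6>"}
print(parse_lora_schedule(schedule))
```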

Detailed changelog: https://www.patreon.com/posts/81495865

A reminder:

Changes in GUI will not be saved into the notebook, but if you run the diffuse! cell with new settings, they will be saved to a settings.txt file as usual.

You can load settings in the misc tab.

You do not need to rerun the GUI cell after changing its settings, but you can now :D
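
If you want to double-check what a previous run actually used outside the GUI, you can inspect a saved settings file directly. A minimal sketch, assuming the file is plain JSON and using a hypothetical path:

```python
import json

# Minimal sketch, assuming the saved settings file is plain JSON (check a file
# from your own run first); the path below is hypothetical.
with open("images_out/my_batch/my_batch(0)_settings.txt") as f:
    settings = json.load(f)

# Print every key/value pair to verify what the last run actually used.
for key in sorted(settings):
    print(f"{key}: {settings[key]}")
```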

Local install guide: https://discord.com/channels/973802253204996116/1067887206125015153/1067888756910215278

https://github.com/Sxela/WarpFusion/blob/main/README.md

Youtube playlist with settings: https://www.youtube.com/watch?v=wvvcWm4Snmc&list=PL2cEnissQhlCUgjnGrdvYMwUaDkGemLGq

For tech support and other questions please join our discord server: https://discord.gg/YrpJRgVcax

Discord is the preferred method, because it is nearly impossible to provide any decent help or tech support via Patreon due to its limited text formatting and inability to add screenshots or videos to comments or DMs. Error reports in comments will be deleted and reposted in Discord.

Comments

Aipedia

They plan to release a new version that includes LoRAs and more user-friendliness. I think the current complexity of the model is a big commercial barrier for more people to access this content.

Diego Mellado Alarcon

hello, I have VRAM problems, I have a 3080, what can I do?? pls help

sxela

Hi, I'd suggest joining the Discord - there are people who have successfully run warp on 8GB cards