Okay. Updates first:
I'm going to make Yua's story this week. I skipped last week because I was very occupied, among other reasons.
The next posts are AOT (Historia Reiss and Sasha Braus).
Then it's One Piece and Naruto.
I'm also going to publish some other samples from patron requests.
That's when the poll results start to come in.

Now about the trainings:

There is a new training optimizer that people are starting to use called Prodigy, and it seems to help a lot in getting very good results with low effort. I basically lost half of my day trying to make it work well for me, but the results were quite lame.
I'm probably doing something wrong, and I noticed that regularization images make everything look too much like "dreambooth style" when using it.
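If you want to try Prodigy yourself, here's a minimal sketch of how it drops into a PyTorch training loop, assuming the prodigyopt package. The tiny model and dummy loss are placeholders, not my actual training setup:

```python
import torch
from prodigyopt import Prodigy  # pip install prodigyopt

model = torch.nn.Linear(16, 16)  # placeholder for the real network

# Prodigy estimates its own step size, so lr is normally left at 1.0
optimizer = Prodigy(model.parameters(), lr=1.0, weight_decay=0.01)

for step in range(100):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 16)).pow(2).mean()  # dummy loss
    loss.backward()
    optimizer.step()
```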
I'm still going to make my models with a mix of my old method and some knowledge from other AI trainers around the web. At least I'm learning something new every time I have to train models.
This model (as it is now) turned out way better trained with my method instead of Prodigy. I'm still using AdamW8bit.
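For reference, AdamW8bit is the 8-bit AdamW from bitsandbytes, and swapping it in looks like this (the learning rate here is illustrative, not my exact value):

```python
import torch
import bitsandbytes as bnb  # pip install bitsandbytes

model = torch.nn.Linear(16, 16)  # placeholder for the real network
optimizer = bnb.optim.AdamW8bit(model.parameters(), lr=1e-4, weight_decay=0.01)
```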
I think that for styles it's better if I try to squeeze in as many steps as I can without burning the model.
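To be clear about what "steps" means here, kohya-style trainers compute the total from image count, repeats, epochs, and batch size. The numbers below are made up, just to show the math:

```python
images, repeats, epochs, batch_size = 40, 10, 8, 2
total_steps = images * repeats * epochs // batch_size
print(total_steps)  # 1600; raise repeats/epochs until the style starts to burn
```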
These images are VERY basic and are from a requested model that I made using my method. They don't have any Textual Inversion or complex negative prompts, and I'm also using a very light base model (AnythingV3). With all that said, it's looking very satisfactory.
I also noticed that making a model with a 128 network dimension tends to burn faster. That's the reason my old models with a net dim of 8 had a strong style without burning.
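The burning makes sense if you look at the capacity: a LoRA adds a down matrix (dim x in) and an up matrix (out x dim) per adapted layer, so the trainable parameters grow linearly with the network dimension. A quick back-of-the-envelope, using a 768-wide attention projection as an illustrative layer size:

```python
in_features = out_features = 768  # illustrative layer width

for dim in (8, 128):
    params = dim * in_features + out_features * dim
    print(f"net dim {dim:>3}: {params:,} trainable params per layer")
# net dim   8: 12,288 trainable params per layer
# net dim 128: 196,608 trainable params per layer
```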

Finally, I haven't made any training tutorials because they would get outdated too fast. But if you still want one, I can make a simple guide to get you started.

By the way, training a model usually takes me more time than making any request. Sometimes it's average/ok when everything works on the first try, but that's not so common lol. So if you want a model trained, please wait patiently.
