
Hello everyone!

This is a continuation of Looking Back at Past Works Vol. 1.

- 2018 -

- We tried to see if we could create emotional expressions by switching facial expressions while moving the character in real time.


- Improved anime-style expression using Unity. Even though the character model data was the same, the look could be changed by adjusting post-effects and other screen effects.


- I made a prototype to see whether the legs could be moved naturally using only three-point tracking. I added controls such as moving the pelvis in accordance with the movement of the head.
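
The head-to-pelvis coupling could look something like the minimal sketch below. This is not the actual PlayAniMaker code; the gains (0.5 for sway, 0.8 for crouch) and the rule "follow the head when it drops, not when it rises" are my own illustrative assumptions.

```python
# Hypothetical sketch: derive a pelvis position from head tracking alone.
# When the head translates sideways or drops (crouching), move the pelvis
# by a fraction of that offset so the lower body follows plausibly.

def estimate_pelvis(head_pos, rest_head_pos, rest_pelvis_pos,
                    sway_gain=0.5, crouch_gain=0.8):
    """All positions are (x, y, z) tuples; returns an estimated pelvis position."""
    dx = head_pos[0] - rest_head_pos[0]
    dy = head_pos[1] - rest_head_pos[1]   # negative when crouching
    dz = head_pos[2] - rest_head_pos[2]
    return (rest_pelvis_pos[0] + sway_gain * dx,
            rest_pelvis_pos[1] + crouch_gain * min(dy, 0.0),  # follow drops, not hops
            rest_pelvis_pos[2] + sway_gain * dz)
```

A real version would also rotate the pelvis from head yaw and smooth the result over time.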


- In pursuit of automatic facial-expression control, I tried to enrich the emotion by including a forceful eye-squeezing motion instead of simply closing the eyes. As I recall, it was linked to the up-and-down movement of the head.
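
One way to link the squeeze to head motion is sketched below, purely as an illustration: a fast downward nod adds extra closing force on top of the normal auto-blink. The gain value is a guess, not the value used in the actual project.

```python
# Hypothetical sketch: boost blink strength with downward head motion.

def blink_weight(base_blink, head_pitch_velocity, gain=0.5):
    """
    base_blink: 0..1 from the normal auto-blink cycle.
    head_pitch_velocity: positive = head nodding downward (rad/s).
    Returns a 0..1 blink weight; fast downward nods squeeze the eyes harder.
    """
    extra = gain * max(head_pitch_velocity, 0.0)
    return min(base_blink + extra, 1.0)
```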


- I created a mechanism in which the character's shading and the lens-flare screen effect change dynamically according to the position of the light. The shadows change under Unity's control rather than by modifying the model's normals.
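
The core idea of light-position-driven cel shading can be sketched as below: the lit/shadow split follows the N·L term, so moving the light moves the shadow boundary without touching the mesh normals. The threshold and tone values are illustrative, not the project's actual shader parameters.

```python
# Hypothetical sketch: two-tone (cel) shading driven by light direction.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def cel_shade(normal, light_dir, lit=1.0, shadow=0.4, threshold=0.3):
    """Return 'lit' brightness where the surface faces the light, else 'shadow'."""
    ndl = dot(normalize(normal), normalize(light_dir))
    return lit if ndl > threshold else shadow
```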


- This is the one where I tried to see if I could make a video in PlayAniMaker using the techniques developed so far. The background was made with After Effects.

The video is uploaded on YouTube.

https://www.youtube.com/watch?v=NwlO4bA2W4s

I was also experimenting with other visual expressions. ↓


- Here is one where I was testing whether I could create an animated scene using PlayAniMaker's figure function. I filmed only the characters and composited them with the background in After Effects. It's also fun to record the production process as a making-of video.


- This is an experiment with a function that automatically adjusts the character's color tone to match the background. Instead of simply tinting the character with the background's color, the colors were adjusted to change subtly according to the lightness or darkness of the background.

This is a technique that will be used later in Shoost.
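
As a rough illustration of that idea (not the actual PlayAniMaker/Shoost algorithm, and with made-up constants): blend the character color toward the background color, and increase the blend amount as the background gets darker.

```python
# Hypothetical sketch: background-aware character tinting ("auto color").

def luminance(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luma weights

def auto_color(char_rgb, bg_rgb, base_mix=0.1, dark_boost=0.2):
    """Blend the character color toward the background; darker backgrounds tint more."""
    mix = base_mix + dark_boost * (1.0 - luminance(bg_rgb))
    return tuple(c * (1.0 - mix) + b * mix for c, b in zip(char_rgb, bg_rgb))
```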


- 2019 -

- The hair is animated with PlayAniMaker and composited with the background in After Effects.

- I just wanted to make a video of how the face is shadowed.


- The VRM format and VRoid, here we come! From here, I created a mechanism to import any character you like, and the Auto Color feature allows the character's color tones to change automatically.


- I tried to create a system that could easily change textures, but I felt it was wrong to alter a character's style without the creator's permission, so I rejected the idea.


- This is a study of expression to bring characters to life. I adjusted how blinks are generated to eliminate the sense of discomfort when viewed in close-up, and created a control that automatically moves the eyeballs in accordance with the blink movement.
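
One plausible form of that blink/eyeball coupling is sketched below: as the lids close, the eye drifts slightly downward, which reads more naturally in close-ups than a blink over a frozen eye. The 10-degree maximum drift and the ease-in curve are my own illustrative choices.

```python
# Hypothetical sketch: eyeball pitch that follows the blink amount.

def eye_pitch_during_blink(blink, max_down_degrees=10.0):
    """blink: 0 (open) .. 1 (closed). Returns downward eye pitch in degrees."""
    blink = min(max(blink, 0.0), 1.0)
    return max_down_degrees * blink * blink  # ease-in: most drift near full close
```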


- I made the character react to wind. Instead of applying a uniform effect, each strand of hair has a different wind-force value. I also created a mechanism that automatically sets up the hair when a model is loaded, so that the wind easily makes it flutter.
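
The per-strand setup might look like the sketch below (an assumption on my part, not the actual loader code): on model load, each strand gets its own wind coefficient, seeded deterministically so the same model always fluttters the same way, and the wind force on a strand is scaled by that coefficient.

```python
# Hypothetical sketch: automatic per-strand wind coefficients on model load.
import random

def assign_wind_coefficients(strand_names, lo=0.6, hi=1.4, seed=42):
    """Give each strand a stable, slightly different responsiveness to wind."""
    rng = random.Random(seed)
    return {name: rng.uniform(lo, hi) for name in sorted(strand_names)}

def wind_force(base_wind, coeff):
    """Scale the global wind vector by a strand's coefficient."""
    return tuple(w * coeff for w in base_wind)
```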


- This is also a study of expression that brings characters to life. We often see 3DCG character animations with no shoulder movement, so I made this to see whether the shoulders could be moved easily under system control. I made the shoulder move automatically in conjunction with the arm movement.
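
A simple way to express such a coupling, sketched with made-up numbers: once the arm raises past a threshold, the shoulder (clavicle) takes over a fraction of the remaining rotation, much like a real shoulder girdle.

```python
# Hypothetical sketch: shoulder rotation that follows the arm raise.

def shoulder_angle(arm_raise_degrees, threshold=60.0, follow_ratio=0.4):
    """Return how many degrees the shoulder should rotate for a given arm raise."""
    overshoot = max(arm_raise_degrees - threshold, 0.0)
    return follow_ratio * overshoot
```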


- This is an experiment in moving the whole body using the Oculus Rift and its hand controllers. Developing the shoulder movement described earlier, I made not only the shoulders but also the hips and other parts move under the control. In effect, I systematized the parts to work together based on each bone's range of motion.


- I developed it further so that even the legs move in tandem. Even at roughly bust-shot framing, the shoulder movement makes the character look dynamic.


- This is the one I tried to see if I could create an animated character with Vroid.


- To learn what kind of functionality a system control would need, I created the motion for this one by hand, partly as practice for facial expressions.


- I decided to create something other than characters for a change. I made this with particles in Unity, imagining the beautiful rain you see in anime.

- I tried to express moisture in the eyes by switching between about three textures. I added a gradient to the edge of the whites of the eyes to suggest the subtle roundness of the eyeball, and adjusted the look to hold up in close-up shots.

- I connected the scenes I created using Unity Timeline. I was experimenting with adding sounds and other elements on the Timeline to see how much could be produced in real time.


- When 3DCG characters are used to create anime-style expression, the shape of the face is often changed so that it looks better to the camera. I created this control to make the adjustment automatically.

I tried to make the mesh of the face change automatically according to the camera position.
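
One common form of this technique (my assumption of the approach, not the actual implementation) is to measure the camera's yaw relative to the face and drive a "side-view correction" blendshape weight from it, so the silhouette is reshaped only when the face is seen from an angle. The 90-degree normalization is an illustrative choice.

```python
# Hypothetical sketch: camera-angle-driven face correction weight.
import math

def side_correction_weight(face_forward, to_camera):
    """Both args are 2D (x, z) direction vectors. Returns a 0..1 blendshape weight."""
    yaw = math.degrees(math.atan2(to_camera[0], to_camera[1])
                       - math.atan2(face_forward[0], face_forward[1]))
    yaw = (yaw + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return min(abs(yaw) / 90.0, 1.0)     # frontal view = 0, profile or more = 1
```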


- Here is a demo reel of the work we have made so far.

You can see it on YouTube.

https://www.youtube.com/watch?v=YGZB8-2Kazk


Vol. 2 ends here!

Vol. 2 summarized 2018 to 2019. To be continued in Vol. 3!
