
Hey everyone.

We're working on 2D improvements this month. Thought we'd outline some of the stuff that we'll be looking at doing and what will (hopefully) be in the next demo.


__________________________________

The 2D characters of Virtual Novel currently have very basic look-at AI, and can only change their expression while talking to you through the ADV. We ultimately want to expand that into a more in-depth and convincing AI system, one which allows physical contact and player actions to contribute to the 2Ds' mood and facial expression.

To make this happen, we're doing a couple of things. First, this month we're doing a complete face-system overhaul. Current 2Ds have most of their facial features as flat stickers on a head mesh, which do fancy texture-driven tricks to move the eyes and so on.

We're rebuilding the 2Ds to be shameless copies I M-MEAN inspired by MMD models, with fully modelled eyes, eyebrows, eyelashes, etc., and more physical eye-aiming. There's a clip on Virtual Novel's Twitter demonstrating the improved eyes inside Maya and showing off the eye aim; they look good / correct even at quite extreme angles, which the texture-based solution failed to do.

Here are some screens that hopefully demonstrate the improvement in how the eyes look at varying angles, while keeping the 2D look quite accurate.

Current 2D eyes:

Improved 2D eyes:


As well as being a visual improvement, this lets us control a huge range of facial expressions with blendshapes, and will eventually allow for extensive customization in Fitting Room. It also allows for soft blinks and very subtle changes in expression, since transitions between blendshapes are always smooth.
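For the curious, here's a rough sketch of why blendshape transitions come out smooth: each frame, every weight just moves a fraction of the way toward its target, so even a big expression change eases in and out. The shape names, smoothing constant, and frame loop are purely illustrative, not our actual implementation.

```python
# Hypothetical sketch: easing blendshape weights toward a target expression.
# Shape names ("smile", "blink") and the speed constant are made up.

def step_weights(current, target, dt, speed=8.0):
    """Move each blendshape weight a fraction of the way to its target,
    giving a smooth transition no matter how far apart the poses are."""
    alpha = min(1.0, speed * dt)  # fraction of the remaining gap to close this frame
    return {name: w + (target.get(name, 0.0) - w) * alpha
            for name, w in current.items()}

weights = {"smile": 0.0, "blink": 0.0}
target  = {"smile": 1.0, "blink": 0.2}

for _ in range(60):          # simulate roughly one second at 60 fps
    weights = step_weights(weights, target, dt=1 / 60)

print(round(weights["smile"], 3))
```

Because each step only closes part of the gap, a "soft blink" is just a blink target approached with a low speed value.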

We'll also be updating the specular system: more specular highlight shapes for variation, a new specular color / tint option for both the VN NPCs and players' creations in Fitting Room, and a new specular wobble effect for crying / wet eyes, which you can check out a clip of here.
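One plausible way to picture the wobble effect (this is our guess at a minimal version, not the shader we actually ship): nudge the highlight's UV position with two slightly out-of-phase sine waves, so the glint shimmers the way light does on a watery surface. Amplitude and frequency values here are arbitrary.

```python
# Hedged sketch of a "wet eye" specular wobble: jitter the highlight's UV
# offset with two out-of-phase sine waves. All constants are illustrative.
import math

def specular_offset(t, amplitude=0.01, freq=7.0):
    """Return a small (u, v) offset for the specular highlight at time t."""
    u = amplitude * math.sin(2 * math.pi * freq * t)
    v = amplitude * math.sin(2 * math.pi * freq * 1.3 * t + 1.0)
    return u, v

u, v = specular_offset(0.25)
```

Calling this every frame with the current time and adding the result to the highlight's base UV gives a continuous, bounded shimmer.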




Here are a couple of hi-res closeups of the new, improved, fully 3D eyes (forgive the default / expressionless face; we have yet to implement the emotions system for the new head).


Since the distance at which you'd usually talk to a 2D is where your stereo vision happens to be at its most effective, and the eyes are now physically layered with the iris, pupils, specular, shadows, and eyelashes, the 2Ds' eyes now look quite exquisitely deep and immersive in VR. This overhaul should be completed by the end of the month for you guys to check out in a demo build.


__________________________________ 

Second, we've begun planning and building our personality and mood system to drive the new facial emotions. It works via a multilayered stats system that drives an NPC's mood, facial expressions, and reactions to player actions. An NPC's base personality stats, like positivity and extroversion / shyness, along with their current mood stats and their affection stat for the player, will be queried during certain physical prompts (such as "head patting", whatever that is).

Different stats will determine how the 2D reacts to this unwelcome invasion of her personal space, for example whether they're in a good mood and / or whether you have a certain familiarity level with them. These branching if statements will be used during encounters with 2Ds to drive their reactions with a certain amount of dynamism and unpredictability.
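To make the branching idea concrete, here's a toy sketch of how personality, mood, and affection stats might funnel into a reaction. Every stat name, threshold, and reaction string below is hypothetical, invented for illustration; the shipped system will almost certainly differ.

```python
# Illustrative only: base personality + current mood + affection branching
# into a "head pat" reaction. Thresholds and names are placeholders.

def headpat_reaction(npc):
    # Personality biases the effective mood: positive characters
    # shrug off a mediocre day more easily.
    mood = npc["mood"] + npc["positivity"] * 0.5
    if npc["affection"] < 0.2:
        return "recoil"       # barely knows you: personal-space violation
    if mood < 0.3:
        return "annoyed"      # a bad mood overrides familiarity
    if npc["shyness"] > 0.7:
        return "flustered"    # shy characters get embarrassed instead
    return "happy"

npc = {"positivity": 0.6, "shyness": 0.4, "mood": 0.5, "affection": 0.8}
print(headpat_reaction(npc))  # → happy
```

The same prompt can land four different ways depending on who the character is and how your relationship stands, which is where the unpredictability comes from.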

For random NPCs, this will mostly just be used to drive their reactions, but for story characters, their mood and affection stats may be more integral to your relationship. If you do something they don't like, their opinion of you might be affected. We don't yet know how complex this will be, so we're unsure of the timeframe but will hopefully have some early implementation this month.

Third, we're working on an inverse-kinematics system for hand-holding and limb placement. There's a brief clip here showing the current, quite basic implementation. This will be useful for a couple of things we'd like to implement: posing characters like a doll / mannequin in Fitting Room, and grabbing characters' hands, or them grabbing your hands / touching you and other objects in a dynamic way.

For people not familiar with IK, it just means that instead of hand-animating a character to pick something up or touch something, an in-game target location drives their animation dynamically. So a 2D can "headpat" you, since your VR headset location acts as a target for their hand to reach towards. The implications of an IK system are pretty far-reaching, so hopefully once we have it implemented we can use it for a whole lot of stuff.
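For the technically curious, the classic textbook version of this is two-bone IK solved with the law of cosines: given where the hand should go, compute the shoulder and elbow angles directly. This is a minimal 2D sketch under assumed unit bone lengths, not our in-engine solver, which has to deal with 3D, joint limits, and smoothing.

```python
# Minimal two-bone IK sketch (law-of-cosines solution in a 2D plane).
# Bone lengths default to 1.0 and are arbitrary; a real solver works in 3D.
import math

def two_bone_ik(target_x, target_y, l1=1.0, l2=1.0):
    """Return (shoulder, elbow) angles so the arm tip reaches the target,
    clamping to full extension when the target is out of reach."""
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2 - 1e-9)          # can't reach past full extension
    # Interior elbow angle between the two bones, from the law of cosines.
    cos_elbow = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle = direction to target plus the triangle's inner angle.
    cos_inner = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    shoulder = math.atan2(target_y, target_x) + math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder, elbow

shoulder, elbow = two_bone_ik(1.2, 0.8)
```

Feed in the headset position each frame as the target and the arm follows it, which is exactly the "headpat" case described above.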


__________________________________ 


The next playable build will hopefully contain examples of everything mentioned here, but programming is always a little less predictable than environment creation, so we don't have a promised date. Hopefully we'll have something for the end of the month, though.

Thanks for the support everyone!

Love from the VN Dev team
