
Content

AI can now hear things. This is the start of putting stealth mechanics into the game. For testing right now, human characters trigger a noise event when they run or walk. The AI will perceive this noise event, and if they aren't already doing something, they'll walk over to check out the noise. The AI will probably get confused if there are multiple sources of sound right now, since I haven't put any logic in to differentiate targets and sound sources yet.
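Roughly, the flow is: a moving character reports a noise event, and any idle AI within hearing range latches onto its location as something to investigate. Here's a minimal standalone C++ sketch of that idea; all of the names and values (NoiseEvent, AIAgent, the radius) are made up for illustration and aren't the game's actual code.

```cpp
// Hypothetical sketch: a noise event carries a world position and loudness;
// an AI "hears" it if it falls inside its (loudness-scaled) hearing radius.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct NoiseEvent {
    Vec3  location;   // where the footstep (or other sound) happened
    float loudness;   // scales how far the sound carries
};

struct AIAgent {
    Vec3  position;
    float hearingRadius = 1200.0f;   // placeholder, like the yellow debug circle
    bool  busy = false;              // already chasing/investigating something?
    Vec3  investigateTarget{};
    bool  hasInvestigateTarget = false;

    // If the agent is idle and the noise lands inside its hearing radius,
    // remember the location so it can walk over and check it out.
    void OnNoiseEvent(const NoiseEvent& noise) {
        if (busy) return;
        if (Distance(position, noise.location) <= hearingRadius * noise.loudness) {
            investigateTarget    = noise.location;
            hasInvestigateTarget = true;
        }
    }
};

int main() {
    AIAgent guard{{0, 0, 0}};
    guard.OnNoiseEvent({{500, 300, 0}, 1.0f});  // a running footstep nearby
    std::printf("investigating: %s\n", guard.hasInvestigateTarget ? "yes" : "no");
}
```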

The beginning of the sneak skill is also working: if the player crouches (Ctrl key to toggle) and moves, they won't make any noise. The AI will still be able to see the player if the player happens to be in their cone of vision.
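A sketch of those two rules under the same hypothetical setup: crouched movement emits no noise, while sight stays a separate distance-plus-cone check that ignores crouching for now.

```cpp
// Hypothetical names and values, not the game's actual code.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Len(const Vec3& v)                { return std::sqrt(Dot(v, v)); }

// Walking or running emits a noise event; crouched movement stays silent.
bool EmitsNoise(bool isMoving, bool isCrouched) {
    return isMoving && !isCrouched;
}

// Sight check: the player is seen if they are within the visual radius and
// inside the AI's forward vision cone. Crouching doesn't help here yet.
bool CanSee(const Vec3& aiPos, const Vec3& aiForward,
            const Vec3& playerPos, float sightRadius, float halfAngleDegrees) {
    const Vec3  toPlayer = Sub(playerPos, aiPos);
    const float dist     = Len(toPlayer);
    if (dist > sightRadius || dist <= 0.0f) return false;

    const float cosAngle = Dot(toPlayer, aiForward) / (dist * Len(aiForward));
    return cosAngle >= std::cos(halfAngleDegrees * 3.14159265f / 180.0f);
}

int main() {
    // Crouched player sneaks silently but can still be spotted head-on.
    const bool noisy = EmitsNoise(/*isMoving=*/true, /*isCrouched=*/true);
    const bool seen  = CanSee({0, 0, 0}, {1, 0, 0}, {600, 100, 0}, 1500.0f, 45.0f);
    std::printf("noisy: %d, seen: %d\n", noisy ? 1 : 0, seen ? 1 : 0);
}
```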

The next steps will be testing noise events from things like projectiles and environment objects, probably adding a skill modifier on top of crouching that changes the player's chances of being detected, and possibly a "detection level" with UI similar to Skyrim's opening-and-closing eye on the crosshair.
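If the detection level ends up working like Skyrim's eye, one plausible shape is a meter that fills while the player is visible and drains while they're hidden, with sneak skill slowing the fill rate. Everything below (names, tuning numbers) is a guess at that design, not something implemented yet.

```cpp
// Hypothetical "detection level" meter that a UI eye could be driven from.
#include <algorithm>
#include <cstdio>

struct DetectionMeter {
    float level = 0.0f;  // 0 = unseen, 1 = fully detected

    // visibility: 0..1 from the perception checks (distance, cone, etc.)
    // sneakSkill: 0..1, higher skill fills the meter more slowly
    void Tick(float deltaSeconds, float visibility, float sneakSkill) {
        const float fillRate  = 0.8f * (1.0f - 0.5f * sneakSkill);  // placeholder tuning
        const float decayRate = 0.4f;
        if (visibility > 0.0f)
            level += visibility * fillRate * deltaSeconds;
        else
            level -= decayRate * deltaSeconds;
        level = std::clamp(level, 0.0f, 1.0f);
    }

    bool FullyDetected() const { return level >= 1.0f; }
};

int main() {
    DetectionMeter meter;
    for (int frame = 0; frame < 120; ++frame)            // ~2 seconds at 60 fps
        meter.Tick(1.0f / 60.0f, /*visibility=*/0.6f, /*sneakSkill=*/0.5f);
    std::printf("detection level: %.2f\n", meter.level);  // would drive the eye UI
}
```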

The screenshot is the in-game debug visualization for the perception system. The values are all placeholder, but the yellow circle is the hearing radius, the green circle is the visual radius, and the pink is where an already-detected stimulus will be lost. The green squares on the ground are the chunks of the navmesh that the AI is currently using to pathfind to their target. The yellow sphere is the last location of a noise event that the AI detected, and the green spheres are the sight events.
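The three circles hint at a standard hysteresis setup: a target has to come inside the smaller sight radius to be detected, but is only dropped once it leaves the larger lose-sight radius. A small sketch of that, again with placeholder values:

```cpp
// Hypothetical radii matching the debug drawing; values are placeholders.
#include <cstdio>

struct PerceptionConfig {
    float hearingRadius   = 1200.0f;  // yellow circle
    float sightRadius     = 1500.0f;  // green circle
    float loseSightRadius = 2000.0f;  // pink circle, > sightRadius for hysteresis
};

// New detections use the smaller radius; already-detected targets are only
// lost once they pass the larger one.
bool StillPerceived(float distanceToTarget, bool alreadyDetected,
                    const PerceptionConfig& cfg) {
    const float radius = alreadyDetected ? cfg.loseSightRadius : cfg.sightRadius;
    return distanceToTarget <= radius;
}

int main() {
    PerceptionConfig cfg;
    // 1800 units away: too far to be newly seen, but close enough to keep
    // tracking a target that was already detected.
    std::printf("new: %d, keep: %d\n",
                StillPerceived(1800.0f, false, cfg) ? 1 : 0,
                StillPerceived(1800.0f, true, cfg) ? 1 : 0);
}
```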



Comments

E i G H T

Actually awesome! Learned a lot from this explanation.