The Project

Much of the challenge and enjoyment of a stealth game comes from the believability of the enemy AI: the way agents react to the player, to allies around them, and to a changing game environment. The overall aim of this project is to create AI agents for stealth games that seem believable to the player. This may sound like the AI simply needs to be smart; however, if an agent feels dumb the player will get no satisfaction from beating it. Not being dumb is not the same as being smart, either: the AI's actions must feel plausible to the player, which means that to be believable the AI must also feel fair. The stealth genre was chosen for this report because it requires AI that does not simply know where the player is; instead, agents have to be aware of what is going on around them in order to be believable. This is an important area for research because the majority of a player's fun and enjoyment comes from overcoming challenging situations, and in stealth games those situations are primarily based on how the AI reacts to the player and the game environment.

Project Specifics

Built in C# using Unity 2021

Built over roughly six months, from January 2022 to May 2022.

Features

  • View cone model for sight that approximates how human vision works
  • Different visual detection values based on which part of the player's body is seen, e.g. the player's chest is more visible than an arm
  • An optimised approach to visual perception based on a maximum number of raycasts per frame
  • Auditory perception model based on the pathfinding distance between a sound's origin and the perceiver
  • Environmental awareness systems that allow the AI to respond to a changing game environment
  • Social interactions between AI agents: they search in groups and notice if an agent goes missing from their group
  • Custom behaviour tree editor and system
  • Systems for creating dialogue sets so that each AI sounds unique (given enough dialogue samples)

    A diagram of how auditory perception systems usually function

    A diagram of how the auditory perception system functions in this project
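The core idea behind the auditory model above can be sketched in a few lines. This is a minimal illustration, assuming Unity's `NavMesh` API; the class name, field values, and `CanHear` method are hypothetical, not the project's actual code. Measuring the length of a navigation path, rather than the straight-line distance, means walls and obstacles naturally muffle sounds.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: hearing based on pathfinding distance rather than straight-line
// distance, so geometry between the sound and the listener dampens it.
public class AuditoryPerceiver : MonoBehaviour
{
    [SerializeField] float hearingRange = 20f; // max path distance (assumed value)

    // Called by whatever emits the sound; loudness scales the effective range.
    public bool CanHear(Vector3 soundOrigin, float loudness)
    {
        var path = new NavMeshPath();
        if (!NavMesh.CalculatePath(transform.position, soundOrigin,
                                   NavMesh.AllAreas, path) ||
            path.status != NavMeshPathStatus.PathComplete)
            return false; // no reachable route: treat as inaudible

        // Sum the length of each segment along the computed path.
        float pathLength = 0f;
        for (int i = 1; i < path.corners.Length; i++)
            pathLength += Vector3.Distance(path.corners[i - 1], path.corners[i]);

        return pathLength <= hearingRange * loudness;
    }
}
```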

    The Completed Behaviour Tree for this Project

    The Behaviour Tree Editor created for this project
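For readers unfamiliar with behaviour trees, the structure a tree editor like this operates on can be reduced to a small set of node types. The sketch below shows the two classic composite nodes; the type names and API are illustrative, not the system built for this project.

```csharp
// Minimal behaviour tree core: every node reports Success, Failure, or
// Running each tick, and composites combine their children's results.
public enum NodeState { Success, Failure, Running }

public abstract class BTNode
{
    public abstract NodeState Tick();
}

// Sequence: runs children in order, stopping at the first Failure/Running.
public class Sequence : BTNode
{
    readonly BTNode[] children;
    public Sequence(params BTNode[] children) { this.children = children; }

    public override NodeState Tick()
    {
        foreach (var child in children)
        {
            var state = child.Tick();
            if (state != NodeState.Success) return state;
        }
        return NodeState.Success;
    }
}

// Selector: tries children in order, stopping at the first Success/Running.
public class Selector : BTNode
{
    readonly BTNode[] children;
    public Selector(params BTNode[] children) { this.children = children; }

    public override NodeState Tick()
    {
        foreach (var child in children)
        {
            var state = child.Tick();
            if (state != NodeState.Failure) return state;
        }
        return NodeState.Failure;
    }
}
```

Leaf nodes (conditions and actions) subclass `BTNode` directly, and the editor's job is to let a designer wire these composites and leaves together visually.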

    The View Cone Model created for use in this project
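The view cone model combined with the per-limb detection values can be sketched as a cheap cone test followed by raycasts against individual body parts. This is a simplified illustration using Unity's physics API; the class, fields, and weights are assumptions rather than the project's exact implementation, and the full system would also enforce the per-frame raycast budget described above.

```csharp
using UnityEngine;

// Sketch of the sight model: a distance/angle cone test, then a raycast per
// body part, each part contributing its own visibility weight (e.g. the
// chest is easier to spot than an arm).
public class VisualPerceiver : MonoBehaviour
{
    [SerializeField] float viewDistance = 15f;
    [SerializeField] float halfViewAngle = 60f; // degrees either side of forward

    // One entry per body part; weight approximates how visible that part is.
    public struct BodyPart { public Transform point; public float weight; }

    public float Visibility(BodyPart[] parts)
    {
        float total = 0f;
        foreach (var part in parts)
        {
            Vector3 toPart = part.point.position - transform.position;
            if (toPart.magnitude > viewDistance) continue;                 // too far
            if (Vector3.Angle(transform.forward, toPart) > halfViewAngle)  // outside cone
                continue;

            // Only raycast parts that pass the cheap cone test; this is where
            // a global per-frame raycast budget would throttle the work.
            if (Physics.Raycast(transform.position, toPart.normalized,
                                out RaycastHit hit, viewDistance) &&
                hit.transform == part.point)
                total += part.weight; // part is unobstructed
        }
        return total; // compared against a detection threshold over time
    }
}
```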

What went well

  • Implementation of the Visual and Auditory Perception Systems
  • Environmental Awareness Systems
  • AI making their intentions known to the player

What didn't go well

  • Social interactions could’ve been a lot smoother and more believable
  • The initial adjustment to using behaviour trees took more thought than expected.

What could be improved/added

  • More social interactions
  • Personalities for each AI agent
  • Senses affected by the game environment
  • Use of a different framework/game engine: Unreal Engine would have been a better choice for this project because systems such as behaviour trees are already built in

What I Learned

  • How to correctly scope a project
  • Creating advanced gameplay mechanics from thought, to design, to implementation
  • How games create believable AI systems
  • The importance of non-programmatic polish to make AI seem believable.
  • Profiling and optimisation of computationally demanding mechanics/features

Gameplay Demo

Technical Demo