User Interfaces in Video Games

The quest for genre-appropriate and usable game UI

To continue the accessibility theme of my last blog post, in this one I would like to dive deeper into what the complex interaction of playing video games is like for people with disabilities.

Beyond the screen and graphical user interface, we have to consider the physical interface players control the game with. While reading the Saunders and Novak book Game Development Essentials: Game Interface Design, I was really moved by the story of Robert Florio, a quadriplegic artist. He uses a “mouth stick” to play games like Devil May Cry 3, a fast-paced action game built around complex combos [1]. It made me realise that an accessible interface isn’t just about ease of use; it’s also about giving someone control over a world they can no longer physically interact with. When designers add the option of remappable buttons, they aren’t just adding a “setting”, they’re opening doors for people who otherwise wouldn’t be able to interact with the product at all.

The “mouth stick” in question was an early model of the QuadStick, pictured in Figure 1. This is a mouth-operated controller produced by an independent manufacturer. It acts as an “add-on” to existing consoles or PCs, using sip-and-puff sensors to translate breath and lip movements into complex game inputs [2].
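To make the sip-and-puff idea more concrete, here is a minimal sketch of how a pressure reading from such a sensor might be classified into game inputs. The thresholds, action names, and mappings are all my own invented assumptions for illustration, not the QuadStick’s actual firmware logic.

```python
# Hypothetical sketch: mapping a sip-and-puff pressure reading to an input.
# Negative pressure = sip, positive = puff; thresholds are invented values.

def classify_breath(pressure: float) -> str:
    """Return an input name for a normalised pressure reading in [-1, 1]."""
    if pressure <= -0.6:
        return "hard_sip"   # e.g. could be mapped to a trigger pull
    if pressure <= -0.2:
        return "soft_sip"   # e.g. could be mapped to aiming
    if pressure >= 0.6:
        return "hard_puff"  # e.g. could be mapped to jump
    if pressure >= 0.2:
        return "soft_puff"  # e.g. could be mapped to interact
    return "neutral"        # below threshold: no input
```

The key design point is that one analogue sensor yields several distinct inputs by splitting its range into bands, which is how a single mouthpiece can stand in for multiple buttons.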

Figure 1: QuadStick
Source: [3]

A major sign that the industry is finally taking this seriously is that console manufacturers are now building these solutions themselves. The PlayStation Access Controller is a modular kit designed specifically to be accessible out of the box. It moves away from the “fixed” shape of a standard controller, allowing players to create a layout that works for their specific hand strength or range of motion. This further emphasises the importance of customisable and remappable inputs in games.
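The idea of customisable, remappable inputs can be sketched in a few lines of code. This is only an illustrative model with invented names, not how any console actually implements it: the point is that physical inputs resolve through a user-editable mapping instead of being hard-wired to actions.

```python
# Illustrative sketch of a remappable input layer (all names are invented).
# Actions are looked up through a mutable mapping the player can edit.

DEFAULT_MAPPING = {"button_a": "jump", "button_b": "crouch", "trigger_r": "fire"}

class InputMapper:
    def __init__(self, mapping=None):
        # Copy the defaults so each player can customise independently.
        self.mapping = dict(mapping or DEFAULT_MAPPING)

    def remap(self, physical_input, action):
        """Bind any physical control to any game action."""
        self.mapping[physical_input] = action

    def resolve(self, physical_input):
        """Return the action for a control, or None if unbound."""
        return self.mapping.get(physical_input)
```

Because every control goes through `resolve`, a player who cannot reach a particular button can simply move its action somewhere they can reach, which is exactly what modular hardware like the Access Controller extends to the physical layout itself.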

Figure 2: PlayStation 5 Access Controller
Source: [4]

This is where one really sees how interacting with games goes beyond the user interface. It’s also about the user experience and overall interaction design. I already mentioned The Last of Us Part II in my last blog post, focusing on its vast variety of subtitle adjustment options. That is just one of more than 60 different accessibility options [5].

Their design philosophy follows a “sensory redundancy” model. This means that if a player can’t see the path, Navigation Assistance uses haptic pings and 3D audio cues to guide them. If a player can’t hear an enemy, Awareness Indicators and Combat Vibration Cues translate sound into visual and tactile data. This really showed me how expansive this theme can get once we look at the broader spectrum of the interaction between the game and the player.
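The “sensory redundancy” model described above can be sketched as a tiny event bus that broadcasts every game event to several output channels at once. The channel names and cue formats here are my own assumptions for illustration, not Naughty Dog’s actual implementation.

```python
# Sketch of a "sensory redundancy" cue bus: one game event is delivered
# through visual, audio, and haptic channels simultaneously, so a player
# who misses it on one sense still receives it through the others.

from collections import defaultdict

class RedundantCueBus:
    def __init__(self):
        self.channels = defaultdict(list)  # channel name -> list of cues

    def emit(self, event):
        """Broadcast one event as three redundant cues."""
        self.channels["visual"].append(f"indicator:{event}")
        self.channels["audio"].append(f"3d_ping:{event}")
        self.channels["haptic"].append(f"vibration:{event}")
```

The design choice worth noting is that redundancy lives in the dispatcher, not in each feature: any new event automatically reaches every sense, rather than accessibility being bolted on cue by cue.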

Learning about all of this has made me appreciate even more how much thoughtful interface design shapes who gets to play the games I love.