Auditory Accessibility in Games
A practical guide to auditory accessibility in games, from relevant laws, to low-scope steps, to more ambitious and flexible ways to support d/Deaf and hard of hearing players.
This is the first post in a series I'm doing on the various specific categories of accessibility in games, acting as both a contextual primer and practical guide. To kick things off, I'll be diving into auditory accessibility.
So what is auditory accessibility? In short, "auditory accessibility" describes how well a game satisfies and accommodates the needs of someone with any degree of d/Deafness or hearing impairment, including people who are hard of hearing (HoH). One in eight people has some degree of hearing loss in both ears, and even more have impaired hearing to a lesser degree. It's an incredibly common need, and if you're making games you should support it!
As game designers, our goal is for our players to have fun! But if certain accessibility considerations aren't fulfilled, nearly half a billion gamers will be left out of the fun. Staying informed about accessibility topics and integrating your awareness into the games you make is one way to ensure that as many people as possible can experience our games to the fullest extent.
Accessibility Laws
While accessibility laws differ across borders, I'll be approaching this from the perspective of the United States. Chances are, if you're making games for wide-scale distribution, you're selling them in the U.S., so these laws are at least partially relevant across the board.
Numerous entities, mostly in the communications and broadcasting space, have a legal obligation to deliver accommodations such as captioning so that people who are d/Deaf or hard of hearing have equal access to and participation in their content. Games fall into that category.
Certain laws such as the 21st Century Communications and Video Accessibility Act (CVAA) pertain to games in this regard. Just like television content and streaming video, games are required to be "accessible to and usable by individuals with disabilities". While there are certain provisions in the CVAA that give some flexibility based on how achievable those accommodations are for the company or team creating the game, the intent and direction are clear: the future of games is accessibility.
Aside from any legal or business implications of supporting accessibility in games, as a game designer it just feels right to ensure people have the necessary accommodations to fully enjoy a game. Games are important ultimately because fun is important, and I personally want as many people to have fun with my games as possible.
If you're curious about when captioning is legally required, the National Association of the Deaf has an in-depth resource here, as well as a guide on the CVAA.
Subtitles & Captioning
If your instinct was to immediately think of subtitles, you're not entirely wrong! Subtitles are pretty ubiquitously useful. Ubisoft reports that upwards of 95% of players of Assassin's Creed Odyssey kept subtitles on; that's huge! But auditory accessibility is so much more than subtitles.
For starters, subtitles are usually intended to supplement on-screen content, and aren't meant to be the sole way someone interacts with a game's audio layer. Subtitles generally don't include background sound descriptions, leaving out things like birds chirping, rushing streams, and other details that players with unimpaired hearing might take for granted. Subtitles can be turned off as needed.
Closed Captions, on the other hand, are specifically meant to improve the experience of viewers with hearing impairments, or who aren't able to use the audio portion of content. Closed captions match spoken audio, background sound descriptions, and can be closed (turned off) as needed.
Open Captions are generally the same as closed captions, but are rendered directly into the video file itself, essentially making it impossible to turn them off.
In general, you should aim to support Closed Captions in your game. But as I'll get into later in this post, there's even more you can do, which can be an absolute game-changer for gamers with hearing impairments.
To summarize the three types of captioning:
- Subtitles can be toggled off, but don't describe all the sounds in a scene, and aren't always accurate to all spoken content.
- Closed captions can be toggled off, and intend to describe all important sounds in a scene, including dialogue and important environmental sounds.
- Open captions can't be toggled off, but otherwise strive for the same standards as closed captions.
If your game has spoken dialogue, important environmental audio, or gameplay-critical sound effects, you absolutely need to support closed captions. Subtitles are fine for noncritical dialogue, but for anything the game depends on, anything less than quality closed captions will leave some players without the context and information they need to fully enjoy or progress in the game.
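To make that distinction concrete, here's a minimal sketch (in TypeScript, with names I've made up purely for illustration, not taken from any engine) of a caption event model where dialogue and environmental sounds flow through the same pipeline, so gameplay-critical audio always has a caption path:

```typescript
// Illustrative caption event model: one queue handles both spoken lines
// and environmental/sound-effect captions.

type CaptionKind = "dialogue" | "sound-effect" | "music";

interface CaptionEvent {
  kind: CaptionKind;
  text: string;          // e.g. "Who goes there?" or "[Large explosion]"
  speaker?: string;      // set for dialogue so the player knows who is talking
  durationMs: number;    // how long the caption stays on screen
}

class CaptionQueue {
  private active: CaptionEvent[] = [];

  // Called by the audio system whenever a captioned sound starts playing.
  push(event: CaptionEvent): void {
    this.active.push(event);
    setTimeout(() => {
      this.active = this.active.filter((e) => e !== event);
    }, event.durationMs);
  }

  // Called by the UI layer each frame to render whatever is currently active.
  current(): readonly CaptionEvent[] {
    return this.active;
  }
}

// Usage: the same code path covers spoken lines and gameplay-critical sounds.
const captions = new CaptionQueue();
captions.push({ kind: "dialogue", speaker: "Guard", text: "Who goes there?", durationMs: 3000 });
captions.push({ kind: "sound-effect", text: "[Footsteps approaching from behind]", durationMs: 2000 });
```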
Going Beyond Closed Captions
Alright so, Closed Captions - check! Where do we go from here? As it turns out, there's a ton of other considerations for how your game supports players with hearing impairments.
Localization can be an overlooked aspect of how effective Closed Captions are. Some languages, such as Arabic, Italian, and Japanese, are null-subject languages; that is to say, the subject of a sentence is often implied contextually. In Japanese, for example, the subject is frequently dropped entirely when the topic or theme of the conversation makes it clear. If you're curious about null-subject languages, you can read more here.
If you're just chucking your game's text files into Google Translate, the nuances of context might be lost, dramatically changing the meaning of the text shown to players in the Closed Captions. This is especially important if you're using Closed Captions to convey information about the story, gameplay mechanics, or what a player needs to do. With bad translations being such a common and easy-to-avoid misstep in game development, there's all the more reason to ensure your captions are accurately translated.
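One way to guard against that is to treat captions like any other localized asset: reference them by string key and ship reviewed, human-checked translations per locale, rather than translating raw text at runtime. A rough sketch, where every key and string is invented for the example:

```typescript
// Sketch: captions reference string keys, and each locale ships its own
// reviewed translation as data, not a machine-translated string.

const captionStrings: Record<string, Record<string, string>> = {
  "en-US": { "caption.door_locked": "The door is locked. Find a key." },
  "ja-JP": { "caption.door_locked": "ドアには鍵がかかっている。鍵を探そう。" },
};

function resolveCaption(locale: string, key: string): string {
  // Fall back to English (and finally the raw key) if a reviewed
  // translation isn't available yet.
  return captionStrings[locale]?.[key] ?? captionStrings["en-US"][key] ?? key;
}

console.log(resolveCaption("ja-JP", "caption.door_locked"));
```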
How closed captions are designed matters too. Letting players change the size, backgrounds, text color, and position of closed captions can mean the difference between the additional text on-screen being useful or noisy. In general, closed caption placement should be taken into consideration as part of UI design, and shouldn't be some "extra" element that gets added without due thought.
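As a rough illustration of what "player-adjustable" can mean in practice, here's a sketch of caption presentation settings driving the rendered style. The field names and defaults are assumptions for the example, not from any particular engine:

```typescript
// Sketch of player-adjustable caption presentation settings.

type CaptionAnchor = "top" | "bottom";

interface CaptionStyleSettings {
  fontScale: number;          // 1.0 = default size; players can raise this
  textColor: string;          // hex color, e.g. "#FFFFFF"
  backgroundColor: string;    // backing color behind the text
  backgroundOpacity: number;  // 0..1; a translucent backing improves contrast
  anchor: CaptionAnchor;      // keep captions clear of gameplay-critical UI
}

const defaultCaptionStyle: CaptionStyleSettings = {
  fontScale: 1.0,
  textColor: "#FFFFFF",
  backgroundColor: "#000000",
  backgroundOpacity: 0.6,
  anchor: "bottom",
};

// The UI layer reads these settings when laying out the caption box, so the
// same caption text can be rendered however the player prefers.
function captionCss(style: CaptionStyleSettings): Record<string, string> {
  // Convert the backing color + opacity into rgba() so only the background
  // is translucent, not the caption text itself.
  const r = parseInt(style.backgroundColor.slice(1, 3), 16);
  const g = parseInt(style.backgroundColor.slice(3, 5), 16);
  const b = parseInt(style.backgroundColor.slice(5, 7), 16);
  return {
    fontSize: `${16 * style.fontScale}px`,
    color: style.textColor,
    background: `rgba(${r}, ${g}, ${b}, ${style.backgroundOpacity})`,
  };
}

console.log(captionCss(defaultCaptionStyle));
```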
However, Closed Captions can't do everything! Imagine reading "[Large Explosion]" in Closed Captions: unless you tell me where that explosion came from, that information is nearly useless! This is where spatial audio visualizations can help.
Spatial audio visualization is an emerging concept: projecting some sort of graphic or VFX onto the screen or into the game world to represent the source, intensity, and type of a sound. As an emerging accommodation, best practices for audio visualization are still forming.
Take Fortnite, for example. Footsteps and other in-game audio can be toggled on and portrayed with an audio-indication ring in the center of the screen. It's thoughtfully integrated into the size and shape of the weapon wheel, and modulates differently in response to different sound types and intensities. If you're curious about Fortnite's accessibility in detail, check out this incredible in-depth review by CanIPlayThat.
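To show the kind of math involved, here's a small sketch that maps a 2D sound source onto an on-screen indicator ring, in the spirit of (but not taken from) Fortnite's system: the blip's position on the ring comes from the source's direction relative to the player's facing, and its intensity falls off with distance.

```typescript
// Sketch: convert a world-space sound source into an indicator-ring blip.

interface Vec2 { x: number; y: number; }

interface SoundBlip {
  angleRad: number;   // where on the ring to draw the blip (0 = straight ahead)
  intensity: number;  // 0..1, used to scale the blip's size or brightness
}

function toSoundBlip(
  listenerPos: Vec2,
  listenerFacingRad: number,
  sourcePos: Vec2,
  maxAudibleDistance: number
): SoundBlip {
  const dx = sourcePos.x - listenerPos.x;
  const dy = sourcePos.y - listenerPos.y;

  // Angle of the source relative to where the player is looking.
  const worldAngle = Math.atan2(dy, dx);
  const angleRad = worldAngle - listenerFacingRad;

  // Closer (louder) sounds produce a stronger blip; silent past max range.
  const distance = Math.hypot(dx, dy);
  const intensity = Math.max(0, 1 - distance / maxAudibleDistance);

  return { angleRad, intensity };
}

// Example: footsteps 10 units away, off to one side of the player.
const blip = toSoundBlip({ x: 0, y: 0 }, 0, { x: 8, y: 6 }, 40);
console.log(blip.angleRad.toFixed(2), blip.intensity.toFixed(2));
```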
Finally, if your game lets teammates or opponents communicate with each other, ensuring that you're not relying on voice audio alone can help too. If a player is d/Deaf or hard of hearing, voice chat can be an isolating factor. Their teammates might become frustrated that they're not listening to their calls for help, leading to potentially harmful interactions. Thankfully, there are plenty of solutions to help in this scenario.
While only a partial solution, making sure that all speech is transcribed ensures there's at least a shared baseline between players, so everyone has a record of what was said even if someone was unable to hear it spoken aloud. SpeechLib by George Birbilis and Speech Studio by Microsoft are two utilities (among many) that can help with this.
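However you integrate speech-to-text, the game-side plumbing can stay simple: subscribe to transcript events and surface them in the same text channel every player already reads. Here's a sketch using a stand-in `TranscriptionService`; its shape is an assumption for illustration, not a real SDK API.

```typescript
// Sketch: routing voice-chat transcripts into on-screen text so d/Deaf and
// HoH players see what teammates said. TranscriptionService is a stand-in
// for whichever speech-to-text provider you actually use.

interface VoiceTranscript {
  playerName: string;
  text: string;
}

type TranscriptHandler = (t: VoiceTranscript) => void;

class TranscriptionService {
  private handlers: TranscriptHandler[] = [];

  onTranscript(handler: TranscriptHandler): void {
    this.handlers.push(handler);
  }

  // In a real integration this would be driven by the speech-to-text provider.
  emit(t: VoiceTranscript): void {
    this.handlers.forEach((h) => h(t));
  }
}

// Wire transcripts into the same chat/caption UI every player already sees.
const service = new TranscriptionService();
service.onTranscript((t) => console.log(`${t.playerName}: ${t.text}`));
service.emit({ playerName: "Rae", text: "Enemy pushing from the north bridge!" });
```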
Branching out from speech as the medium, pre-defined reactions, chat options, and emotes can make communicating more accessible, well beyond just accommodating d/Deaf and hard of hearing gamers. Because these messages are authored ahead of time, they can pair carefully designed visual components, such as screen indicators or illustrative messages, with a written component that naturally includes captioning details like the point of interest being referenced. When done well, these sorts of flexible systems can add to a game's lore and enrich the overall feeling of play.
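As a sketch of that idea, here's a tiny predefined ping system where every ping pairs a world-space marker with authored text, so nothing depends on hearing a voice callout. All identifiers here are illustrative, not from any particular game.

```typescript
// Sketch of a predefined ping system: each ping is both a visual marker
// in the world and a readable, captionable message.

type PingType = "enemy-spotted" | "need-healing" | "group-here";

interface Ping {
  type: PingType;
  senderName: string;
  worldPosition: { x: number; y: number; z: number };
}

// Authored (and localizable) text for each ping type.
const pingText: Record<PingType, string> = {
  "enemy-spotted": "Enemy spotted here!",
  "need-healing": "I need healing!",
  "group-here": "Group up here!",
};

// Broadcasting a ping shows an in-world marker for everyone and also emits
// a text line, so no one depends on hearing a voice callout.
function broadcastPing(ping: Ping): void {
  const { x, y, z } = ping.worldPosition;
  console.log(`[marker placed at ${x}, ${y}, ${z}]`);
  console.log(`${ping.senderName}: ${pingText[ping.type]}`);
}

broadcastPing({
  type: "enemy-spotted",
  senderName: "Rae",
  worldPosition: { x: 120, y: 4, z: -35 },
});
```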
In Closing
Accessibility is a spectrum of accommodations that work together to make your game as playable and approachable as possible for as many people as possible. Most people use accessibility accommodations in some form, but for some players, they're the difference between being able to play a game and being unable to join the fun.
As game designers, we have a duty to our players, and some easy low-scope features like closed captions can open a game up to an entire community of d/Deaf/HoH folks. In terms of where the industry is headed, there's a huge amount of work still to be done in making games more accessible. My hope is that by sharing guides like this, I'm doing my part to help that work see the light of day, opening up more games to more people.
Cheers 👋🏼
Arman