Alternatives for audio and visual content Must

Alternative delivery, such as subtitles, sign language, audio description and transcripts, must be provided with embedded media when available.

Some users may not be able to hear audible content; others may not be able to see visual content. Providing an alternative delivery format, in addition to the media itself, supports comprehension. Multiple delivery formats also help users with cognitive impairments.

This guideline applies to all forms of audio and visual content, regardless of length or format, as appropriate for the content:

  • Subtitles must be provided where feasible: when they were included with a pre-recorded broadcast, or when the content is for public-facing corporate communications, employment or suppliers;
  • Audio described or sign language versions must be provided if they were included with an original broadcast;
  • For interactive content, such as e-learnings, narrative and instructions must be available both visually and audibly, for example using subtitles;
  • In addition to other alternative delivery, transcripts can be provided for all types of audio and visual content.

Examples

iOS

iOS provides support for multiple audio tracks (which can be used for audio description) and for captions when using the standard video playback views and a supported codec. Note that captions may not be available in full-screen mode. Use the Apple media player with closed or open captions, and support audio description.

Android

Android does not handle multiple audio tracks or captions automatically, so developers must build support for these features into their application using a codec supported by the Android MediaPlayer API or a custom player. Incorporate closed captions and audio description into multimedia video.

HTML

Use a media player that supports captions/subtitles, and provide correctly labelled controls to turn the captions/subtitles on and off. Flash is not supported on many devices, so an HTML5-based media player will likely be needed to deliver captions and audio description via HTML.

There are several options for delivering captions and audio description, including different caption formats; the caption format available may depend on the media player used. Audio description can be delivered either as synchronised text (not currently supported), as a secondary audio file, or as a secondary video with an enhanced audio track.
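As an illustrative sketch (file names are assumptions), an HTML5 video offering closed captions and a synchronised text description track might look like:

```html
<video controls>
  <source src="clip.mp4" type="video/mp4">
  <!-- Closed captions in WebVTT; the user can toggle them from the player controls -->
  <track kind="captions" src="clip-captions.vtt" srclang="en" label="English captions" default>
  <!-- Synchronised text descriptions; browser support for speaking these is currently limited -->
  <track kind="descriptions" src="clip-descriptions.vtt" srclang="en" label="Audio description">
</video>
```

Because synchronised text description is not yet widely supported, a secondary audio file or an alternative video with an enhanced audio track remains the more reliable delivery method.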

An accessible media player should be used for all embedded video or audio content.

For other interactive media use visible text and cues to support audio narrative or instructions, and ensure this is provided as text or as a text alternative so it can be spoken by screen readers.
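For example (a sketch; the icon file name is an assumption), an on-screen instruction can pair a visible cue with real text and a text alternative:

```html
<!-- The instruction is real text, so screen readers can speak it;
     the visual cue carries a text alternative via its alt attribute -->
<p>Select the play button <img src="play-icon.png" alt="play button"> to begin.</p>
```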

Testing procedures

The key aim is to verify that:

  • any audible content necessary for understanding is provided visually;
  • any visual content necessary for understanding is provided in a form that can be heard.
  1. Activate the app or webpage.
  2. Locate media.
  3. Determine if the media has audio content that contains important information - such as a spoken narrative.
  4. Check that any audible information necessary for understanding the media is also provided via subtitles or open/closed captions synchronised with the audio.
  5. Determine if the media has visual content that contains important information - such as a sign or new character entering.
  6. Check that any visual information necessary for understanding the media is either described as part of the audio or provided through a separate, synchronised audio description track. This may be via a screen reader where appropriate.

Testing results

The following checks are true:

  • Media provides subtitles/open or closed captions that are synchronised with any audio content containing important information;
  • Visual content necessary for understanding the media is described in audio that is synchronised with the video content (audio description), or, where appropriate, is provided as textual content for a screen reader.

Autoplay Must not

Audio must not play automatically unless the user is made aware this will happen or a pause/stop/mute button is provided.

Audio in AV and interactive content can be disruptive for screen reader users because it can conflict with and speak over the screen reader. Unexpected audio may also distress users with cognitive or sensory sensitivity.

Users should be given a choice to opt in for auto-playing content audio. Where a pause/stop/mute button is provided instead, it must be fully and immediately accessible.

Where playback automatically continues to the next content item, this must be indicated in an accessible way, with a choice to opt out and sufficient time to do so.

User preferences should persist.

Examples

iOS

Provide a button or link, alongside the playback controls, that starts the audio. Do not call the play method directly after loading the audio unless playback was initiated by the user.

Android

Video content should open in its own screen using the native player. No action is needed for autoplay, as Android automatically lowers the playback volume so that TalkBack users can hear speech output. Consider forewarning new users.

HTML

Do not play audio automatically unless there is a setting where the user can opt in before content is autoplayed or the user is forewarned.
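As a minimal sketch (the file name is an assumption), omit the autoplay attribute so playback starts only on user request, and expose controls so the audio can be paused or muted:

```html
<!-- No autoplay attribute: the user opts in by pressing play.
     controls exposes play/pause and mute; preload="none" avoids fetching the audio upfront -->
<audio src="announcement.mp3" controls preload="none"></audio>
```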

Testing procedures

  1. Activate a screen reader.
  2. Locate audio content that plays automatically.
  3. Check that the containing page does not play audio automatically when it loads while a screen reader is active, or that the user is pre-warned and a control is provided to stop or pause the audio.

Testing results

The following checks are true:

  • Audio content does not play automatically when a screen reader is running; or
  • The user is pre-warned that audio will play automatically, and a control is provided to stop or pause the audio.

Metadata Should

Relevant metadata should be provided for all media.

Metadata (information about an item) can help people find what they require. Metadata provided with media content can help users understand the media and locate alternative versions.

Relevant metadata might include duration, and the presence of subtitles, sign language, or audio description.

Examples

iOS

Not applicable.

Android

Not applicable.

HTML

Metadata can be used to associate accessible alternate versions of Web pages with pages or content that does not meet the standards and guidelines. It can also be used to locate and describe alternative media. Use the link element with rel and type attributes.
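As a sketch (the URLs are assumptions), link elements in the head of the page can point at an accessible alternative version and a transcript:

```html
<head>
  <!-- Hypothetical URLs pointing at an accessible alternative page and a transcript -->
  <link rel="alternate" type="text/html" href="/media/accessible-version" title="Accessible version">
  <link rel="alternate" type="text/plain" href="/media/transcript.txt" title="Transcript">
</head>
```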

Testing procedures

  1. Locate media.
  2. View the source of the page.
  3. Verify that metadata is present in the head section of the page and indicates where alternatives to the language or the media are located.

Testing results

The following checks are true:

  • Correct metadata is provided for media.
  • Correct metadata is provided for language.

Volume control Should

Separate volume controls should be provided for background music, ambient sounds, narrative and editorially significant sound effects.

Separate volume controls, in addition to the mute control, should be provided in settings for interactive media, such as e-learnings, to minimise the risk of sensory overload for users with audio sensitivity:

  • Users with cognitive impairments that include audio sensitivity need to be able to minimise the risk of sensory shock.
  • Users with mild to moderate hearing impairment may need to adjust different audio elements to hear the narrative speech clearly.
  • Screen reader users need to be able to hear the screen reader over the sounds within interactive media.

Examples

iOS

Information to be confirmed.

Android

Information to be confirmed.

HTML

Information to be confirmed.
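In the absence of confirmed guidance, one possible HTML sketch (file names are assumptions, and delivering the layers as separate files is a design choice rather than an established pattern) exposes each audio layer as its own element so it can be adjusted or muted independently:

```html
<!-- Each layer is a separate element with its own native volume and mute controls -->
<h3>Narration</h3>
<audio src="narration.mp3" controls></audio>
<h3>Background music</h3>
<audio src="music.mp3" controls></audio>
<h3>Sound effects</h3>
<audio src="effects.mp3" controls></audio>
```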

Testing procedures

  1. Locate interactive media that includes audio.
  2. Activate a screen reader.
  3. Verify that all sound can be muted independently of the screen reader and that, where appropriate, the volume of different audio elements can be adjusted independently.

Testing results

The following checks are true:

  • All sound can be muted independently of the screen reader;
  • Where appropriate the volume of different aspects of audio can be controlled and modified independently.

Audio conflict Should not

Narrative audio in games, e-learnings or interactive media should not talk over or conflict with native assistive technology.

In order to interact with embedded media, users need to perceive the editorial narrative and/or instructions.

If the embedded media is self-voicing content, then this should be hidden from the screen reader. If the embedded media is providing content to the screen reader, then this should not be self-voicing.

Note: Further research is currently being conducted to ascertain user preferences.

Examples

iOS

Use the starts-media-session trait to silence the audio output of assistive technology, such as VoiceOver, during a media session that should not be interrupted. For example, use this trait to silence VoiceOver speech while the user is recording audio.

This trait may not always be appropriate; some situations require VoiceOver speech during a media session with audio.

Android

Android automatically lowers the volume of playback so that TalkBack users can hear speech output.

HTML

Information to be confirmed.

Testing procedures

  1. Locate media.
  2. Enable a screen reader.
  3. Ensure the screen reader can be heard and does not clash unnecessarily with any audio in the media.

Testing results

None documented