Focusable elements Must

All interactive elements must be focusable and non-interactive elements must not be focusable.

Some people may only use a keyboard, switch device or voice control for navigation and input. For example, astronauts in space struggle to use a mouse, track pad, or touch screen because these require gravity.

In order to operate an interactive element, a user must first be able to move focus to the element via any input device (keyboard, mouse, touch, voice, switch device etc.).

Focus is defined differently depending on the platform. For example, on iOS keyboard focus is limited to elements that support keyboard input, while VoiceOver focus also requires accessibility to be enabled and proper frames to be defined.

Examples

iOS

Ensure accessibility is enabled for all content that has meaning or functionality. Assistive technologies provide focus to all objects that have accessibility enabled. This standard supports input methods based on iOS platform-level assistive technologies such as AssistiveTouch, Switch Control and VoiceOver. It relates back to the principle of supporting platform accessibility settings and features.

Android

Ensure all active elements can receive focus from assistive technology and accessible input methods by setting the focusable attribute for the field to ‘true’.
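For example, a minimal sketch of a layout entry with focus explicitly enabled (the id and string resource here are illustrative):

```xml
<!-- android:focusable="true" allows assistive technology and
     alternative input methods to move focus to the control -->
<Button
    android:id="@+id/submit_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:focusable="true"
    android:text="@string/submit" />
```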

HTML

Use standard HTML elements, which provide keyboard access automatically. For simulated elements, ensure keyboard focus is supported by setting tabindex.

For inactive elements, ensure focus is disabled for controls that support the disabled attribute.
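The two points above can be sketched as follows (a hypothetical simulated button alongside native controls):

```html
<!-- Native elements are keyboard-focusable by default -->
<button type="submit">Send</button>

<!-- A simulated control needs tabindex="0" to join the focus order
     (plus a role and key handling to be fully operable) -->
<div role="button" tabindex="0">Send</div>

<!-- An inactive control: the disabled attribute removes it from the focus order -->
<button type="submit" disabled>Send</button>
```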

Testing procedures

iOS

  1. Activate a screen reader.
  2. Verify that each actionable object can be accessed directly (by touch) and appears in the focus order of the view.
  3. Verify that each actionable object can be focused with a screen reader by navigation (swipe gestures).

Android

  1. Activate a screen reader.
  2. Verify that each actionable element can be navigated to directly (by touch, Android 4+).
  3. Verify that each actionable element can be navigated to using the keyboard or d-pad.

Testing results

iOS

The following checks are true:

  • Each actionable object can be accessed directly (by touch) and appears in the focus order of the view;
  • Each actionable object can be focused with a screen reader via swipe gestures.

Android

The following checks are true:

  • Each actionable element can be focused directly (by touch);
  • Each actionable element can be focused/navigated to using the keyboard or d-pad.

Note: Android OS versions below 4 do not trap touch events, so users of some assistive technologies such as TalkBack cannot directly touch the screen without activating elements. Use a d-pad, trackball or keyboard to test the sequence order on all Android OS versions.

Keyboard trap Must not

There must not be a keyboard trap.

If using a keyboard or other non-pointer input, user focus must be allowed to progress and not become trapped. All focusable elements must be accessible.

Any modal components that open from a user action should keep focus within the component and must provide a means to close or dismiss the component, which would return focus to the trigger element. For example, on-screen keyboards, information panels, or full-screen media.

Any menu or drawer component that opens from a user action may follow the modal pattern, or may automatically close or dismiss the component and return focus to the trigger element after the user moves focus onward from the last element. For example, a drop-down menu, side-drawer menu, or accordion panel.

Examples

iOS

Controls should freely receive focus and allow the user to move to another control. The user must be able to move focus into and away from the control as much as needed while the user is reviewing the screen contents.

On-screen keyboards or other pop-up components that appear when a control gains focus should be dismissible to ensure the user can navigate to other controls that may have been covered by the objects.

Android

Controls should freely receive focus and allow the user to move to another control. The user must be able to move focus into and away from the control as much as needed while the user is reviewing the screen contents. On-screen keyboards or other on-screen components that appear when a control gains focus should be dismissible to ensure the user can navigate to other controls that may have been blocked by on-screen elements.

Note: HTML content within an app (banner ads, Terms and Conditions, maps etc.) can create a focus trap for screen reader users on Android 4 or earlier, where support for HTML is poor.

HTML

Do not trap focus via JavaScript, onBlur, onChange, onFocus, or other custom focus code, or using embedded elements that may trap focus.
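Where focus must legitimately stay within a component, such as a modal dialog, the trap should be deliberate, escapable, and should return focus to the trigger. A minimal sketch, assuming a simple modal; the nextTrappedIndex and trapFocus names are hypothetical, not a library API:

```javascript
// Pure helper: given the index of the focused item, wrap Tab and
// Shift+Tab around the set of focusable elements inside the modal.
function nextTrappedIndex(current, count, shiftKey) {
  if (shiftKey) {
    return current <= 0 ? count - 1 : current - 1;
  }
  return current >= count - 1 ? 0 : current + 1;
}

// DOM wiring: cycle focus inside the modal; Escape dismisses the
// modal and returns focus to the element that opened it.
function trapFocus(modal, trigger) {
  const items = modal.querySelectorAll(
    'a[href], button:not([disabled]), [tabindex="0"]'
  );
  modal.addEventListener('keydown', (event) => {
    if (event.key === 'Escape') {
      modal.hidden = true;
      trigger.focus(); // never leave the user stranded
      return;
    }
    if (event.key !== 'Tab') return;
    const current = Array.prototype.indexOf.call(items, document.activeElement);
    items[nextTrappedIndex(current, items.length, event.shiftKey)].focus();
    event.preventDefault();
  });
}
```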

Testing procedures

  1. Activate a screen reader.
  2. Navigate to an actionable object, element, or control.
  3. Attempt to navigate away from the item via a navigation method when focus is on the item.
  4. Ensure that the focus moves out of the item.
  5. If focus does not move out of the item with the standard gesture or method, ensure that a method for moving the focus away from the item is described visually and by a screen reader.

Testing results

The following checks are true:

  • Objects, elements, or controls can be navigated away from, through, or over with a standard navigation method;
  • A method to navigate away from the item is described in a visible fashion and through a screen reader and the method works to move focus past or over the keyboard trap.

Content order Must

Content order must be logical.

All users benefit when content is logically ordered, in particular users of assistive technology that follows the flow of the page or screen.

Assistive technology such as screen readers will read through a page or screen in content order, regardless of the layout. However, expert users may jump between elements such as headings and move forward or backward from that point.

Examples

iOS

iOS sets reading order internally based upon screen layout. To change the order, developers must override accessibilityElementAtIndex:, accessibilityElementCount and indexOfAccessibilityElement:, which make up the UIAccessibilityContainer informal protocol. All elements presented by the container must be direct subviews of the main view, but it can contain other views that conform to the protocol, so developers can implement an element ordering of their own choice.

Android

The ordering of the focus movement is based on an algorithm that finds the nearest neighbour in a given direction. If the default algorithm does not match the intended behaviour, explicit overrides can be provided using XML attributes in the layout file:

  • nextFocusDown
  • nextFocusLeft
  • nextFocusRight
  • nextFocusUp

or use setNextFocusDownId() and setNextFocusRightId(), etc.
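For example, a hypothetical two-field layout where the default nearest-neighbour order is overridden so focus moves directly between the two fields:

```xml
<EditText
    android:id="@+id/first_name"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:nextFocusDown="@+id/last_name" />

<EditText
    android:id="@+id/last_name"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:nextFocusUp="@+id/first_name" />
```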

HTML

Screen reader software renders content in the order in which it appears in the document object model (DOM). Place content in the DOM in the correct order, either through source code or by inserting nodes into the DOM. If content must be presented visually in a different order from the logical reading order, use CSS to change the position of the content.
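For instance, a sidebar can be displayed before the main content while remaining after it in the DOM, so screen readers still reach the main content first (inline styles are used here only to keep the sketch short):

```html
<div style="display: flex;">
  <!-- First in the DOM: announced first by screen readers -->
  <main style="order: 2;">Main content</main>
  <!-- Second in the DOM, but displayed on the left via the CSS order property -->
  <aside style="order: 1;">Related links</aside>
</div>
```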

Testing procedures

  1. Activate screen reader software.
  2. Navigate using standard commands for next and previous.
  3. Verify that the content is announced in a meaningful sequence.

Testing results

The following check is true:

  • The content is announced in a meaningful sequence.

Focus order Must

Actionable content must be navigable in a meaningful sequence.

Mouse or touch users determine the order in which they interact with actionable elements. Keyboard and screen reader users depend on the focus order provided by the content. For example, navigating a form can be disorientating if the sequence jumps between unrelated elements.

Content order will normally dictate focus order. However, this may not always be the case. Actionable content must follow a logical sequence that will maintain the meaning and operation of the content.

Consideration should be given to anything that programmatically moves the focus for customised or enhanced features, e.g. menu drawers that close after moving focus onward from the last item.

Examples

iOS

Focus order in iOS is determined by the physical layout of controls/views. Developers can reorder code, or control and change focus order using the UIAccessibilityContainer protocol if they need to.

Android

The focus order falls to the nearest neighbouring focusable element by default. Sometimes this may not match what is intended, and as such explicit overrides may need to be provided.

This can be done by using the following:

  • android:nextFocusDown defines the next view to receive focus when the user navigates down;
  • android:nextFocusLeft defines the next view to receive focus when the user navigates left;
  • android:nextFocusRight defines the next view to receive focus when the user navigates right;
  • android:nextFocusUp defines the next view to receive focus when the user navigates up.

HTML

The following will ensure a logical focus order:

  • Code according to tab order;
  • Be aware tabIndex (positive, '0' and negative) may not be supported in mobile browsers;
  • Do not use tables for layout purposes.

Testing procedures

  1. Activate the application with a screen reader.
  2. Navigate through the active on-screen object, elements, and controls.
  3. Verify that the focus order is equivalent to the intuitive visual reading order of the page.
  4. Select radio buttons, checkboxes and other actionable objects, elements, and controls.
  5. If additional items appear or become enabled, determine whether these items come later in the focus order. Newly appearing fields should appear later in the focus order.
  6. Ensure focus moves forward and backward in an intuitive manner.

Note: Android has a focus emulator that can be used in the absence of a directional controller.

Testing results

The following checks are true:

  • The focus order is equivalent to the intuitive visual reading order of the page;
  • When additional items appear or become enabled, these items appear after the item that activated them;
  • Focus moves forward and backward in an intuitive manner.

User interactions Must

Actions must be triggered when appropriate for the type of user interaction.

Users will use a variety of input methods, sending different signals that can be listened for programmatically to trigger actions. This could be moving a mouse, touching a screen or pressing a key. It could also be using other controllers or assistive technology to mimic these interactions.

For mouse, touch and other pointer-style interactions the most appropriate trigger will be a high level “click” event or an event at the end of the interaction. This allows users to change their mind and adjust focus, without being forced to commit to an action until the mouse button is released or the touch is lifted.

For keyboard style interactions the most appropriate trigger will be a high level “keypress” event or an event at the start of the interaction. These users have already chosen focus.

Which trigger is most appropriate may vary for some interactive content.

Examples

iOS

By default, standard objects do not respond when first touched; they respond only when the touch is lifted from the object. This is therefore mostly an issue for custom controls that implement multiple touches. Controls that handle multiple touches have the ability to respond to and take action upon receiving touchesBegan:withEvent:. Developers should avoid taking action in this method.

A multiple-touch-enabled view that does not take action upon receiving the touchesBegan:withEvent: method is preferred over one that takes immediate action as soon as the user touches the display.

Android

Care must be taken to avoid triggering touch controls on ACTION_DOWN, and instead trigger them on ACTION_UP.

On pointing devices with source class SOURCE_CLASS_POINTER, such as touchscreens, the pointer coordinates specify absolute positions such as a view's x and y coordinates. Each complete gesture is represented by a sequence of motion events with actions that describe pointer state transitions and movements. A gesture starts with an ACTION_DOWN motion event that provides the location of the first pointer touch. As each additional pointer goes down or up, the framework generates a motion event with ACTION_POINTER_DOWN or ACTION_POINTER_UP accordingly. Pointer movements are described by ACTION_MOVE motion events. A gesture ends either when the final pointer goes up, as represented by the ACTION_UP motion event, or when the gesture is cancelled with ACTION_CANCEL. For more information, reference the Android Developer Reference – MotionEvent.

Each key press on the Android platform is described by a sequence of key events. A key press starts with the ACTION_DOWN key event. If the key is held sufficiently long that it repeats, the initial down is followed by additional ACTION_DOWN key events and a non-zero value for getRepeatCount(). The last key event is an ACTION_UP for the key up. If the key press is cancelled, the key up event will have the FLAG_CANCELED flag set. For more information, reference the Android Developer Reference – KeyEvent.

HTML

If not defaulting to onClick or onKeyPress events, use onTouchEnd or onKeyDown event handlers to trigger actions, rather than onTouchStart and onKeyUp.
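A sketch of this trigger choice, with the decision separated from the DOM wiring (the shouldTrigger and wireControl names are illustrative):

```javascript
// Pure helper: pointer interactions trigger on release (click,
// touchend); keyboard interactions trigger on press (keydown).
function shouldTrigger(eventType) {
  return ['click', 'touchend', 'keydown'].includes(eventType);
}

// DOM wiring: 'click' fires on release for mouse and touch alike,
// so binding it (rather than touchstart) gives end-of-interaction
// triggering; 'keydown' fires at the start of a key press.
function wireControl(button, action) {
  button.addEventListener('click', action);
  button.addEventListener('keydown', (event) => {
    if (shouldTrigger(event.type) && (event.key === 'Enter' || event.key === ' ')) {
      action(event);
    }
  });
}
```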

Testing procedures

  1. Navigate the app using the touchscreen.
  2. Navigate to the on-screen objects, elements, or controls.
  3. Begin to activate an item (touch it without lifting your finger or stylus).
  4. Verify that the item does not immediately trigger an action/event.
  5. Finish activating the item (remove your finger or stylus from the screen).
  6. Verify that the item now triggers the action/event.

Testing results

The following checks are true:

  • Objects, elements, or controls do not trigger actions/events at the start of activation (when touched);
  • Objects, elements, or controls trigger actions/events when the user finishes activation (touch is removed).

Alternative input methods Must

Alternative input methods must be supported.

Some users do not use the input control provided with a device, such as the touch screen or mouse. Instead, they may use a switch device, keyboard or braille display.

Alternative methods of input and navigation that work with the platform must be supported to facilitate the needs of the user.

Interactive content must not rely on a single input method. For example, a carousel must not support only touch interaction, it must also support alternative inputs via visible focusable elements.

Examples

iOS

The Accessibility API provides alternative input methods for standard touch events. Focus control to elements is provided via the isAccessibilityElement property - this property should be set to YES to allow accessible input. VoiceOver also has some special gestures to consider, in particular the Escape and Magic Tap gestures. Refer to Apple Developer - Supporting Accessibility.

Android

Developers must ensure that all active elements can receive focus from assistive technology and alternative input methods. This can typically be accomplished by setting the focusable attribute for the field to true. For editable or read-only custom text elements that are developed by extending standard text elements you must ensure a system caret is set to indicate focus for the element.

HTML

Any HTML gestures must be supplemented with standard navigation methods such as keyboard focus, key presses, links, buttons, or other controls. For example, a drag and drop operation may be supplemented by select elements allowing the user to choose multiple combinations.
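For example, a drag and drop ranking gesture might be supplemented with a plain form (the field names here are hypothetical):

```html
<!-- Fallback for a drag and drop ranking widget -->
<label for="first-choice">First choice</label>
<select id="first-choice" name="first-choice">
  <option>Option A</option>
  <option>Option B</option>
</select>
<button type="submit">Save order</button>
```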

Testing procedures

  1. Activate a screen reader and physical keyboard.
  2. Identify the active screen objects, elements, and controls.
  3. Ensure that all items can be navigated to via alternative input methods.
  4. Ensure that the items can be activated via alternative input methods.
  5. Activate the item.
  6. For items with complex functionality, check for equivalent methods of action support, such as using the arrow keys instead of swipe up and down gestures to move a slider.

Testing results

The following checks are true:

  • Objects, elements, and controls can be navigated to via alternative input methods;
  • Items can be activated and manipulated via alternative input methods.