Practical tips to confidently support VoiceOver and Voice Control

Duncan Babbage, Intro Limited (@dbabbage)

Supporting all of the accessibility features of the iOS, iPadOS and macOS platforms is best practice and is good for all users. These platforms provide a lot of assistance for developers out of the box, which is great, but not everything is automatic. As developers, it’s important to verify we’re implementing accessibility features well. With features like Dynamic Type, it’s relatively easy for a developer to see when things are broken in an app: dial up a larger text size and the problems are usually obvious. In contrast, supporting VoiceOver and Voice Control can feel a lot more intimidating. As a developer with relatively intact vision, it’s easy to feel uncertain about what ‘good’ sounds like, and hard to tell whether you’ve missed better practices that could make a big difference to users who are blind or have low vision. This talk presents examples of what a great VoiceOver experience sounds like, along with practical tips for how to implement it. Each is illustrated with specific examples, drawn both from my own development and from experts in the field. You’ll go away confident you can identify and fix a number of potential issues with VoiceOver and Voice Control in your own apps, and with an overview of the new Voice Control features in iOS 13, iPadOS 13 and macOS Catalina.
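
As a flavour of the kind of adjustment the talk is concerned with (a minimal sketch, not an example drawn from the talk itself, assuming an iOS 13 deployment target and a hypothetical icon-only button), VoiceOver and Voice Control support often comes down to supplying the right accessibility metadata:

```swift
import UIKit

// Hypothetical icon-only "favourite" button; without an explicit label,
// VoiceOver may fall back to reading the image name.
let favouriteButton = UIButton(type: .system)
favouriteButton.setImage(UIImage(systemName: "star"), for: .normal)

// VoiceOver: a concise label plus an optional hint.
favouriteButton.accessibilityLabel = "Favourite"
favouriteButton.accessibilityHint = "Adds this item to your favourites."

// Voice Control (new in iOS 13): alternative spoken names that
// activate the same control.
favouriteButton.accessibilityUserInputLabels = ["Favourite", "Star"]
```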

About Duncan Babbage

Duncan Babbage, PhD, is the indie developer of Intro, an app for keeping track of things you want to remember. Duncan describes himself as a recovering clinical psychologist and recovering academic. He is available for contract iOS development and also works as an independent health services consultant.

Apple, Mac, iOS, macOS, AppleTV, iPhone, and iPad are trademarks of Apple Inc. For the avoidance of doubt, Apple Inc. has no association with this conference.

/dev/world is an AUC event.