Apple and Twitter have both been in the news recently because of the actions they’ve taken — or not taken — on accessibility, and the two companies are a study in contrasts.

Twitter rolled out audio tweets, a feature it claims will “add a more human touch to the way we use Twitter” by enabling users to tweet voice recordings. But is it fair to use the words “more human” and “the way we use Twitter” about a feature that excludes the 466 million people worldwide with hearing loss? No. That’s why Twitter is an example of what not to do when it comes to digital accessibility.

Twitter’s Big Misstep

Where did Twitter go wrong when releasing this new feature? It failed to build in captioning of voice tweets, whether automatic or manual. Captions are essential for people who are 1) deaf or hard of hearing; 2) in a noisy environment; 3) in a situation where audio playback would be socially rude; 4) not native speakers of the tweet’s language; or 5) living with ADHD.
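Captioning isn’t technically exotic, either. Once a transcript exists (generated automatically or written manually), pairing it with an audio clip is straightforward. As a purely hypothetical sketch — this is not Twitter’s code, and the `CaptionSegment` shape is my own invention — here is how timed transcript segments could be serialized to WebVTT, the standard caption format web browsers understand:

```typescript
// Hypothetical shape for one timed transcript segment (times in seconds).
interface CaptionSegment {
  start: number;
  end: number;
  text: string;
}

// Format a time in seconds as a WebVTT timestamp (HH:MM:SS.mmm).
function toTimestamp(seconds: number): string {
  const hh = Math.floor(seconds / 3600);
  const mm = Math.floor((seconds % 3600) / 60);
  const ss = seconds % 60;
  const pad = (n: number, width: number) => n.toFixed(0).padStart(width, "0");
  return `${pad(hh, 2)}:${pad(mm, 2)}:${ss.toFixed(3).padStart(6, "0")}`;
}

// Serialize segments into the body of a .vtt caption file.
function toWebVTT(segments: CaptionSegment[]): string {
  const cues = segments.map(
    (s, i) => `${i + 1}\n${toTimestamp(s.start)} --> ${toTimestamp(s.end)}\n${s.text}`
  );
  return ["WEBVTT", ...cues].join("\n\n");
}
```

The hard part of captioning is transcription quality, not file formats — which is exactly why building it in from the start, with a team that owns it, matters.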

Given how many people are impacted, it’s surprising Twitter didn’t anticipate the widespread criticism from the accessibility community and the negative press the announcement sparked. To make matters worse for Twitter, the incident brought to light the fact that Twitter does not have a team dedicated to accessibility (see this tweet thread), relying instead on individuals who volunteer their time to help. This, too, is surprising given how much we’ve seen other big tech firms like Adobe, Apple, Facebook, Google, and Microsoft invest in building accessibility programs and teams.

Twitter issued an apology, saying: “This is an early version of this feature and we’re exploring ways to make these types of Tweets accessible to everyone.” But the apology itself revealed Twitter’s poor understanding of accessibility. The reality is that companies where accessibility is an afterthought:

  • Damage their revenues by missing out on part of their potential customer base — millions of people on a permanent basis, in this case, and every single user on a situational basis.
  • Waste money because remediating a feature to make it accessible after the fact typically costs more than building accessibility in from the start.

Apple Continues To Get Accessibility Right

Let’s contrast Twitter’s approach with Apple’s. Apple’s commitment to diversity, inclusion, and accessibility was clear once again at its Worldwide Developers Conference (WWDC) this week. I saw it in the diverse set of speakers on stage, the closed captioning and audio descriptions available for keynote viewers, and the fact that this year’s event was free to all. But if you only watched the keynote, you missed many of the announcements that demonstrate Apple’s continued focus on digital accessibility. Here are just three of the many that I’m excited about:

  1. “Back Tap” beta feature in iOS 14. With a single, double, or triple tap on the back of an iPhone (even with a case on), a user can trigger a customizable shortcut — like opening a specific app or launching VoiceOver, Apple’s built-in screen reader technology. This feature is cool for anyone who wants to save time and steps when initiating a key task. But it’s particularly useful for people with limited upper-body mobility who may struggle to navigate iOS with other gestures like swiping, or for people with cognitive impairments who benefit from simpler paths to common tasks.
  2. Sound recognition feature in iOS 14. This accessibility feature, designed for people with hearing challenges, listens for important ambient sounds, like a door knock, siren, smoke detector, running water, or crying baby, and alerts the user with a visual and/or haptic notification on their iPhone, iPad, and/or Apple Watch. The potential lifesaving impact of features like this once again demonstrates Apple’s commitment to helping all its users.
  3. More cohesive experience across devices. With the launch of macOS Big Sur, Apple has, as always, made improvements to design. The new OS creates a more consistent experience across the Apple ecosystem so users can transition between devices more easily; one example is bringing familiar iOS symbols and icons into macOS. Apple is also focusing on reducing visual complexity. Consistency and reduced complexity help everyone, but especially users with anxiety or other cognitive challenges, who benefit most from a predictable, distraction-free experience as they move across their devices.

Three Takeaways

  1. Build in accessibility from the start. Don’t be like Twitter. Check out my blog post The Inclusive Design Imperative: Win And Retain More Customers for practical advice on what steps to build into your design process to create more inclusive experiences.
  2. Apply common standards across experiences. Apple’s announcements mean we’ll see more consistency across devices. That’s exciting because a more fluid experience means less cognitive load for users and experiences that just feel easier. I’ll be publishing research soon on design systems, a key tool to scale design decisions across experiences. In the meantime, check out my blog post You Need A Design System — Here’s Why for an overview.
  3. Educate and train your digital teams on accessibility. Give the teams designing and developing your digital experiences the training they need to create accessible experiences. For your teams working on iOS apps, check out Apple’s five accessibility-focused sessions at this year’s event. I really liked this one on creating visual experiences that are inclusive for everyone.

To hear more about what we’re excited about from Apple’s WWDC this week, check out my colleague Karine’s blog post Apple Takes An Inspiring Stand On UX, Privacy, And Diversity At WWDC 2020 and my colleague Julie’s blog post Apple WWDC 2020: The Powerful Combination Of Context And Design.