Here's a quick post on screen readers and React Native apps. I worked on my first ever accessibility bug this past week and was introduced to the world of screen readers. I've still got a long way to go in terms of learning the ins and outs of how to design and create mobile apps that are accessible, but it's nice to finally dip my toes into the water. Accessibility is something I'm really keen to learn more about, so I'm hoping that I'll get the chance to improve my knowledge on this topic and blog more about it in the coming months.
The bug reported was that our latest app release stopped a user of iOS's VoiceOver feature (the built-in screen reader) from progressing through a particular screen. I was assigned the ticket, and whilst bewildered at first, I managed to work it out fairly quickly once I understood the basics of how the React Native API interacts with screen readers.
Here are the steps that I took:
- I don't use Xcode to develop the app. I do, however, use the iPhone Simulator that comes with it. Unfortunately, the Simulator does NOT allow for testing with VoiceOver. [Side note: I did try to use the macOS VoiceOver feature on the Simulator window, but this was painful at best and not recommended.]
- Luckily, I use an iPhone myself, so I decided to do an Xcode build and run the development app on my real device. I then switched on VoiceOver in my phone's settings. The key thing to be aware of here is that you'll immediately need to use the new gestures to navigate your phone. They're simple, but frustrating to get used to if you've never used a screen reader before:
- Swipe left or right to move from one accessible component to the next.
- When focused on a component, double-tap to "select" it.
- With the screen reader on, I was able to replicate the bug. The issue was that the entire screen's contents were grouped as a single accessible component, i.e. none of the child components could be individually selected by swiping.
- With no real idea where to start, the React Native docs seemed a good bet. I didn't realise it immediately, but the first line was key:
[`accessible` prop] "When true, indicates that the view is an accessibility element. When a view is an accessibility element, it groups its children into a single selectable component. By default, all touchable elements are accessible."
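To illustrate what that grouping means in practice, here's a minimal sketch (the component names, text, and structure here are hypothetical, purely for illustration):

```jsx
import React from 'react';
import { View, Text } from 'react-native';

// With accessible={true}, the screen reader treats the whole View as
// ONE element: a single swipe lands on it, and its children are read
// together rather than being individually focusable.
const Grouped = () => (
  <View accessible={true}>
    <Text>First line</Text>
    <Text>Second line</Text>
  </View>
);

// Without the accessible prop on the wrapper, each Text is its own
// accessibility element, so the user can swipe to each one in turn.
const Individual = () => (
  <View>
    <Text>First line</Text>
    <Text>Second line</Text>
  </View>
);
```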
- I spent some time investigating the recent changes we had been making to the various components on the screen in question. We had been doing a lot of app-wide refactoring work recently, so the pro was that I knew it had something to do with that, but the con was not knowing which component was the culprit. Not appreciating what the root cause might be at first, I tried adding accessibility labels and props to various child components, to no avail.
Re-reading the docs, it suddenly clicked. One of my team members had recently refactored our base `Screen` component that wraps the content of each of our screens. (The `Screen` component basically styles all screens to a default design, with safe areas and keyboard-avoiding views set as standard.) One of the core components introduced was `TouchableWithoutFeedback`. As per the docs, "By default, all touchable elements are accessible". What this meant was that all child components of this component (i.e. the entire screen's contents!) were grouped into a single selectable component, so individual components could no longer be "selected". Ding ding ding, we have a winner.
The solution? A simple addition of an `accessible={false}` prop did the trick.
```jsx
// Old version
<TouchableWithoutFeedback onPress={Keyboard.dismiss}>
  <View style={styles.view}>{props.children}</View>
</TouchableWithoutFeedback>

// New version
<TouchableWithoutFeedback accessible={false} onPress={Keyboard.dismiss}>
  <View style={styles.view}>{props.children}</View>
</TouchableWithoutFeedback>
```