Access to information has become a fundamental human right, and the internet, being what it is, has become a bastion of information. The creation of web pages and web apps has been the creative domain of programmers and software engineers, of whom 1.7% are blind (Stack Overflow Survey 2018). This blog aims to look into that part of the developer community.
In the 1980s, the screen reader became a prevalent tool for developers with limited or no sight after it was introduced as a tested and proven tool at IBM by Jim Thatcher, a mathematician at the company. Like all great problems, this one had a personal connection to him.
Thus, he set out to solve this problem and came up with a solution (read about that process here). Over time at IBM, screen readers, or “talkies” as they were known, evolved with help from blind staff members who served as beta testers. At the time, the technology’s ability to improve accessibility was not seen as marketable (remember, this was the 80s), but that ability was well proven by the beta testers and then by early adopters inside the company, who used it to improve access to information and computers. Gradually, screen readers became a primary tool for visually impaired developers, and the software was released as Screen Reader/2 in the mid-1990s, once it was finally deemed commercially viable.
He is a 26-year-old developer and music lover. He is also completely blind and an avid contributor to FreeCodeCamp.
He uses regular equipment with a few software adaptations. In his own words:
I like this question, because it allows me to immediately explain how blind people actually use computers.
A lot of people are under the impression that blind people require specially adapted computers in order to get anything done. Even some of my fellow Visually Impaired Persons (VIPs) tend to think this.
Well, let me debunk this myth right here and now. I am currently typing this on a normal Dell Inspiron 15r SE notebook, which can be bought in any laptop store that sells (somewhat less recent) laptops. The machine runs Windows 8 (not my personal choice, but UEFI is too much of a pain to downgrade). All I did to adapt it was install an open-source screen reader called NVDA.
A screen reader, at its most basic level — wait for it — reads the screen. It tells you the textual content of the screen with a synthesized text-to-speech, Siri-like voice. Screen readers also allow for the use of a braille display, a device that consists of a line of refreshable braille cells that can form letters according to what content is highlighted on the screen.
This is really all the adaptation a blind computer user needs. Using this program, I can do many things you probably wouldn’t imagine being able to do with your eyes closed, such as:
+Browsing the web using Firefox
+Writing up reports in Microsoft Word
+Writing up snazzy blog posts
+Recording, editing, mixing and publishing audio (my hobbies include singing and making music)
+Using audio production apps like Reaper, GoldWave, Audacity and Sonar
+Coding websites and applications using Eclipse, (the ironically named) Visual Studio, and good old Notepad++
The reason I’m naming all these mainstream technologies is to show you that I can use them just like people who aren’t ocularly challenged.
If you’re writing the next big application, with a stunning UI and a great workflow, I humbly ask you to consider accessibility as part of the equation. In this day and age, there’s really no reason not to use the UI toolkits available. It’s a lot easier than you may think. Yes, these include the Android Activities, iOS UIViews and HTML5 widgets you may be thinking of.
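As a small illustration of what “considering accessibility” can look like on the web, the sketch below contrasts a clickable div (which most screen readers announce as plain text, if at all) with a native button and an ARIA-labelled custom widget. The `save()` handler is a hypothetical placeholder, not something from the original post:

```html
<!-- Less accessible: a screen reader has no way to know this div is interactive -->
<div onclick="save()">Save</div>

<!-- Better: a native button is keyboard-focusable and announced as a button -->
<button type="button" onclick="save()">Save</button>

<!-- If a custom widget is unavoidable, ARIA attributes restore the missing semantics -->
<div role="button" tabindex="0" aria-label="Save document" onclick="save()">Save</div>
```

The general pattern is to prefer native elements first, and reach for `role`, `tabindex`, and `aria-*` attributes only when a custom widget genuinely cannot be avoided.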