Re: Amateur programmer, looking to create accessible programs

Hi all,

As Luke and others pointed out, there are lists dedicated to programming, and in the case of NVDA specifically, there is the NVDA Development list. Both lists are good, but they alone won't solve the issue the original poster is asking about, and you can indeed create accessible GUI-based applications (although it may sound counterintuitive, NVDA itself is an accessible GUI-based application if you think about it). What I'm going to write comes from my experience as a programmer who spent years working on screen readers and has been advocating for accessibility and usability (like Luke, I'm blind, although I was a low vision user until my early teens, in the early 2000s):

As you may know, the first task of programming is finding a problem to solve. The fact that you wish to write an accessible app is notable: it suggests you have found issues you want to solve by writing apps, while thinking about app design at the same time. I'll assume that first task is done, so let's move on to design and accessibility.

Accessibility is about designing products so they are approachable by different audiences (the task of actually using such products with help from assistive tools falls under "usability"). The question to ask when designing with accessibility in mind is, "what limitations and workarounds do specific audiences have, and how can the product help bridge the gap?" For people with disabilities, this comes down to the limitations of specific disabilities and the tools that can expose your product's functionality to them: for blind people, the obvious choice is tools that help folks "see" screen content, i.e. screen readers, magnifiers, color contrast adjustments, and so on; for deaf communities, text to convey sounds, sign language output, and the like. You would then look for a way to make your program expose the needed information so audiences (users) can use your product effectively, and one common approach is using accessibility APIs to communicate that information to users of assistive technologies.

In GUI programming (something blind people can do, with assistance if required), one designs how data is represented: specific GUI controls for things such as text, forms, and many others. Although things may look colorful and intuitive to the majority (what counts as the "majority" depends on language, country, and culture), without effort from humans and tools (along with the right mindset), the product will not be discoverable (word of mouth, reviews, etc.), approachable (promotion, demos, etc.), or accessible. These three must work together where accessibility is concerned, because people with disabilities are among the most neglected communities when it comes to access to information (what I would term an "information blackout"), although that is changing.
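
To make this concrete, here is a minimal wxPython sketch (wxPython being the Python binding for wxWidgets, a toolkit I mention again below). It assumes wxPython is installed; the window and field names are my own invention. Standard controls like these come with built-in accessible roles, so screen readers can identify them with little extra work:

    # Minimal wxPython sketch: standard controls carry accessible roles for free.
    import wx

    class ContactForm(wx.Frame):
        def __init__(self):
            super().__init__(None, title="Contact form")
            panel = wx.Panel(self)
            sizer = wx.BoxSizer(wx.VERTICAL)
            # A StaticText placed right before the control it describes lets
            # screen readers associate the two; "&" adds a keyboard mnemonic.
            sizer.Add(wx.StaticText(panel, label="&Name:"), 0, wx.ALL, 5)
            sizer.Add(wx.TextCtrl(panel), 0, wx.EXPAND | wx.ALL, 5)
            sizer.Add(wx.Button(panel, label="&Submit"), 0, wx.ALL, 5)
            panel.SetSizer(sizer)

    if __name__ == "__main__":
        app = wx.App()
        ContactForm().Show()
        app.MainLoop()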

So, to enhance how the product is perceived by people with disabilities and to make it accessible (and usable), APIs such as Microsoft Active Accessibility (MSAA), UI Automation (UIA), and IAccessible2 were created to help programmers design products with accessibility and usability in mind. These APIs involve at least three parts:

  1. Client/consumer: an assistive technology such as a screen reader (including NVDA) is an accessibility client. The job of a client is to ask the accessibility API for information about the control a user is working with, and to perform actions that help people use applications, such as reading state changes aloud (see the client sketch after this list).
  2. Server/producer: the application in question is a server because it serves clients by exposing crucial information for use by different assistive technologies. For screen readers, this means providing text labels for graphical buttons and using facilities such as accessibility events to announce activities like screen content changes. How that information is presented to users is the job of the client (the assistive technology), and it is up to users what to do with information coming from the app.
  3. Accessibility bridge: APIs such as MSAA and UIA serve as a bridge between servers (apps) and clients (assistive technologies). The job of an accessibility bridge is to act as a "middle man" between users and apps by exposing server-side information (whatever the app says) in a way clients can understand, process, and present to users. At the same time, bridges accept interaction tasks (such as keyboard input) from users, communicate them to applications, and relay back what the app says.

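If you want to see the client side of this relationship from Python, a minimal sketch follows. It uses the third-party pywinauto library (an assumption on my part: pip install pywinauto, Windows only) to walk the same UI Automation tree a screen reader reads:

    # A minimal UIA client sketch using the third-party pywinauto library.
    # It plays the "client" role described above: asking the accessibility
    # API what is on screen, much as NVDA does.
    from pywinauto import Desktop

    # Enumerate top-level windows through the UI Automation backend and
    # print the control type and name each one exposes to clients.
    for window in Desktop(backend="uia").windows():
        info = window.element_info
        print(f"{info.control_type}: {info.name!r}")
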
A basic grasp of accessibility concepts is one step toward improving app accessibility (the first obvious step is understanding the culture the target audience comes from, a task you have accomplished well based on the original post). The next task is actually using assistive technologies and apps to better understand what folks are talking about. After that, it comes down to designing programs in a way that is accessible to diverse audiences, such as adding labels to GUI controls and using accessibility APIs to expose needed information (if using GUI toolkits, I recommend ones known to have high accessibility marks, such as wxWidgets and more recent versions of Qt and WinUI/XAML); a small illustration follows below. And don't forget to test your ideas with target audiences (testing, gathering feedback, etc.) early, because it is more costly to "improve" accessibility later.
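
As that illustration of the labeling advice, here is a hedged wxPython sketch that gives an image-only button a textual name. Exactly which property each platform surfaces through the accessibility API varies, so the real test is running NVDA against it:

    # wxPython sketch: naming a graphical button so screen readers can announce it.
    import wx

    class EditorWindow(wx.Frame):
        def __init__(self):
            super().__init__(None, title="Editor")
            panel = wx.Panel(self)
            # A stock icon so the sketch runs without external image files.
            bmp = wx.ArtProvider.GetBitmap(wx.ART_FILE_SAVE, wx.ART_BUTTON)
            save = wx.BitmapButton(panel, bitmap=bmp)
            save.SetName("Save")     # window name, surfaced to accessibility APIs
            save.SetToolTip("Save")  # tooltip, a common fallback for the accessible name
            save.Bind(wx.EVT_BUTTON, self.on_save)

        def on_save(self, event):
            wx.MessageBox("Saved.", "Editor")

    if __name__ == "__main__":
        app = wx.App()
        EditorWindow().Show()
        app.MainLoop()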

Before I close, one thing you may wish to ponder: if you think carefully about it, NVDA and its screen reader friends are sophisticated data processors. Their job is to gather the information blind people need with help from facilities provided by the operating system, accessibility APIs, apps, and standards; process the gathered information into a form suitable for presentation through multiple channels (speech, braille, sound, etc.); and present it to users. That's the core of screen readers, and when folks talk about screen reader development, we are talking about refining these elements (supporting newer accessibility standards, dealing with apps that have no control labels, supporting text-to-speech engines and braille displays, keeping an eye on operating system changes, etc.). Of course, folks can customize screen readers to their liking (settings, code, add-ons, etc.). At the same time, app accessibility and usability are the responsibility of app developers, and things get better when they collaborate with users (this is why I always ask users to send feedback to developers to point out possible accessibility improvements).
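
To make that "data processor" picture concrete, here is a deliberately simplified Python sketch of the gather/process/present loop. Every name in it is hypothetical; it stands in for, rather than reproduces, how any real screen reader is built:

    # Conceptual sketch only: hypothetical names, not NVDA's actual code.
    from dataclasses import dataclass

    @dataclass
    class AccessibleEvent:      # what an accessibility API might report
        role: str               # e.g. "button", "checkbox"
        name: str               # the control's label, if the app set one
        state: str              # e.g. "focused", "checked"

    def process(event: AccessibleEvent) -> str:
        # Turn raw API data into user-facing text; unlabeled controls are
        # exactly where screen readers must guess or stay silent.
        return f"{event.name or 'unlabeled'} {event.role}, {event.state}"

    def present(message: str) -> None:
        # Stand-in for the real output channels: speech, braille, sounds.
        print(message)

    present(process(AccessibleEvent("button", "Save", "focused")))
    # -> Save button, focused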

Hope this helps.

Cheers,

Joseph
