Amateur programmer, looking to create accessible programs


arnoldsummerspbem@...
 

Hi,

I'm an amateur programmer who is sighted, but I am looking to create applications that are as accessible as possible. Are there any good resources for learning how to optimize one's programs for screen reader use? My particular focus would be NVDA, which is why I'm here. My strongest language is Python, though for various reasons I have been contemplating writing applications in C# and would prefer to use that. Naturally, I'm focusing on Windows environments for now. I am really at a loss trying to find developer guides for the purpose. I'm hoping to find suitable introductory material, but if I need to jump directly into documentation, I'll do that. I just don't know where to look, and I'm hoping that someone out there can provide some guidance. My ignorance of the subject matter is profound, but this is as good a time as any to educate myself, I guess.

Thanks!


Luke Davis
 

Hello and welcome!

The first thing you might consider is also joining the nvda-devel mailing list: https://groups.io/g/nvda-devel
There are more sighted programmers there than you are likely to find here, I think.

I'm not sighted, and I don't program for Windows except in Python; and on Linux/Unix I don't program for the GUI. So I probably can't help you much.

But one of the things you can probably do is stick to GUI toolkits known to be accessible, like wxWidgets (WX), and avoid those known to be less accessible, like Qt.
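
For example, here's a minimal wxPython sketch of a labeled text field (assuming wxPython is installed; the window and label text are made up for illustration -- a visible StaticText placed before a field is what screen readers generally pick up as its label):

    import wx

    app = wx.App()
    frame = wx.Frame(None, title="Accessible demo")
    panel = wx.Panel(frame)

    # A visible label placed before the control; NVDA and friends generally
    # associate it with the field that follows.
    label = wx.StaticText(panel, label="Your name:")
    name_field = wx.TextCtrl(panel)

    sizer = wx.BoxSizer(wx.HORIZONTAL)
    sizer.Add(label, 0, wx.ALL | wx.ALIGN_CENTER_VERTICAL, 5)
    sizer.Add(name_field, 1, wx.ALL, 5)
    panel.SetSizer(sizer)

    frame.Show()
    app.MainLoop()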

Other than that, ask around with specific questions--is "X" a platform/framework/library that results in accessible apps?

Pay attention to any accessibility interfaces provided by whatever frameworks/libraries you do end up using.

And of course, find yourself some blind alpha testers.

Sadly, I don't know of any specific resources for this, but again it's not an area I have much researched. Simply put, if anything I was doing wasn't screen reader accessible, I wouldn't be doing it.

Luke



tim
 

You might want to join program-l@...

There are a few C++ and Microsoft VIPs there.



Rich DeSteno
 

Create programs that generate text on the screen, and not graphics, and you will have accessible programs for blind computer users.  This is especially true when the program operates in the Windows console and not in its GUI.  I have created and distributed many such games and they have been used worldwide by screen-reader users.  You can create such programs in virtually any programming language, including C#, C++, C, Python, and so on.  You can even enhance such programs by having them play wav sound files at various points, such as in games.
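
As a rough illustration of that approach (the numbers, messages, and wav file name here are all invented; winsound ships with Python on Windows):

    import winsound

    # A tiny text-only guessing game: everything is plain console text,
    # which screen readers handle well.
    secret = 7
    while True:
        guess = input("Guess a number from 1 to 10 (q to quit): ").strip()
        if guess.lower() == "q":
            break
        if guess.isdigit() and int(guess) == secret:
            print("Correct!")
            # Play a celebratory sound asynchronously (win.wav is hypothetical).
            winsound.PlaySound("win.wav", winsound.SND_FILENAME | winsound.SND_ASYNC)
            break
        print("Nope, try again.")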


Rich De Steno


udit pandey
 

Hi Arnold sir,
I can help you with testing: you can send me your programs, and I will tell you if they are fine after running them on my system.
Sir, I have a question: is it necessary to take maths and science to become a programmer?
Please help me, sir.




Brian Vogel

On Sun, Jun 27, 2021 at 12:33 PM, udit pandey wrote:
Sir, I have a question: is it necessary to take maths and science to become a programmer? Please help me, sir.
-
This is not an appropriate line of discussion for the NVDA group. Also, all you have to do is check with academic institutions that offer computer science degrees, whether associate, bachelor's, or higher, to see that math and science are core parts of the curriculum.

Please limit this sort of discussion to the Chat Subgroup, where you already have a topic going.
--

Brian - Windows 10, 64-Bit, Version 21H1, Build 19043  

I do not understand why some seek to separate a person from their actions.  The self is composed of an individual’s thoughts, actions, and expression, which are contained in and actuated by the body.  What you do and say is the clearest indicator of who you are.

      ~ Brian Vogel

 


arnoldsummerspbem@...
 

Thanks everyone for your replies. I'll take all of this information into account, and I'll look into the other mailing lists mentioned.



 
Joseph Lee
Hi all,

As Luke and others pointed out, there are lists dedicated to programming, and in the case of NVDA specifically, there is the NVDA Development list. I think both lists are good, but they won't by themselves solve the issue the original poster is looking into, and you can indeed create accessible GUI-based applications (although it may sound counterintuitive, NVDA itself is an accessible GUI-based application if you think about it). What I'm going to write comes from my experience as a programmer who spent years working on screen readers and has been advocating for accessibility and usability (like Luke, I'm blind, although I was a low vision user until my early teens, which was the early 2000s):

As you may know, the first task of programming is looking for a problem to solve. The fact that you wish to write an app that is accessible is very notable, in that you may have found some issues you wish to solve by writing apps while looking at app design at the same time. I'll assume you have done that first task, so let's move on to the design and accessibility aspects.

Accessibility is about designing products so they are approachable by different audiences (the task of actually using such products with help from assistive tools falls under "usability"). The question to ask when designing products with accessibility in mind is: "what limitations and workarounds do specific audiences need, and how can the product help bridge the gap that might be present for those audiences?" For people with disabilities, the question turns on the limitations of specific disabilities and the tools that can expose your product's functionality to specific audiences. For blind people, the obvious choice is tools that help folks "see" screen content, i.e., screen readers, magnifiers, color contrast, and so on; for deaf communities, using text to convey sounds, sign language output, and so forth. Then you would look for a way to make programs expose the needed information so audiences (users) can use your product effectively, and one common scenario is using accessibility APIs to communicate information to users of assistive technologies.

In GUI programming (something that's possible for blind people to do, although with assistance if required), one would design the data representation style (specific GUI controls for things such as text, forms, and many others). Although things may look colorful and intuitive to the majority (the term "majority" depends on language, country, and culture), without effort from humans and tools (along with the right mindset), the product would not be discoverable (word of mouth, reviews, etc.), approachable (promotion, demos, etc.), and accessible. These three things must work together where accessibility is concerned, because people with disabilities are some of the most neglected communities when it comes to access to information (what I would term an "information blackout"), although that is changing.

So to enhance how the product is seen by people with disabilities and to make it accessible (and usable), APIs such as Microsoft Active Accessibility (MSAA), UI Automation (UIA), and IAccessible2 were created to help programmers design products with accessibility and usability in mind. These APIs involve at least three parts (a small client-side sketch follows the list):

  1. Client/consumer: an assistive technology such as a screen reader (including NVDA) is an accessibility client. The job of a client is to ask accessibility APIs for information about the control a user is working with, and to perform specific actions required to help people use applications, such as reading state changes.
  2. Server/producer: the application in question is a server because it serves clients by exposing crucial information for use by different assistive technologies. For screen readers, this means using text labels for graphical buttons and using facilities such as accessibility events to communicate activities such as screen content changes. How such info is communicated to users is the job of the client (the assistive technology), and it is up to users what to do with the information coming from the app.
  3. Accessibility bridge: APIs such as MSAA and UIA serve as a bridge between servers (apps) and clients (assistive technologies). The job of accessibility API bridges is to act as a "middle man" between users and apps by exposing server-side information (whatever the app says) in a way clients can understand, process, and present to users. At the same time, bridges accept interaction tasks (such as keyboard input) from users, communicate these to applications, and relay what the app says back.
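
To make the client/server/bridge split concrete, here is a rough client-side sketch using the third-party pywinauto package's UIA backend (the window title is an assumption; this is just one way to peek at what an app exposes):

    from pywinauto import Desktop

    # Attach to a running Notepad window through UI Automation (the "bridge").
    window = Desktop(backend="uia").window(title_re=".*Notepad.*")

    # Dump the names, control types, and automation IDs the application
    # (the "server") exposes -- roughly the same information a screen
    # reader (the "client") consumes when reporting the UI to the user.
    window.print_control_identifiers()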

A basic grasp of accessibility concepts is one of the steps involved in improving app accessibility (the first obvious step is understanding the culture the target audience comes from, a task you have accomplished well based on the original post). The next task is actually using assistive technologies and apps to better understand what folks are talking about. After that, it comes down to designing programs in a way that is accessible to diverse audiences, such as adding labels for GUI controls and using accessibility APIs to expose needed information (if using GUI toolkits, I recommend ones known to have high accessibility marks, such as wxWidgets and more recent versions of Qt and WinUI/XAML). And don't forget to test your ideas with target audiences (testing, gathering feedback, etc.) early, because it is more costly to "improve" accessibility later.
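
As one concrete instance of labeling, a control with no visible text can still be given a name for assistive technologies to announce. A sketch under the assumption that you are using wxPython on Windows (verify the result with NVDA, since how the name is exposed varies by control and toolkit version):

    import wx

    app = wx.App()
    frame = wx.Frame(None, title="Labeling demo")
    panel = wx.Panel(frame)

    # An icon-only button: with no text, a screen reader may announce just
    # "button", so give it a name and a tooltip.
    search_button = wx.BitmapButton(
        panel, bitmap=wx.ArtProvider.GetBitmap(wx.ART_FIND))
    search_button.SetName("Search")
    search_button.SetToolTip("Search the document")

    frame.Show()
    app.MainLoop()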

Before I close, one thing you may wish to ponder: if you think carefully about it, NVDA and its screen reader friends are sophisticated data processors. Their job is to gather the information blind people need with help from facilities provided by the operating system, accessibility APIs, apps, and standards; process the gathered information in a way suitable for presentation through multiple channels (speech, braille, sound, etc.); and present that information to users. That's the core of screen readers, and when folks talk about screen reader development, we are talking about refining these elements (supporting newer accessibility standards, dealing with apps with no control labels, support for text-to-speech engines and braille displays, keeping an eye on operating system changes, etc.). Of course, folks can customize screen readers to their liking (settings, code, add-ons, etc.). At the same time, app accessibility and usability are the responsibility of app developers, and both are made better when developers collaborate with users (this is why I always ask users to send feedback to developers to point out possible accessibility improvements).

Hope this helps a lot.

Cheers,

Joseph


Arnold Summers <arnoldsummerspbem@...>
 

Joseph,

Your message was helpful, particularly the mentions of APIs and GUI toolkits. It gives me something concrete to look into in the way of actual code. You gave me a lot to think about. Thank you!



Sam Bushman
 

Hey Arnold,

 

I didn’t write before because others on this list are much more qualified than I am.

However, since others didn't focus on the things I would mention, I decided my input might help as well.

 

When you write programs for the blind, the following is very helpful:

 

If you use standard Windows controls instead of custom controls, screen readers have a much better chance of working well.

If you provide several ways to accomplish things in the app, it's much more helpful. That means keyboard access to everything, not just mouse access.

Use standard tooltips for help.

If you make sure the Tab key works well everywhere, it's much more helpful. Focusing on tab order here makes a huge difference.

Make sure to use text labels when graphics are used; that means edit boxes with labels, and other controls as well.

Making sure screens have text, not just graphics, is huge.

The more standard your Windows screens are, the easier they will be for us to use.
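
To put a couple of those tips into code, here is a wxPython sketch of keyboard access (a menu accelerator) and explicit tab order (the names and the Ctrl+R shortcut are invented for the example):

    import wx

    app = wx.App()
    frame = wx.Frame(None, title="Keyboard access demo")
    panel = wx.Panel(frame)

    # A menu item with an accelerator: Ctrl+R works without the mouse.
    menubar = wx.MenuBar()
    actions = wx.Menu()
    refresh_item = actions.Append(wx.ID_ANY, "&Refresh\tCtrl+R")
    menubar.Append(actions, "&Actions")
    frame.SetMenuBar(menubar)
    frame.Bind(wx.EVT_MENU, lambda event: print("Refreshed"), refresh_item)

    # Two labeled fields; MoveAfterInTabOrder makes the tab order explicit
    # instead of depending on creation order.
    wx.StaticText(panel, label="First:", pos=(10, 10))
    first = wx.TextCtrl(panel, pos=(80, 10))
    wx.StaticText(panel, label="Second:", pos=(10, 40))
    second = wx.TextCtrl(panel, pos=(80, 40))
    second.MoveAfterInTabOrder(first)

    frame.Show()
    app.MainLoop()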

 

Some programmers actually have a setting in their software that makes much better access possible.

A great example of this is the Jarte application – it’s a simple word processor.

 

I could say much more but these ideas should get you started.

 

Thanks for thinking of us.

 

The point was already made, but I agree: we can also test and help if you like.

 

Sam

 



Arnold Summers <arnoldsummerspbem@...>
 

Hi Sam,

Sorry I didn't get back to you sooner. These are all great tips, and something I will definitely keep in mind. And you said you could say much more: please do! The more information I have to work with, the better!

Arnold
 
