Lino is correct: I am the one who introduced the search suggestions sound facility, although it was Jamie Teh (now at Mozilla) who produced these sounds. The search suggestions facility was originally part of the Windows App Essentials add-on before it became part of NVDA in 2017, and I consider this feature one of my accomplishments. Lino is also correct that the idea for suggestion sounds came from Narrator, and unlike NVDA, I don't think suggestion sounds can be turned off in Narrator (I last checked a few minutes ago).
To explain the choices I made back in 2017, which I stand by five years later, I feel a fairly deep overview of the suggestion sound feature's internals is in order. I hope this sheds light on what's going on so you can better understand why certain decisions were made:
As some of you may know (or I should say, everyone reading this message will know by now), I am one of the first Windows Insiders, joining the program the day after Microsoft announced Windows 10 (Windows 10 was announced on September 30, 2014; I became a Windows Insider the following day and installed the first build offered, 9841). As such, one thing I constantly did (and still do to this day) is keep track of Narrator features, with the conviction that Narrator is the gold standard when it comes to supporting modern Windows features; this is because Narrator attempts to present exactly what UIA (UI Automation) tells it, unlike NVDA (and other screen readers), which come with various workarounds.
One thing that fascinated me from those days, confirmed by reports from Windows Insiders, is how Narrator announces suggestions. This became more fascinating when I observed that Narrator plays a sound when suggestions appear and disappear. Although it wasn't until late 2015 that I introduced the Windows App Essentials add-on, there were already suggestions to let NVDA announce search suggestions. So I decided to investigate what was going on and learned a few things while designing the feature Gene and others are talking about.
The origin of what we would call "search suggestions" goes back to Windows Vista (I seem to recall some of you listening to a set of tutorials I made on Windows Vista back when I was a high school student). As folks pointed out, when you search for things in the Start menu, it returns suggestions. This was enhanced in subsequent Windows releases, with user interface changes in Windows 8.x and the appearance of the Start menu as we know it in Windows 10. While the underlying mechanisms differ, the different iterations of the Start menu from Windows Vista onwards carry a search box and a results list as standard.
It wasn't until Windows 8.x that the Start menu (back then, the Start screen) gained the necessary foundations to make suggestion announcements more accessible. To accomplish this, two UIA events and the concept of a controller-for element array are used. A controller-for element is a UI element that depends on actions from another element. For example, when you search for things in the Start menu, the resulting suggestions list is the controller-for element of the search field; thus, when the search terms change, so does the suggestions list. Internally, UIA returns an array of controller-for elements, and in our case this array (list) contains the suggestions list itself. The appearance of a controller-for element is signaled by a UIA controller-for event, which is the actual mechanism that allows NVDA to play the suggestions sound when configured to do so.
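The controller-for relationship described above can be modeled with a short sketch. This is purely illustrative: SearchField and SuggestionsList are hypothetical stand-ins, not actual UIA or NVDA types, and real code would query UIA's ControllerFor property through the UIA client API rather than a plain Python attribute.

```python
# Minimal model of the controller-for relationship: typing into the
# search field populates the suggestions list, and the controller-for
# array becomes non-empty to signal that suggestions are showing.

class SuggestionsList:
    """Stands in for the UIA element holding the suggestions."""
    def __init__(self):
        self.items = []

class SearchField:
    """Stands in for the search edit field."""
    def __init__(self):
        self.suggestions = SuggestionsList()
        # UIA exposes this as an array of elements; empty means no
        # suggestions are currently shown.
        self.controller_for = []

    def type_text(self, text):
        # Typing updates the suggestions list; the controller-for array
        # now points at that list, which a screen reader can detect.
        self.suggestions.items = [f"{text} result {i}" for i in range(3)]
        self.controller_for = [self.suggestions]

field = SearchField()
assert field.controller_for == []   # no suggestions yet
field.type_text("notepad")
assert field.controller_for == [field.suggestions]  # suggestions appeared
```

A screen reader watching this array (via the controller-for event) only needs to know whether it went from empty to non-empty, or back, to decide which sound to play.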
The UIA controller-for event is one of two events responsible for announcing suggestions. The other is the layout invalidated event, which is used when the contents of a list change. This applies to search results because the top suggestions can change whenever you enter different terms into the search field. This event was introduced in Windows 8 precisely to let Narrator and other screen readers announce search suggestions (termed "auto-suggest accessibility" by Microsoft) and is properly utilized from Windows 10 onwards.
When you search for things from the Start menu, both events come into play: the controller-for event tells NVDA that suggestions have appeared (so it can play the suggestion sound), and the layout invalidated event lets NVDA announce the top suggestion as the results change.
In other places, NVDA does not announce top suggestions. For these, the UIA layout invalidated event is employed so NVDA can announce how many suggestions are shown. For example, in the Settings app (Windows+I), NVDA will announce the suggestion count when you search settings. The controller-for event is still employed, which is why you will hear suggestion sounds.
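The count announcement driven by layout invalidated can be sketched as a tiny handler. This is an illustration under assumed names (on_layout_invalidated is hypothetical); the real NVDA handler speaks the message through its own speech and braille layers rather than returning a string.

```python
def on_layout_invalidated(suggestion_items):
    """Called when the suggestions list's contents change (the UIA
    layout invalidated event). Returns the message a screen reader
    might announce: the number of suggestions now shown."""
    count = len(suggestion_items)
    if count == 1:
        return "1 suggestion"
    return f"{count} suggestions"

# As you refine the search terms, each layout change yields a fresh count.
print(on_layout_invalidated(["Display settings", "Night light settings"]))
print(on_layout_invalidated(["Night light settings"]))
```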
After learning what was going on, I implemented an initial version of suggestions announcement in the Windows App Essentials add-on in 2016. Back then I utilized the UIA controller-for event and used a different sound. It wasn't until 2017 that the sounds you hear today were made by Jamie, and this, together with the suggestions announcement feature, became part of NVDA. The NVDA side of the story is as follows:
NVDA knows how to detect search fields and suggestions lists from modern apps and the Start menu. NVDA's definition of a search field is an edit field with suggestions. Two internal events are used by search fields: suggestions opened and suggestions closed. Apart from playing different sounds and presenting different messages on a braille display, they are identical. The suggestions opened event is fired when the search field notices the controller-for elements array is not empty, and the suggestions closed event is fired otherwise. Because I knew that some people would prefer not to hear suggestion sounds, a setting was added in object presentation settings (see below) to configure the behavior folks are talking about.
In answer to the question of why object presentation settings: this is due to the nature of the setting. Object presentation settings deal with how NVDA announces things such as the focused control, the navigator object/navigation, progress bars, shortcut keys exposed through accessibility APIs, and notifications, among other things. Because search suggestions communicate changes to controls, I figured putting a toggle inside object presentation made more sense. Think about it carefully: do you interpret search suggestion sounds as mere earcons, or do they communicate something more substantial? I take the latter position, because suggestion sounds indicate changes to on-screen controls, made richer thanks to information coming from the accessibility API (UIA).
In summary, what may appear annoying is not really so. Search suggestion sounds are one of those features that demonstrate the need for multimodality in software design. Some would argue that NVDA should be more flexible to cover other modality issues, but we need to keep the context in mind: NVDA is a screen reader, a sophisticated program that processes screen information and presents limited information in a way that communicates what is going on, or rather, gives hints to users as to what is happening on screen. Because NVDA must operate within the constraints of an information blackout, it needs to communicate limited screen information in a way that lets users glimpse the most important information of the moment. For search suggestions, an effective way is to play sounds and tell users how many suggestions are available, or, where supported, what the top suggestion on screen is.
Hope this helps.