Re: Explainer: accessible drag and drop notifications from the perspective of accessibility APIs
Ah yes, I remember... Essentially, information blackout, or rather, intentional blackout and intentional foci (plural of "focus"). In some ways, I'm rediscovering the "academic" side of this thread this semester (critical theory; won't go into details here).
For folks new to the forum: the last few messages between Brian and me could best be described as defining the terms "information blackout" and "information tuning" - the idea that, intentionally or not, computer users focus on one specific piece of information. The immediate context is how screen reader users (or assistive technology users generally) are sometimes forced to focus on one part of the whole information stream (the screen content) and come to believe that's all there is. I use the term "blackout" deliberately: a more important piece of information may be present but never interacted with, or intentionally (or unintentionally) hidden or obscured. You could call it "filtering," but filtering implies you are aware of the other pieces of screen content and are choosing to dive into specifics - what I'm describing is the tendency of screen reader users to treat whatever they are focused on as so important that they fail to notice a more important message appearing elsewhere. For sighted folks (the mainstream audience, something I think we must admit even if some wish to deny it), that information is apparent because they can scan the screen content; not so for blind people, who must rely on alternatives.
In application to the current context (drag and drop), there are places where accessible (keyboard-based and other) drag and drop is possible with constraints, as it really comes down to the control in question, software developers, and the mindset of screen reader vendors to "restore or reformat information." And as Brian stated, there are simply places that will remain in "perpetual information blackout" - that is, inaccessible and unusable by specific audiences - or a "starry night" where there is only faint hope or a wish, unless the larger public (app developers, to start with) realize the effect of such a blackout and offer alternatives. After all, the term "assistive technology" implies using alternatives to access something while taking constraints into account.
To sum up, and to follow up on something I implied in my first post in this thread: part of the reason for posting an explainer on accessible drag and drop has to do with the Windows App Essentials add-on. Back when Windows 10 first came out (2015), an NVDA user asked if it was possible to announce the text Narrator speaks when rearranging Start menu tiles. I didn't have an answer back then, but after looking up UIA events, I learned that Windows raises events when things are dragged on screen. I added detection routines for drag events in April 2018 as part of Windows App Essentials, but it was not until late 2019 that I added drop target event support to let NVDA announce the dragged item and its new position, albeit by faking a focus event. Then in 2021 I posted a pull request to the NVDA project about drag and drop (in short, bringing the add-on's contents into NVDA Core itself), but a discussion arose over drop target announcements and whether a focus event was a suitable experience for users. Then graduate school happened, and I gave up on the initial pull request until about a month ago, when I heard reports about the ability to rearrange Taskbar icons in Windows 11 and how inaccessible this was for keyboard users. Two weeks ago, Quinn (the amazing producer of NVDA YouTube videos) asked on the Windows App Essentials GitHub repo if I could work on drag and drop announcements for NVDA, after informing me that they work once the add-on is installed. After some digging into Microsoft's UI Automation documentation, I realized that I had been using the wrong solution for the last three years, and after some experimentation, I finally found a solution that is more stable and resolves the question posed by that anonymous NVDA user all those years ago.
This happened last week, hence my statement about a "celebratory week," and I'm happy to report that this stable solution will be part of Windows App Essentials 22.10 (late September to early October), with parts incorporated into NVDA as of the 2022.4 alpha (I'm working on the remaining pieces and posted pull requests last week). Of course, if you are using Windows App Essentials development builds, the solution is already there: NVDA will indeed announce drag and drop effects (progress), but if and only if the drag is done from UIA controls and the control supports it (hence the constraint).
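For readers curious what the UIA side of this looks like: the drag and drop-target events involved are defined in UIAutomationClient.h (UIA_Drag_DragStartEventId and friends). The sketch below is not NVDA's or the add-on's actual code - it is a minimal, hypothetical Python dispatcher showing how a screen reader might map those event IDs to spoken announcements. The event IDs are the real UIA constants; the function name and the message wording are my own illustration.

```python
# UIA drag and drop-target event IDs, as defined in UIAutomationClient.h.
UIA_Drag_DragStartEventId = 20026
UIA_Drag_DragCancelEventId = 20027
UIA_Drag_DragCompleteEventId = 20028
UIA_DropTarget_DragEnterEventId = 20029
UIA_DropTarget_DragLeaveEventId = 20030
UIA_DropTarget_DroppedEventId = 20031


def announce_drag_event(event_id: int, element_name: str) -> str:
    """Map a UIA drag/drop event to a hypothetical spoken announcement.

    In a real screen reader, element_name would come from the UIA element
    that raised the event, and the resulting string would be sent to the
    speech synthesizer rather than returned.
    """
    messages = {
        UIA_Drag_DragStartEventId: f"{element_name} grabbed",
        UIA_Drag_DragCancelEventId: f"{element_name} drag cancelled",
        UIA_Drag_DragCompleteEventId: f"{element_name} dropped",
        UIA_DropTarget_DragEnterEventId: f"drag target: {element_name}",
        UIA_DropTarget_DragLeaveEventId: f"leaving {element_name}",
        UIA_DropTarget_DroppedEventId: f"dropped on {element_name}",
    }
    # Unknown events produce no announcement.
    return messages.get(event_id, "")
```

Crucially, and this is the constraint mentioned above: these events only fire if the control actually implements the UIA Drag and DropTarget patterns. A control that handles dragging internally with no UIA support raises no events at all, which is exactly the "perpetual blackout" scenario.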
Hope this answers lots of questions (and huge credit and thanks to that anonymous NVDA user and to Quinn for making me realize that I and my source code were wrong all these years).