I intentionally stayed out of this thread, as I find the whole discussion interesting. While I do have reservations about the article from The Verge, I think the article itself is accurate, at least from the perspective of folks who are new to screen readers. As discussed throughout this thread, the history of screen readers (or, for that matter, the history of assistive technologies) is far more complex than what the article conveyed; a full account of the different software and hardware solutions could fill a small encyclopedia.
I will talk about my perspective on the purposes of screen readers in a later post, but I'm joining this conversation to pose a question for us to consider, specifically in light of different input strategies:
If you read the What's New document for NVDA 2022.2, one of the bug fixes reads:
In Windows 11, it is again possible to navigate and interact with user interface elements, such as Taskbar and Task View using mouse and touch interaction.
Just think about that for a second.