Re: NVDA Navigation Using Touch Gestures and Setting Up Custom Gestures



In a way, it might be possible to do this (I cannot guarantee that Enhanced Touch Gestures will include this feature in the end, as the future of that add-on is a bit uncertain at the moment). I say “might” because:

Touch navigation is intimately tied to object navigation, partly because in mobile operating systems such as iOS and Android, moving between controls with touch gestures visits all sorts of controls. Of course, some areas can't really be navigated with flicks alone; you do need to move your finger to iOS's status bar to interact with the items up there to some degree.

On Windows, touch gestures didn't really take off until Windows 8. Despite attempts to make the touch interface mimic that of mobile operating systems, Microsoft realized that emphasizing the touch interface alone didn't work. In the end, the biggest lesson Microsoft learned is that many people prefer keyboard and mouse for manipulating things on screen; the clearest evidence is that Windows 10 Version 20H2 (October 2020 Update) hides tablet mode if a system doesn't support touch or include touch-capable hardware.

The first Windows screen reader to take advantage of touch gestures was Narrator; NVDA followed soon after, borrowing heavily from VoiceOver and the concepts used there. In fact, touch support was one of the things that attracted me to the NVDA community. Despite the introduction of touch gestures and efforts to improve them (which is really what the Enhanced Touch Gestures add-on is all about), NVDA users and developers knew that the keyboard was the primary input device. Because NVDA's touch support borrows heavily from VoiceOver, touch was designed mostly for reviewing screen content, not as a primary interaction tool, and that is still the case in 2020. Also, you can separate the review cursor from caret tracking; in effect, you can review up to two controls at once (the system focus and another control through the navigator object). Since touch interaction depends on where the navigator object is located, you can't easily generalize touch commands into Windows commands, especially when you consider that you might be editing a document while exploring Taskbar icons via touch. On smartphones, what you see is all you get to explore; on Windows and other desktop operating systems, you see many things at once.
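As a toy illustration of that "two controls at once" point (this is not NVDA's actual API; every name below is made up for the sketch), the system focus and the navigator object can be modeled as two independent positions over the same set of on-screen controls, with touch flicks moving only the navigator:

```python
# Toy model of NVDA's split between system focus and the navigator
# object. All class and method names are illustrative, not NVDA's API.

class Control:
    def __init__(self, name):
        self.name = name

class Screen:
    """A desktop shows many controls at once; a touch flick moves the
    navigator object while keyboard focus can stay somewhere else."""
    def __init__(self, controls):
        self.controls = controls
        self.focus = 0      # index of the control receiving keyboard input
        self.navigator = 0  # index of the control being reviewed by touch

    def flick_right(self):
        # Touch review moves the navigator object only, never the focus.
        if self.navigator < len(self.controls) - 1:
            self.navigator += 1
        return self.controls[self.navigator].name

screen = Screen([Control("document"), Control("taskbar"), Control("clock")])
screen.flick_right()  # review moves to the taskbar...
screen.flick_right()  # ...then to the clock...
# ...while keyboard focus never left the document:
print(screen.controls[screen.focus].name)      # document
print(screen.controls[screen.navigator].name)  # clock
```

This is why a flick can't simply be translated into a focus-movement keystroke: the two cursors are deliberately decoupled.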

Regarding my statement at the beginning (the uncertainty of Enhanced Touch Gestures' future), I can't comment on specifics at this time, but suffice it to say that, around this time next year, I might be asking around for a new maintainer for that add-on. I can maintain Enhanced Touch Gestures and do plan to use touch devices for a while longer, but life goes on, so I need to move on.





From: <> On Behalf Of Khalid Anwar
Sent: Thursday, November 26, 2020 10:33 PM
Subject: Re: [nvda] NVDA Navigation Using Touch Gestures and Setting Up Custom Gestures


I think, and I don't know how practical this would be in real terms, that a text selection mode could be added so that text can be highlighted by character, word, line, et cetera using the standard flick left and flick right gestures, with touch gestures then mapped onto the existing Windows keyboard shortcuts for copy and paste (Control+C and Control+V).
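A minimal sketch of that selection-mode idea, assuming a simple gesture-to-keystroke lookup table (the gesture and key names here are illustrative and are not NVDA's real identifiers):

```python
# Sketch of a "text selection mode": while the mode is active, flicks
# map onto the standard Windows selection/clipboard keystrokes.
# Gesture names and keystroke strings are made up for illustration.

SELECTION_MODE_MAP = {
    "flickLeft":  "shift+leftArrow",   # extend selection back one unit
    "flickRight": "shift+rightArrow",  # extend selection forward one unit
    "2finger_doubleTap": "control+c",  # copy selection
    "2finger_tripleTap": "control+v",  # paste
}

def resolve(gesture, selection_mode=True):
    """Return the keystroke a touch gesture should emulate, if any."""
    if selection_mode:
        return SELECTION_MODE_MAP.get(gesture)
    return None  # outside selection mode, flicks keep their usual meaning

print(resolve("flickRight"))  # shift+rightArrow
```

The mode flag matters: the same flick must keep its ordinary review meaning when selection mode is off, which is exactly why such a mapping would need an explicit toggle.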
I think the line between smartphones, tablets, laptops, notebooks and PCs is blurring more every year, so most programmes will be able to offer some form of touch support in the near future, though how well they do it is another matter entirely.
I also wonder how many of the screen reader gestures we use have been copyrighted or trademarked by iOS or Android. How long will it be before they decide that a particular gesture for a particular function can only be used on their devices, and that anyone else who uses it should be punished?
