I've touched Window-Eyes more recently than 25 years ago, but not enough to have any vivid memory of the feature Mr. Nutt is alluding to. Mouse tracking in NVDA will read whatever is on the screen under the mouse as it moves. I've taught a lot of people (yes, blind people) to use the mouse quite effectively for a "quick and dirty" review of a completely unfamiliar screen. It's not perfect or 100% systematic, because screen layouts can differ quite a bit, but you can learn a great deal about an entirely unfamiliar screen just by carefully waving the mouse pointer slowly over it. And mouse tracking is on in NVDA by default, unless something's changed very recently. --
Brian - Virginia, USA - Windows 11 Pro, 64-Bit, Version 22H2, Build 22621; Office 2016, Version 16.0.15726.20188, 32-bit
Any clod can have the facts, but having an opinion is an art.
~ Charles McCabe, San Francisco Chronicle
I meant that the Window-Eyes mouse pointer allowed you to review the screen while moving the mouse. The WE cursor allowed you to review the screen without moving the mouse; this was similar to the JAWS invisible cursor.
Gene
On 4/25/2023 3:58 PM, Gene via groups.io wrote:
Window-Eyes had two review modes, the WE cursor and the mouse pointer. But the mouse pointer was, in essence, screen review that moved the mouse as you moved around the screen. It was similar to the JAWS cursor, and Window-Eyes had the ability to move by clip. I don't remember just what that was, and I don't recall whether you might see something in a way you wouldn't in ordinary movement, but my recollection is that it was largely a way to move faster through the screen. The JAWS cursor and the mouse pointer would both see the same or very similar things.
Which one people liked to use was largely a matter of preference. Window-Eyes screen review let you use the numpad and had its own commands, while in JAWS, the JAWS cursor used the same commands as you would use in an editor for movement: control+home, control+end, etc.
Gene
On 4/25/2023 3:51 PM, Luke Davis wrote:
Steve Nutt via groups.io wrote:
Yes, Object Nav doesn’t even cut it for the mouse-like review that Window-Eyes offered.
Steve, do you know of any recorded demos of this? Either audio, or video with audio? I haven't touched WinEyes in about 25 years, and don't remember this feature.
Luke
The JAWS cursor does route the mouse. The command for routing is
route JAWS cursor to PC cursor. You are moving the mouse.
JAWS has another review mode that lets you review the screen without moving the mouse, called the Invisible Cursor. You activate it by pressing the JAWS cursor key twice rapidly.
Gene
On 4/25/2023 2:59 PM, Steve Nutt via groups.io wrote:
Hi,
The problem though is that the JAWS
cursor doesn’t move the mouse. So you still end up having to
route the mouse, one extra keystroke, before you can click.
The invisible cursor can be tethered
to the mouse though I believe, so that’s probably the
closest we’ll get.
All the best
Steve
I can't tell you exactly what Window-Eyes offered compared with other screen readers, but I am not convinced that some, perhaps a good deal, of your opinion is the result of not knowing some of the more obscure features in JAWS. There is a command to move the JAWS cursor in very small increments; I haven't used it for perhaps over a decade, and I learned about it a long time ago. And of course, you can move by line, word, and character using the JAWS cursor.
You may prefer the Window-Eyes mouse being separate and using separate commands. I much prefer learning one set of navigation commands that works for both the PC cursor and the mouse; once I know one set of commands, I know both.
The NVDA cursor may not be able to be moved by pixel, but you can move it by line, word, and character, or to the top and bottom of the screen. If you know the screen review and object review commands in NVDA, you can move the mouse to the position you set with screen review or the object navigator.
My point is not just to discuss Window-Eyes compared with other screen readers. I'm writing because if people incorrectly believe something is limited, they won't look for ways to do things that can be done. I think the matter should be discussed further.
Gene
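[The trick Gene describes, moving the mouse to the review position, amounts to routing the pointer to the centre of the reviewed object's screen rectangle. Below is a minimal sketch of that geometry in Python, assuming rectangles given as (left, top, width, height) in pixels; the NVDA API names mentioned in the comments (api.getNavigatorObject, winUser.setCursorPos, obj.location) are assumptions for illustration, not verified against NVDA's source.]

```python
# Sketch: where "route mouse to review position" moves the pointer.
# Screen rectangles are given as (left, top, width, height) in pixels.

def rect_center(left, top, width, height):
    """Return the (x, y) pixel at the centre of a screen rectangle."""
    return (left + width // 2, top + height // 2)

# Inside a real NVDA global plugin this would look roughly like the
# following (hypothetical usage; names assumed from NVDA's developer API):
#   obj = api.getNavigatorObject()
#   x, y = rect_center(*obj.location)
#   winUser.setCursorPos(x, y)

if __name__ == "__main__":
    # A control at (100, 200) that is 60 pixels wide and 20 tall:
    print(rect_center(100, 200, 60, 20))  # -> (130, 210)
```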
On 1/10/2023 4:37 AM, Steve Nutt wrote:
Hi Joseph,
I have never had sight, but I have always loved Window-Eyes' ability to let you move the mouse pointer on the screen either to the PC cursor, or by lines, words, characters, or even pixels.
No other screen reader, to my knowledge, has ever achieved that.
Not only that, with Window-Eyes the mouse was always “live”, so you didn’t have to go into a special mode to read around it.
Oh yes, there was a lot that Window-Eyes had that no screen reader has touched yet.
All the best
Steve
Hi,
Mouse and touch interaction and navigation are always on our radar; there have been issues on GitHub in the past, and I did an all-nighter back in May to bring improved mouse and touch interaction support to parts of the Windows 11 user interface (it was both a fun and a frustrating experience, to tell you the truth). At the moment our primary input scenario is the keyboard, followed by braille display hardware. But I do recognize the potential of mouse and touch interaction and have been a vocal advocate for improving the user experience for touchscreen users (me included). My enthusiasm for touchscreens became the reason for creating the Enhanced Touch Gestures add-on back in 2013 (and I've been buying touch-capable laptops since), and the potential of mouse interaction was perhaps the biggest reason for maintaining Golden Cursor for some time (I'm no longer maintaining Golden Cursor, by the way). In fact, the upcoming version of the Windows App Essentials add-on will include support for improved mouse and touch interaction scenarios in an upcoming change to Windows 11 Version 22H2.
Another possible reason for my enthusiasm for touchscreens (and mouse interaction to some extent) is that I had low vision until I lost my sight in my teens. Perhaps this experience helped me think in terms of spatial dimensions, drawing me to screen exploration techniques and helping me adjust easily to touch-enabled devices. I joined the NVDA project after hearing a demo of NVDA running on a Samsung tablet with a preview build of Windows 8 installed in 2012, and like many, I was impressed with touch support on iOS and Android.
Cheers,
Joseph
I really doubt it. What you are calling mouse review is the same as screen review, except that you are moving the mouse. Object navigation sometimes shows you information that screen review can't. Years ago, when Spotify wasn't at all accessible, the only way to use it with NVDA was object navigation.
And these days, because of technical changes in Windows, screen review doesn't work in a lot of instances. It wouldn't work in Window-Eyes either. Window-Eyes had an interesting way of moving around; I don't recall its name, but you didn't have to move through every item on screen if you didn't want to. You could skip to what I'll call the main items, though that isn't what it was called.
Gene
On 4/25/2023 2:55 PM, Steve Nutt via groups.io wrote:
Sorry for coming back to this so late; I haven’t looked at the list for a while.
Yes, Object Nav doesn’t even cut it for the mouse-like review that Window-Eyes offered.
Even flat screen review doesn’t work nearly as well, unfortunately.
And no, JAWS isn’t there with proper mouse-like review either. I miss Window-Eyes for its almost literal screen review mode.
All the best
Steve
Hi Steve
I used the GW Micro Vocal-Eyes and Window-Eyes screen readers for many years, and the accessibility provided by the mouse was very helpful to me!
When I started using NVDA I began to discover issues with mouse support, started reporting them, and learned that this was not a high priority.
I still depend on these features every day and continue to report related issues. I recently realized that many of the new issues were coming from Firefox Nightly, and I'm trying to help identify and report those problems before they become issues for the stable Firefox browser.
Fortunately the developers at Mozilla are making this a priority and are working hard to resolve them and collaborate.
Since using the Google TalkBack screen reader on Android and its explore-by-touch feature, I now like to call this explore by mouse.
I really like having the ability to randomly access the content on the screen using the mouse cursor.
I have never had the patience to read through all the content on the screen that I wasn't interested in.
On Tue, Jan 10, 2023, 5:37 AM Steve Nutt <steve@...> wrote:
Hi Gene,
I'm pleased to hear you are open to further discussion on this topic.
I can imagine the day when I'm using a Windows-based device with a touch screen. It would be nice to use it the way I use my Android devices that have touchscreens: on Android, the Google TalkBack screen reader supports explore by touch, which provides random access to the content on the display.
In the meantime, this is the way I have often used Windows-based computers, except that instead of explore by touch it has been explore by mouse, ever since Windows was released, starting with Window-Eyes.
On Android-based devices that don't have a touchscreen, I have used the mouse, and explore by mouse is supported by TalkBack.
I have found that the simultaneous support of the keyboard and mouse addresses my needs.
I have never been forced to use one or the other, and I hope that is never the case.
I have never used JAWS, so I cannot say whether it would address my needs...
Rus
On Tue, Jan 10, 2023 at 10:47 AM Gene <gsasner@...> wrote:
Agreed with you. It's a pity that someone decided to kill the venerable Window-Eyes.
From: nvda@nvda.groups.io <nvda@nvda.groups.io> On Behalf Of Russell James via groups.io
Sent: Tuesday, 10 January 2023 14:39
To: nvda@nvda.groups.io
Subject: Re: [nvda] Does NVDA read a webpage or a word document?
Hi Steve
Thanks for your post!
I used the GW Micro Vocal-Eyes and Window-Eyes screen readers for many years, and the accessibility provided by the mouse was very helpful to me!
When I started using NVDA, I began to discover issues with mouse support and started reporting them, and I learned that this was not a high priority.
I still depend on these features every day and continue to report related issues. I recently realized that many of the new issues were coming from Firefox Nightly, and I'm trying to help identify and report those problems before they reach the stable Firefox browser. Fortunately, the developers at Mozilla are making this a priority and are working hard to resolve them and collaborate.
Since using the Google TalkBack screen reader on Android and its explore-by-touch feature, I now like to call this "explore by mouse." I really like having the ability to randomly access the content on the screen using the mouse cursor; I have never had the patience to read through all the content on a screen that I wasn't interested in.
Russ
On Tue, Jan 10, 2023, 5:37 AM Steve Nutt <steve@...> wrote:
Hi Joseph,
I have never had sight, but I have always loved Window-Eyes' ability to let you move the mouse pointer on the screen either to the PC cursor, or by lines, words, chars, or even pixels. No other screen reader to my knowledge has ever achieved that. Not only that, with Window-Eyes the mouse was always “live,” so you didn't have to go into a special mode to read around it.
Oh yes, there was a lot that Window-Eyes had that no screen reader has touched yet.
All the best
Steve
From: nvda@nvda.groups.io <nvda@nvda.groups.io> On Behalf Of Joseph Lee Sent: 22 December 2022 16:40 To: nvda@nvda.groups.io Subject: Re: [nvda] Does NVDA read a webpage or a word document ?
Gene, I gave a direct link regarding Word's Immersive Reader Mode (there it is again - the brief video may prove helpful, but maybe not). I covered the territory for MS-Word and web browsers (or some of them, anyway). --
Brian - Virginia, USA - Windows 10 Pro, 64-Bit, Version 22H2, Build 19045; Office 2016, Version 16.0.15726.20188, 32-bit
"Be Yourself" is the worst advice you can give to some people.
~ Tom Masson
Aren't there utilities that allow you to select text and have it read, intended for use with a mouse? That might be as good a solution or better in some cases. I believe the person wants to have documents read in Word, or parts of them.
Gene
On 12/22/2022 5:14 PM, Brian Vogel wrote:
On Thu, Dec 22, 2022 at 02:31 PM, Gene wrote:
Are you talking about the read aloud mode or some other aspect of use?
Yes. What seemed to me to be the core element of the original post was this: "But I can not figure out how to make this program read a webpage or a word docuemnt. (sic)" To be perfectly honest, given the whole context, and having learned afterward that the original poster was sighted, I do not believe a screen reader is the best way to get the result he appears to be looking for. Read-aloud modes seem to be a far better tool for the desired task/outcome. --
Brian - Virginia, USA - Windows 10 Pro, 64-Bit, Version 22H2, Build 19045; Office 2016, Version 16.0.15726.20188, 32-bit
As a field observation, a number of kiosks have an on-screen accessibility button. The problem is that even if it triggers screen reader functions, the screen reader isn't running when you need to find that button. I'll have to activate one at some point to see what happens. --
Brian - Virginia, USA - Windows 10 Pro, 64-Bit, Version 22H2, Build 19045; Office 2016, Version 16.0.15726.20188, 32-bit
Are you talking about the read aloud mode or some other aspect of use?
Gene
On 12/22/2022 11:02 AM, Brian Vogel wrote:
Chris Smart
I'm pleased to hear that you can use NVDA and your Apple products. As others have said, NVDA is a Windows screen reader; VoiceOver is the name of the screen reader built into iOS and macOS. Some users may not use these things the way that you do, and not all text is large blocks. :-)
"If you ever find yourself at a Windows-based touch screen kiosk without a keyboard, I hope it will be accessible."
Yes, I hope so too. Apparently some can now run JAWS, either for Windows or Android.
|
|
It seems to me that, based on the original desired behaviors, if one is using MS-Word 2019 or newer, then using Word's Immersive Reader Mode is far more likely to give the desired end result. Most Chromium-based web browsers these days also feature a reader mode that's more likely to give what's desired for a sighted user, with minimal "muss and fuss," than a screen reader is likely to provide. See: Use Immersive Reader in Microsoft Edge - Microsoft Support --
Brian - Virginia, USA - Windows 10 Pro, 64-Bit, Version 22H2, Build 19045; Office 2016, Version 16.0.15726.20188, 32-bit