Jacob Kruger wrote:
> As in, the web page designers are not what I would refer to as professional, or efficient/effective, but it's a bit strange that NVDA is then ignoring the spaces.

Quite right. I think this example may get closer to what's happening. If you paste this into a Chromium browser's address bar as one line:
data:text/html,<p>This<!-- --> <!-- -->is<!-- --> a <!-- --> <!--
Then press Enter, and you will see that only the first two words are joined with no space.
Those are the only two that are formatted exactly like the original site.
The rest all either have an extra space in between, or have no second comment.
In HTML, runs of whitespace are collapsed to a single space when the page is rendered.
Comments are silently ignored as well, because they produce no displayable content.
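For reference, both rules can be imitated with a couple of regex passes. This is only a toy sketch to show the expected output, nothing like how a real HTML parser or browser actually does it:

```python
import re

def render_text(html):
    """Toy approximation of how the markup above should read aloud."""
    # Comments are silently ignored (this naive pattern assumes
    # well-formed, closed comments).
    no_comments = re.sub(r"<!--.*?-->", "", html, flags=re.S)
    # Drop tags for this toy; real parsers build a DOM instead.
    text = re.sub(r"<[^>]+>", "", no_comments)
    # Runs of whitespace collapse to a single space.
    return re.sub(r"\s+", " ", text)

print(render_text("<p>Hello <!-- x -->  world</p>"))  # "Hello world"
print(render_text("<p>A<!-- -->B</p>"))               # "AB"
```

Note that a comment with no surrounding whitespace contributes nothing at all, which is why "A" and "B" join, while any real space around it should survive as exactly one space.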
I suspect what's happening here is that something sees the first comment and starts a new processing run.
It sees the space.
It keeps looking, because so far it has found one definitely-ignored thing and one optionally-ignorable thing.
Then it finds another comment.
Now what's a poor algo to do? It has found a second definitely-ignored thing, with only an ignorable thing cached.
So it drops them both and starts over.
Whereas when it gets to a space followed by another space, it knows at least one of them has to be presented, so it presents it before starting the next cycle.
Same when it reaches the word that isn't a comment.
That's a humanification of what's probably happening, and I suspect it's going to take the guys who work with the C code to deal with this.
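To make the hypothesis concrete, here is a toy Python model of that state machine. It is entirely speculative, the node stream and function name are invented for illustration, and it bears no relation to the actual NVDA or Chromium source. It does, however, reproduce the observed output for the markup in the data: URL above:

```python
def extract_text(nodes):
    """Toy model of the suspected bug: a cached space is dropped
    when a second comment arrives before anything is presented.

    nodes is a list of (kind, value) pairs, where kind is one of
    "word", "space", or "comment".
    """
    out = []
    pending_space = False  # the "optionally ignorable thing" cached
    for kind, value in nodes:
        if kind == "comment":
            # A second definitely-ignored thing: drop the cached
            # space and start over.  This is the suspected bug.
            pending_space = False
        elif kind == "space":
            if pending_space:
                # Space followed by another space: at least one must
                # be presented, so present it before the next cycle.
                out.append(" ")
                pending_space = False
            else:
                pending_space = True
        else:  # a word: flush any cached space, then present it
            if pending_space:
                out.append(" ")
                pending_space = False
            out.append(value)
    return "".join(out)

# The markup from the data: URL above, as a node stream:
# This<!-- --> <!-- -->is<!-- --> a <!-- --> <!--
nodes = [
    ("word", "This"), ("comment", None), ("space", " "),
    ("comment", None), ("word", "is"), ("comment", None),
    ("space", " "), ("word", "a"), ("space", " "),
    ("comment", None), ("space", " "), ("comment", None),
]
print(extract_text(nodes))  # "Thisis a": first two words joined
```

Only "This" and "is" get joined, exactly as observed, because theirs is the only gap containing the comment-space-comment pattern that defeats the cache.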