After months of activity in this community, I have discovered a lot of interesting discussions and opinions on GitHub while looking through several NVDA issues that have been open for more than five years. As most of you probably already know, there are over 2,250 open issues at this time. Unfortunately, many of these discussions are no longer being updated. There are many issues that are actually solved, or probably solved, but are still open. Many people on this list created various NVDA issues on GitHub, but have not followed up on them and have not tested whether they still occur with recent NVDA versions.
I would like to encourage everyone who has created issues on GitHub: please answer our questions when we comment on your issues. If you no longer use the software you created issues for, please take ten minutes of your time, install the latest version, and try to reproduce the issue again. Please comment on your issues and tell us whether they are still reproducible. Speaking for myself, I sometimes spend hours restarting discussions on old issues or trying to get answers from the people involved, but most of them do not respond. This is frustrating, because in one or two years someone else will ask the same questions, and in the end we will have far too many issues that are just noise or of low quality.
This message is also for all developers who are or were involved in old issues. Please take a few minutes to continue discussions or answer our questions when we are trying to decide how to deal with a certain issue. This helps us determine whether an issue can be closed or not.
The issue page for NVDA on GitHub is very overloaded, which also makes it difficult to attribute certain pull requests to certain issues. Let us work together and turn the quantity of issues into quality. This will make it easier for developers to see where they can contribute. For users, a better overview will lead to fewer duplicate issues and less frustration.
Thank you very much for your attention.