Re: Portable version degrading

Hi,
Fragmentation will happen as long as new information is written to places that make fast reading harder later on. Also, while a program is running, the operating system will still need to access things on disk whenever the program asks for something that isn't already in memory.
As for swapping configurations: in theory, yes, as long as the versions are compatible enough not to cause visible side effects. For example, swapping configurations between stable and next branches could cause problems, because settings or data required by next snapshots might not be present in the stable configuration.
As for the add-on being the culprit: could be. One thing to try, though: what if Roger runs his portable copy with all add-ons disabled? If that improves performance, then it could be an add-on; if not, we should try something else.
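For reference, one quick way to try that (assuming the portable copy lives in a folder such as C:\nvdaPortable; adjust the path to the actual location) is to launch the portable copy from a command prompt with its add-ons disabled:

    C:\nvdaPortable\nvda.exe --disable-addons

If things stay fast for a while in that mode, an add-on becomes the prime suspect.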
Implicating file systems: Roger did say this is an internal drive, hence I put more weight on possible fragmentation and data movement issues.
Cheers,
Joseph

-----Original Message-----
From: nvda@nvda.groups.io [mailto:nvda@nvda.groups.io] On Behalf Of Didier Colle
Sent: Friday, January 19, 2018 3:40 PM
To: nvda@nvda.groups.io
Subject: Re: [nvda] Portable version degrading

Dear Joseph, Roger, all,


@Joseph: I'm not sure I understand what point you are trying to make. Are you suggesting that a filesystem problem is indeed the root cause?


Trying to recapitulate a few things:

* "it can make it appear that the add on is defective or has a bug while
it really doesn't."

@Roger: for any further meaningful diagnosis, I believe a more concrete
symptom description is needed. How does such a "would-be" bug manifest
itself? Is it always the same "would-be" bug, or do many "would-be" bugs
appear randomly? When do such "would-be" bugs appear (during loading, or
during execution of the add-on)?

* "there's no file system errors"

I guess that means there are no issues with the
physical/electronic/magnetic integrity of the storage medium itself (or
that the filesystem has set bad blocks aside so that they are not used
anymore). If corrupted/broken blocks on the storage medium were the
root cause, something should show up in the logs, as loading the
relevant Python modules should throw an exception (and if these exceptions
are not logged, it should be possible to make them logged). Therefore, I
dismiss storage medium/filesystem corruption as the root cause of the
above-mentioned "would-be" bugs (assuming "bugs" are to be interpreted as
broken functionality).
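(As an illustration only: if more detail is needed in the log, NVDA can be started with a higher log verbosity and a dedicated log file from the command line, something like the following. The portable folder and log path are made-up examples, and the exact switch names and level values should be checked against the command line options section of the NVDA user guide:

    C:\nvdaPortable\nvda.exe --log-level=10 --log-file=C:\temp\nvda-portable.log

where log level 10 corresponds to debug-level logging.)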


* "I also notice a few functions of nvda either don't work at all or
nvda gets very sluggish in responsiveness"
@Roger: again, for any meaningful diagnosis, please provide a more concrete
symptom description. Exactly which functions are you speaking about?
What does "not work at all" mean exactly: sluggishness with extremely
long or even infinite response times? Or do you get errors? Or ...?
Is the sluggishness general, or does it happen only in those specific
functions? What do you mean by sluggishness: a response within a second?
A few seconds? A minute or more? When does sluggishness happen: when
loading the add-on/modules, or continuously, or ...?
* "... nor any fragmenting.". Statement from Joseph: "In case of Roger's
issue: a possible contributing factor is constant add-on updates. He
uses an add-on that is updated on a regular basis, .. ..., potentially
fragmenting bits of files ..."
These two statements appear contradictory to me. Fragmentation may be a
root cause of sluggishness, but only when access to the storage medium is
needed, not during general execution, which typically takes place from
RAM rather than from disk. Therefore, fragmentation issues appear very
unlikely to me.

* "while the installed version is always stable as a rock." and "I use
the portable copy to test a couple add ons"
@Roger: how much do you use each of them? How much usage does it
take before the portable copy gets degraded?
The two statements suggest there is a problem with the portable copies.
However, nobody else seems to be experiencing the same problem.
Thus, I would translate this into the following question that you would
need to test/investigate further: is there a conflict between the
portable copies and your specific system setup, or is the issue caused
by the add-ons under test?
To test the former possibility, why not use a fresh portable copy
replicating the setup of your installed version, instead of that
installed version, for a while?
To test the latter would probably require moving the add-on testing
to the installed version: I guess you are using the portable version for
this purpose exactly to avoid messing up the installed version. Would
it be possible to do the testing in, for example, a virtual machine,
so that you can test on an installed copy instead of a portable one,
while not messing up your main system with this testing?
Joseph, anyone else: is there a (possibly more cumbersome) way to
perform testing on an installed version while keeping at all times the
possibility of reverting to a stable/clean situation? (E.g., having a
.bat script that swaps the configuration file and add-on directories
between stable and testing versions and that can easily be executed
between exiting NVDA and restarting it? A rough sketch of such a script
follows below.)
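To make that last suggestion concrete, here is a rough, untested sketch of such a swap script. It assumes the installed NVDA keeps its user configuration (including the addons subdirectory) under %APPDATA%\nvda, that a known-good copy of that configuration has been saved to C:\nvda-stable-config beforehand, and that all paths are on the same drive; these are assumptions about the setup, so adjust before use:

    rem swap-nvda-config.bat: swap the live NVDA user configuration with a saved copy.
    rem Run only after exiting NVDA and before restarting it.
    rem Assumption: the installed NVDA stores its configuration in %APPDATA%\nvda.
    set LIVE=%APPDATA%\nvda
    set STABLE=C:\nvda-stable-config
    set SWAPTMP=%TEMP%\nvda-config-swap
    rem Three-way move so the two directories simply change places.
    rem (move cannot move a directory across drives, hence the same-drive assumption.)
    move "%LIVE%" "%SWAPTMP%"
    move "%STABLE%" "%LIVE%"
    move "%SWAPTMP%" "%STABLE%"

Running the same script again swaps the two configurations back.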
If none of the above options is tried, my suggestion would then be
to regularly take snapshot copies of your portable copy, so that when
degradation takes place a diff between the stable and degraded versions
can be taken and investigated.
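As a sketch of that (again, the folder names are made-up examples to adjust): a snapshot can be taken by mirroring the portable folder, and a later comparison can simply list the files that have changed since the snapshot, e.g. with robocopy's list-only mode:

    rem Take a snapshot of the portable copy (mirrors the whole tree).
    robocopy "C:\nvdaPortable" "C:\nvdaPortable-snapshot" /MIR
    rem After degradation is noticed: list-only run showing which files now differ.
    robocopy "C:\nvdaPortable-snapshot" "C:\nvdaPortable" /L /MIR /NJH /NJS

The files flagged in the second run are the ones worth diffing in detail.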

In summary, I believe:
1) a much more concrete/detailed/... symptom description is needed
before any meaningful statements regarding diagnosis are possible;
2) with the info I have, filesystem/storage medium problems/corruption
are very unlikely;
3) further testing/investigation is needed in order to support or dismiss
certain hypotheses.

Kind regards,

Didier

On 19/01/2018 18:19, Joseph Lee wrote:
Hi,
It'll depend on what type of drive it is. If it's a traditional hard drive,
it'll degrade as data moves around, creating the need for defragmentation.
This is especially the case when data is repeatedly written and the file
system is asked to find new locations to hold the constantly changing data.
In case of solid-state drives, it'll degrade if the same region is written
repeatedly, as flash memory cells have limited endurance when it comes to
erase and write cycles.
In case of Roger's issue: a possible contributing factor is constant add-on
updates. He uses an add-on that is updated on a regular basis, putting
strain on the part of the drive where the add-on bits are stored. Thus, some
drive sectors are repeatedly bombarded with new information, and one thing
operating systems will do in this case is move the new data somewhere else
on the drive, potentially fragmenting bits of files (I'll explain in a
moment). Thus one solution is to not test every add-on update, but that's a
bit risky as Roger is one of the key testers for the add-on I'm talking
about.
Regarding fragmentation and whatnot: the following is a bit geeky, but I
believe you should know how some parts of a file system (and, by
extension, operating systems) work, because I believe it'll help folks
better understand what might be going on:
Storage devices encountered in the wild are typically organized into many
parts, typically into blocks of fixed-length units called "sectors". A
sector is the smallest unit of information that the storage device can
present to the outside world; it is the granularity at which data is read
from or written to the device.
For example, when you store a small document on a hard disk drive (HDD) and
then wish to open it in Notepad, Windows will ask a module that's in
charge of organizing and interpreting data on a drive (called a file system)
to locate the sectors where the document (or the magnets or flash cells that
constitute the document data) is stored and bring it out to you. All you
see is the path to the document, but the file system will ask the drive
controller (a small computer inside hard disks and other storage devices) to
fetch data from a particular sector or region. Depending on what kind of
storage medium you're dealing with, reading from the device may involve
waiting for the platter with the desired sector to come to the attention of
a read/write head (a thin magnetic sensor used to detect or make changes to
magnetic fields) or peering inside tiny windows and extracting electrons
trapped within. That last sentence describes how hard disks and solid-state
drives, respectively, really work behind the scenes.
But storage devices are not just meant for reading things for your
enjoyment. Without a means of storing new things, they become useless.
Depending on the medium you've got, when you save something to a storage
device, the file system in charge of the device will ask the drive
controller either to find a spot on a disk full of magnets and change
some of them, or to erase an entire block of flash cells (draining the
charge trapped in them) and fill the empty block with the modified data
(including the old bits). You can imagine how tedious this can get, but as
far as your work is concerned, it is safe and sound.
Now imagine you wish to read and write repeatedly on a storage device. The
file system will repeatedly ask the drive hardware to fetch data from
specific regions, and will look for new locations to store changes. On a
hard drive, because there are a limited number of heads and it takes a while
for the desired magnetic region to come to the attention of one, reads are
slow, hence increased latency (latency refers to how long you have to wait
for something to happen). When it comes to saving things to HDDs, all the
drive needs to do is tell the read/write head to change some magnets
wherever it wishes, hence overwriting data in place is possible and easy.
But operating systems (rather, file systems) are smarter than that, as
we'll see below.
In case of solid-state drives, reading data is as simple as looking up the
address (or sector) where the electrons comprising the data you want are
stored (akin to walking down a street grid), so there is no need to wait
for a mechanical part to move into position. This is why solid-state drives
appear to respond so fast when reading something. On the other hand,
writing, or injecting electrons, is comparatively slow because the drive
needs to erase an entire block before writing new data. In other words,
just changing a letter in a document and saving it to an SSD involves a lot
of work, hence SSDs are slower at writing than at reading; but thanks to
the underlying technology, they are still way faster than hard disks.
As hinted above, file systems are smarter than drive controllers to some
extent. If data is written to a drive, the drive controller will process
whatever comes along its path. But file systems won't let drive
controllers get away with that: file systems such as NTFS (New Technology
File System) will schedule data writes so they'll have minimal impact on the
lifespan of a storage device. For hard disks, the file system will try its
best to tell the drive to store file data in consecutive locations in one
big batch, but that doesn't always work. For SSDs, the file system will ask
the drive to store new information in different cells so all regions are
used equally (at least for storing new information; this is called wear
leveling). One way to speed things up is asking the drive to reorganize data
so file fragments can be found in consecutive sectors, or to trim deleted
regions so fresh information can be written to more blocks (for HDDs and
SSDs, respectively); this operation is itself tedious and produces bad
results if not done correctly and carefully.

I do understand the above explanation is a bit geeky, but I believe you need
to know some things about how things work. It is also a personal exercise to
refresh my memory on certain computer science topics (I majored in it not
long ago, and my interests were mostly hardware and operating systems, hence
I was sort of naturally drawn to screen reader internals and how screen
readers interact with system software).
Cheers,
Joseph

-----Original Message-----
From: nvda@nvda.groups.io [mailto:nvda@nvda.groups.io] On Behalf Of Roger
Stewart
Sent: Friday, January 19, 2018 7:58 AM
To: nvda@nvda.groups.io
Subject: Re: [nvda] Portable version degrading

The problem with this discussion is my portable version is on an internal
hard drive. So why is this degrading?

Nothing else on this drive has any trouble and I've checked, and there's no
file system errors nor any fragmenting.


Roger

On 1/19/2018 8:28 AM, Antony Stone wrote:
USB drives do need to be unmounted before removing them, otherwise there is
the risk of file system corruption. Precisely the same is true for external
hard drives, floppy disks, or any other writeable medium you can temporarily
attach to a computer.

I've never seen a USB thumb drive fall apart, and I think they're
considerably more robust than floppy disks, which is basically what they
replaced. You can also drop them on the floor with a good deal more
confidence of them working afterwards than if you drop an external hard disk.

Yes, they're vulnerable to static electricity; that's why most of them have
plastic caps to put over the contacts or a slider to retract the contacts
into the body.

My experience is that if they're treated reasonably they work very well. If
they're mistreated they'll give as many problems as any other mistreated
storage medium.


Antony.

On Friday 19 January 2018 at 15:17:36, tonea.ctr.morrow@faa.gov wrote:

A few years back, I had a job for three years where people brought me their
files on USB thumb drives. These things are horrible in terms of long life.
They really do have to be unmounted prior to removing them from the
computer or they get corrupted. They physically fall apart easily. And the
hardware inside seems to be more vulnerable to static electricity data loss
than other portable drives, certainly more vulnerable than most computers.



I would think that would be the problem.



Tonea



-----Original Message-----

I've noticed over the past couple years that my portable install of nvda
will sometimes degrade or get a bit corrupted over time all by itself,
while the installed version is always stable as a rock. Does anyone know
why this is, and is there any way to prevent this from happening? I use
the portable copy to test a couple add ons, and if the portable version
corrupts, it can make it appear that the add on is defective or has a bug
while it really doesn't. Deleting the portable copy and making a new one
will clear it up. I also notice a few functions of nvda either don't work
at all or nvda gets very sluggish in responsiveness, and this all gets back
to normal after a complete flush and remake of the portable version. As I
say, this never has happened at all with my installed copy on the same
computer.





Roger