Opened 11 years ago
Closed 9 years ago
#8249 closed Bug report (outdated)
when downloading a directory tree, a random part of it may not be downloaded
|Reported by:|matteo sisti sette|
|Owned by:||
|Component version:||
|Operating system type:|Linux|
|Operating system version:|Ubuntu 12.04|
I right-clicked on a directory (that contained a lot of subdirectories with thousands of files) on the server and selected "download".
During the download process I lost the connection a few times and sometimes had to restart FileZilla (because of another bug: sometimes you can't resume processing a queue until you restart FileZilla), but I always resumed afterwards; that shouldn't be a problem at all. I never deleted anything from the queue, nor did I ever receive an error prompt to which I could have given the wrong answer.
After hours of downloading, the queue was finally empty and FileZilla was apparently done downloading.
Then I found out that more than half of the directory tree simply HADN'T BEEN DOWNLOADED.
I don't know whether the missing directories never got into the queue (I'm ALMOST sure this was the case) or whether they were queued and then failed to download, but in either case, SILENTLY failing to download is a disaster.
If a queue becomes empty and you haven't explicitly deleted anything from it, you must be 100% sure that everything in it has been downloaded. And if you download a folder, you must have a 100% guarantee that all of its contents get added to the queue.
I SUSPECT that the queue is built (i.e. the directory tree is scanned) while the download is ongoing, so with huge directories, if the download is interrupted and resumed, the scanning of the directory tree may not be resumed properly. If that is the case, then that is the bug.
Either the whole scanning of directories and queuing of files should be done prior to starting to download, OR actions should be taken to ensure that in case of interruptions the scanning is resumed reliably.
It is a ridiculous assumption that a directory tree will always be downloaded from beginning to end in one uninterrupted session, without the slightest network error or a restart of FileZilla.
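The first of the two remedies proposed above (enumerating the whole remote tree before any transfer starts) can be sketched roughly like this. This is only an illustration, not FileZilla's actual code: `list_dir` is a hypothetical listing callback (with Python's ftplib it could wrap `FTP.mlsd()`), and `build_queue` stands in for the client's queue builder.

```python
from collections import deque

def build_queue(list_dir, root):
    """Fully enumerate a remote tree BEFORE any transfer starts.

    `list_dir(path)` must return (subdirectories, files) for `path`.
    Because every directory is listed up front, an interrupted download
    can never leave whole subdirectories silently unqueued: the queue
    already contains every file, and resuming just works through it.
    """
    queue = []                # every file found, in traversal order
    pending = deque([root])   # directories still to be listed
    while pending:
        directory = pending.popleft()
        subdirs, files = list_dir(directory)
        pending.extend(f"{directory}/{d}" for d in subdirs)
        queue.extend(f"{directory}/{f}" for f in files)
    return queue
```

The trade-off is that transfers cannot begin until the (possibly slow) full listing completes, which is presumably why the interleaved approach was chosen in the first place.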
Change History (6)
comment:1 by , 11 years ago
comment:2 by , 11 years ago
That's definitely the problem. The scanning of the directory tree is done while downloading (after one or a few subdirectories have been downloaded, new ones are scanned and their files queued). So, if for whatever reason the download is interrupted, the scanning is too, and it is never resumed.
This renders FileZilla almost useless for any serious use other than manually downloading a few single files or very small directories (the latter always with a small risk of silently failing to download part of the contents without detecting it).
If you need to download a big directory tree, you have to manually split the operation into smaller subtrees to minimize the risk of failure, and when done, manually check, directory by directory, that everything has been downloaded. Which is obviously unworkable.
Whenever a long operation is initiated (in this case, downloading a directory tree), it is of VITAL importance to have a 100% guarantee that
EITHER it will be completed successfully, no matter how long it takes and no matter how many times your network crashes or you restart the client in the meantime,
OR, whenever a problem is encountered, the user is given options to resume the process without restarting from scratch.
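The second guarantee described above could be provided by checkpointing the scan itself, so an interrupted traversal resumes instead of silently abandoning the unscanned part of the tree. A minimal sketch follows; the `list_dir` callback, the state-file name, and the JSON checkpoint format are all my own assumptions, not FileZilla's actual mechanism.

```python
import json
import os
from collections import deque

def resumable_scan(list_dir, root, state_path="scan_state.json"):
    """Scan a remote tree with a persistent checkpoint.

    `list_dir(path)` returns (subdirectories, files) for `path`.
    The directories still to be listed, plus the files queued so far,
    are written to `state_path` after every directory, so a crash,
    network drop, or client restart picks up where the scan left off.
    """
    # Resume from a previous run if a checkpoint exists.
    if os.path.exists(state_path):
        with open(state_path) as f:
            state = json.load(f)
        pending, queued = deque(state["pending"]), state["queued"]
    else:
        pending, queued = deque([root]), []

    while pending:
        directory = pending.popleft()
        subdirs, files = list_dir(directory)
        pending.extend(f"{directory}/{d}" for d in subdirs)
        queued.extend(f"{directory}/{f}" for f in files)
        # Write the checkpoint atomically so it is never half-written.
        tmp = state_path + ".tmp"
        with open(tmp, "w") as f:
            json.dump({"pending": list(pending), "queued": queued}, f)
        os.replace(tmp, state_path)

    os.remove(state_path)  # scan finished; the checkpoint is obsolete
    return queued
```

Checkpointing after every directory keeps the window for lost work to a single listing, at the cost of one small disk write per directory.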
comment:3 by , 11 years ago
This seems to have happened in earlier versions of FileZilla as well, even without(?) any other noticed crashes/restarts; see #4061.
comment:4 by , 11 years ago
comment:5 by , 9 years ago
|Status:||new → moreinfo|
Are you able to reproduce this behavior with the latest version of FileZilla?
If so, could you please detail your setup and the steps to reproduce?
comment:6 by , 9 years ago
|Status:||moreinfo → closed|
No reply for more than 28 days.
I can confirm that's definitely the case.