Opened 17 years ago

Last modified 14 years ago

#160 closed Bug report

Lockups On Large File Structures

Reported by: overcast Owned by:
Priority: high Component: Other
Keywords: Cc: overcast, renhoek, booyeah451, codeine, Tim Kosse
Component version: Operating system type:
Operating system version:

Description

If I attempt to transfer an entire web site directory with
many nested directories and files, FileZilla tries to
queue up every single file and appears to overflow
whatever buffer it has for the queue listing. Why bother
queuing every single filename? It has to go and search for
every single file before transferring anything, so even if
it didn't freeze up, I'd have to sit there for ten minutes
while it queues everything.

Change History (4)

comment:1 Changed 17 years ago by renhoek

I have to agree with this 'bug'. I tried to download a
LARGE structure with hundreds of directories, and even after
an eternity it _still_ wasn't done indexing the directories.

In my opinion this is a large waste of time and bandwidth. I'd
rather have an implementation that just starts downloading
and descends into a directory when it encounters one. (This is
also CuteFTP Pro's behaviour.)

At least make it a switchable option :)

comment:2 Changed 17 years ago by booyeah451

I ran into the same problem. This is a big deal if FileZilla
is going to be useful for mirroring large web sites.

comment:3 Changed 16 years ago by codeine

I have a similar problem: with a file/dir list of about 15,000
entries it displays correctly, but it locks up when saving
(nothing is saved).

comment:4 Changed 14 years ago by Tim Kosse

This bug report has been closed due to inactivity and has possibly
already been solved.

You can reopen this report if the issue still exists in the
latest version of FileZilla (Server).
