Custom Query (8105 matches)

Results (160 - 162 of 8105)

#7862 | Resolution: outdated | Summary: SCTP protocol support | Reporter: CSRedRat
Description

Protection and reliability above all else. Information plays a significant role nowadays, so a fitting addition would be support for SCTP (Stream Control Transmission Protocol). It offers protection against SYN-flood attacks, safe connection establishment (via a four-way handshake), and pleasant innovations such as preservation of message boundaries, multi-streaming, unordered delivery, and support for multiple interfaces (multi-homing). Implementing this protocol would bring a new level of speed, reliability, and security to data transmission over the network, because SCTP was designed with TCP's shortcomings in mind, to make full use of the network's capabilities, with special attention paid to safety and security.
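
For illustration, a minimal sketch of opening a one-to-one SCTP connection through the standard BSD sockets API, assuming a Linux kernel built with SCTP support; the host and port are placeholders:

{{{
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <unistd.h>
#include <cstdio>

int main() {
    // One-to-one style SCTP socket: the same call pattern as TCP,
    // but with IPPROTO_SCTP selecting the SCTP transport.
    int fd = socket(AF_INET, SOCK_STREAM, IPPROTO_SCTP);
    if (fd < 0) { perror("socket"); return 1; }

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(2121);                      // placeholder port
    inet_pton(AF_INET, "192.0.2.1", &addr.sin_addr);  // placeholder host

    // connect() triggers SCTP's four-way handshake (INIT, INIT-ACK,
    // COOKIE-ECHO, COOKIE-ACK); the cookie mechanism is what defeats
    // SYN-flood style attacks.
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof addr) < 0) {
        perror("connect");
        close(fd);
        return 1;
    }
    close(fd);
    return 0;
}
}}}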

It is high time to promote SCTP everywhere, given the current active movement towards IPv6. Another good protocol, SPDY, also borrows a lot of goodies from it. The development of technology cannot be held back, and modern protocols need to be pushed to the masses. It remains to convince Microsoft of the usefulness and necessity of these protocols and to persuade them to implement them intensively. By the way, Firefox is already implementing SPDY support (expected in version 11), and support is also coming to the popular web server nginx.

#11620 | Resolution: rejected | Summary: Cached/displayed file size wrong after reboot - causes resume to start at wrong place with larger files | Reporter: Michael Shepard
Description

When downloading 100 GB backup files, the overwrite/resume prompt window shows old (cached?) size information for the local files after an unexpected reboot during a long download (a forced Windows Update reboot).

The Local site file listing also shows old data. Hitting F5 multiple times to refresh sometimes updates the window to the current file size. Unless the user knows the listing has to be refreshed, FileZilla will redownload large parts of the file.

Event sequence:
1. Downloads start at 9 am, after the backup finishes.
2. The file download progresses to 80 GB.
3. An unexpected reboot happens at night.
4. The machine is restarted and FileZilla is reloaded.
5. The Local site listing and the overwrite/resume window show the file progress at 50 GB (the last server reconnect, perhaps?) and the download resumes from the 50 GB point.
6. Windows Explorer shows the file was 70 GB before the resume.
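
Speculatively, the stale size suggests the resume offset is taken from a cached directory listing rather than a fresh stat of the file. A minimal sketch of querying the on-disk size at prompt time, using std::filesystem (the helper name is hypothetical, not FileZilla's actual code):

{{{
#include <filesystem>
#include <cstdint>

// Hypothetical helper: always stat the file on disk when showing the
// overwrite/resume prompt, instead of reusing a size cached before the reboot.
std::uintmax_t CurrentResumeOffset(const std::filesystem::path& localFile) {
    std::error_code ec;
    std::uintmax_t size = std::filesystem::file_size(localFile, ec);
    return ec ? 0 : size;  // no readable partial file -> start from the beginning
}
}}}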

Thanks for the great tool!

From About:

FileZilla Client


Version: 3.33.0

Build information:

Compiled for: x86_64-w64-mingw32
Compiled on: x86_64-pc-linux-gnu
Build date: 2018-05-07
Compiled with: x86_64-w64-mingw32-gcc (GCC) 6.3.0 20170516
Compiler flags: -g -O2 -Wall

Linked against:

wxWidgets: 3.0.5
SQLite: 3.22.0
GnuTLS: 3.5.18

Operating system:

Name: Windows 10 (build 16299), 64-bit edition
Version: 10.0
Platform: 64-bit system
CPU features: sse sse2 sse3 ssse3 sse4.1 sse4.2 avx aes pclmulqdq rdrnd
Settings dir: C:\Users\media\AppData\Roaming\FileZilla\

#11621 | Resolution: rejected | Summary: Faster downloads online with simultaneous download connections of large files divided into chunks | Reporter: Michael Shepard
Description

When downloading online (not from a local network server), users face various limiters that cap download speed per connection: the host website/server, ISPs, the company router, traffic shapers, and shared Wi-Fi access points.

Typically we find that, when downloading backups, FileZilla quickly works through all the small files using up to 10 connections (Settings > Transfers > Maximum simultaneous transfers), and then sits there for hours using only one connection per file, slowly downloading the last (say 2) large files, since each single connection is rate-limited externally.

Suggestion: divide files larger than x (a new setting) into chunks. Divide the file size evenly by the number of maximum simultaneous connections and place each chunk in the transfer queue as a separate file, with each chunk processed in the queue like a regular file download. Use the existing preallocate-space-for-download feature to set up the file for writing, or stitch the chunks together afterwards.

Multiple connections will then download different parts of the same file. If the other files in the transfer have a higher priority, then once a queue connection runs out of regular files to download and becomes idle, it starts on the next chunk of the large file. Alternatively, the user can use the existing priority feature to go after the large file first.

Example: if the file is 100 GB and there are 10 maximum simultaneous connections, then connection #2 downloads from the 10 GB mark to the 20 GB mark, and so on.
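
A minimal sketch of the chunk arithmetic described above (the names are illustrative, not FileZilla internals); each resulting range could then be requested over FTP with REST <offset> before RETR:

{{{
#include <cstdint>
#include <vector>
#include <cstdio>

struct Chunk { std::uint64_t offset; std::uint64_t length; };

// Split a file of `size` bytes into `n` near-even chunks; the first
// `size % n` chunks absorb the remainder so every byte is covered exactly once.
std::vector<Chunk> SplitIntoChunks(std::uint64_t size, unsigned n) {
    std::vector<Chunk> chunks;
    std::uint64_t base = size / n, extra = size % n, offset = 0;
    for (unsigned i = 0; i < n; ++i) {
        std::uint64_t len = base + (i < extra ? 1 : 0);
        chunks.push_back({offset, len});
        offset += len;
    }
    return chunks;
}

int main() {
    // 100 GB file, 10 connections: chunk #1 (i.e. connection #2) covers
    // the 10 GB..20 GB range, matching the example above.
    auto chunks = SplitIntoChunks(100ull * 1000 * 1000 * 1000, 10);
    std::printf("chunk 1: %llu..%llu\n",
                (unsigned long long)chunks[1].offset,
                (unsigned long long)(chunks[1].offset + chunks[1].length));
    return 0;
}
}}}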

External to FileZilla, the user would need to connect to a remote server that allows multiple simultaneous connections, which is typically already the case. Most sites we use let us download 10 files at the same time.

Please ask me if you have any questions. Thanks for the great program!
