I had to restore a backup from Amazon S3 that was created with rsync. There are deep folder hierarchies with hundreds of files. Cyberduck is one of the few tools that can connect to S3 and download a folder hierarchy. But it's excruciatingly slow, and not on account of the network speed.
In particular, there seems to be a very long delay after the file tree has been retrieved. First Cyberduck appears to walk the file tree and build a list. Then it appears to be sorting or doing something in the background. There is no UI feedback for minutes before the actual download starts. The download itself also crawls along. I'm not sure whether there's code that isn't optimized for large folder hierarchies, or whether the S3 API gets in the way.
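One possible contributor to the listing delay (this is speculation on my part, not a confirmed diagnosis of Cyberduck's code): S3 has no real directories, so a client that walks the "tree" recursively issues one LIST request per pseudo-folder, while a single paginated LIST over the common prefix returns up to 1,000 keys per round trip. A rough request-count comparison, with a hypothetical directory count:

```python
import math

def list_requests_per_directory(num_dirs: int) -> int:
    # Model of a recursive tree walk: one LIST round trip per
    # "directory" (hypothetical; actual client behavior may differ).
    return num_dirs

def list_requests_flat_prefix(num_files: int, page_size: int = 1000) -> int:
    # A paginated LIST over the shared prefix returns up to `page_size`
    # keys per request (S3 ListObjectsV2 caps pages at 1,000 keys).
    return math.ceil(num_files / page_size)

# 4,703 files is from the report; ~500 directories is an assumed figure.
tree_walk = list_requests_per_directory(500)
flat = list_requests_flat_prefix(4703)
print(tree_walk, flat)  # 500 vs. 5 round trips
```

If the client really does walk directory by directory, that alone is two orders of magnitude more round trips before the first byte is downloaded.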
Copying a folder hierarchy with 4,703 files totaling 248 MB took about 41 minutes! As a single file, this would have finished in under a minute with the exact same configuration.
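A back-of-the-envelope model suggests per-object overhead, not bandwidth, dominates here. The overhead and bandwidth figures below are assumptions chosen only to illustrate the shape of the problem; the file count and total size are from my measurement above:

```python
# Hypothetical model: each GET costs a fixed per-request overhead
# plus its share of the transfer time.
PER_REQUEST_OVERHEAD_S = 0.5   # assumed round-trip cost per object
TOTAL_MB = 248                 # measured total size
NUM_FILES = 4703               # measured file count
BANDWIDTH_MBPS = 5             # assumed sustained throughput in MB/s

transfer_s = TOTAL_MB / BANDWIDTH_MBPS
serial_s = NUM_FILES * PER_REQUEST_OVERHEAD_S + transfer_s

# N concurrent connections amortize the per-request overhead:
concurrency = 8
parallel_s = NUM_FILES * PER_REQUEST_OVERHEAD_S / concurrency + transfer_s

print(round(serial_s / 60), round(parallel_s / 60))  # ~40 vs. ~6 minutes
```

Under these assumed numbers a strictly serial download lands right around the 41 minutes I observed, while even modest parallelism would cut it to a few minutes, so downloading small objects concurrently seems like the obvious lever.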
When restoring backups, time is of the essence. It would be fantastic if this could be optimized.