
cache of directory listings needed for resume to be useful #1047

Closed · cyberduck opened this issue Feb 24, 2007 · 1 comment
toomanyhandles created the issue

Latest Cyberduck as of 02/23/07, on Mac OS X 10.4.8, connecting to OpenSSH on a NetBSD server for SFTP transfers.

6 gigs of data in 5 directories (/home users), over wireless, so disconnects sometimes happen.

When a disconnect happens, instead of resuming handily, Cyberduck has to parse the entire remote file structure for directory listings again. This takes so long that I haven't had time to wait and watch it finish, and I'm not even convinced that resume (the green button) works properly; judging by the total I've downloaded so far, it could be starting completely from scratch (it seems like I've downloaded the first 2 gigs repeatedly).

Shouldn't the directory listings be cached so that a resume can actually resume? It seems to me that is how ncftp3 and other clients work when a disconnect happens and a resume command is issued.
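
For illustration, here is a minimal sketch of the kind of cache being asked for, assuming a hypothetical `ListingCache` class and `DirectoryLister` callback (not Cyberduck's actual API): listings are kept in a map keyed by remote path, and a resume consults the map before issuing a new listing request.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: cache remote directory listings so a resumed
// transfer can reuse them instead of re-listing the whole tree.
public class ListingCache {
    private final Map<String, List<String>> listings = new HashMap<>();

    // Return the cached listing, or fetch and cache it on a miss.
    public List<String> list(String path, DirectoryLister lister) {
        return listings.computeIfAbsent(path, lister::list);
    }

    // Drop a single entry, e.g. after the directory changed remotely.
    public void invalidate(String path) {
        listings.remove(path);
    }

    @FunctionalInterface
    public interface DirectoryLister {
        List<String> list(String path);
    }
}
```

On resume, every directory already listed during the previous attempt would then be answered from memory, and only directories that were never reached would trigger a round trip to the server.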

@dkocher commented

Caching takes up too much memory.
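
The memory cost could also be capped rather than paid in full. As a sketch (again hypothetical, not Cyberduck's implementation), Java's `LinkedHashMap` can evict the least-recently-used listing once a fixed number of entries is reached:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: bound memory use by evicting the least-recently-used
// listing once the cache exceeds a fixed number of entries.
public class BoundedListingCache extends LinkedHashMap<String, List<String>> {
    private final int maxEntries;

    public BoundedListingCache(int maxEntries) {
        super(16, 0.75f, true); // access-order iteration gives LRU semantics
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, List<String>> eldest) {
        return size() > maxEntries;
    }
}
```

With a cap of a few thousand entries, such a cache stays small while still covering the directories a resumed transfer is most likely to revisit.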

@iterate-ch iterate-ch locked as resolved and limited conversation to collaborators Nov 26, 2021