
Connection closed due to timeout while transferring extremely long files #1196

Closed
cyberduck opened this issue Jul 31, 2007 · 3 comments
Labels: bug, sftp (SFTP Protocol Implementation), wontfix

Comments

90e8093 created the issue

There is a problem transferring very large files (multiple gigabytes in size): the connection gets closed partway through the transfer. My assumption is that a timeout on the server side closes the whole connection when no command is sent for a given time (is there a separate command channel?). It is not a problem with the current file position, because the amount transferred before the failure is always less than 2 GB, so it is not a 32-bit integer limitation.


@dkocher commented

Have you tried the setting in Preferences > Connection > Send no operation to keep connection alive?


90e8093 commented

Yes, I tried this setting (I tested it both enabled and disabled).


@dkocher commented

You might want to try an SCP transfer, or find out which setting is causing the server to close the connection. (There is no separate command connection in SSH.)
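
For reference, below is a minimal sketch of the kind of client-side keep-alive that the "Send no operation to keep connection alive" preference refers to. It uses the JSch library, which is not necessarily what Cyberduck uses internally; the host, credentials, and file paths are placeholders for illustration only.

```java
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class KeepAliveUpload {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        // Placeholder host and credentials for illustration only.
        Session session = jsch.getSession("user", "sftp.example.com", 22);
        session.setPassword("secret");
        session.setConfig("StrictHostKeyChecking", "no"); // demo only

        // Send an SSH keep-alive message every 30 seconds and give up after
        // 5 missed replies. This traffic travels over the single multiplexed
        // SSH connection (there is no separate command channel), which can
        // prevent servers or intermediate firewalls/NAT devices from dropping
        // a connection they consider idle.
        session.setServerAliveInterval(30 * 1000);
        session.setServerAliveCountMax(5);

        session.connect();
        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        try {
            // A multi-gigabyte upload; the transfer data and the keep-alive
            // messages share the same SSH connection.
            sftp.put("/local/huge-file.iso", "/remote/huge-file.iso");
        } finally {
            sftp.disconnect();
            session.disconnect();
        }
    }
}
```

Whether this helps depends on what is actually closing the connection: keep-alives address idle timeouts on the server or on middleboxes, not limits such as a server-enforced maximum session duration.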

@iterate-ch iterate-ch locked as resolved and limited conversation to collaborators Nov 26, 2021