
Downloads of 850MB or larger files fail on S3 with "access denied" error when "Segmented downloads with multiple connections per file" is checked #11581

Closed
cyberduck opened this issue Feb 8, 2021 · 3 comments


@cyberduck (Collaborator)

9103a4d created the issue

With the default setting (under Preferences... Transfers... General tab, Downloads section) "Segmented downloads with multiple connections per file" checked, attempting to download a file larger than about 850MB fails after just a few seconds with an "Access Denied" error. Files of 800MB or smaller download successfully.

When I uncheck the "Segmented downloads with multiple connections per file" setting, I am able to download large files (850MB to 3GB range).

My guess is that this error is occurring when trying to open multiple local files for the different download file pieces under Windows. Too many files open, or too many connections? It is not an issue with the S3 access permissions. All object permissions in the S3 bucket are the same.

Side notes: I tried to turn on debug logging by adding logging=debug in a newly created default.properties file in the %APPDATA%\Cyberduck folder, and although a cyberduck.log file was created, the file remained empty. Also, under Windows the "Access Denied" dialog does not allow you to copy and paste the message details, so I am attaching screenshots of the details as a PDF.




@dkocher commented

This is possibly a duplicate of #10726. What is the length of the filename?


9103a4d commented

Interesting! The total length of the full path and filename is 129 characters. The filename itself is long: 103 characters. Still, it's hard to understand why this would exceed the Windows limits: NTFS allows 255 characters for an individual file or folder name, and the Win32 API caps the total path length at 260 characters (MAX_PATH) by default. (Per this article, that default limit can be lifted to support paths up to 32,767 characters.)

Ah, I see that Cyberduck's temporary download path effectively more than doubles the filename length. With a download folder of "C:\Users\Martin\Downloads\" and a 103-character filename of "20200827_this-is-an-extremely-very-very-very-very-very-long-big-lengthy-filename-here_1of1_1080p_en.mp4", Cyberduck first creates a download folder named:

C:\Users\Martin\Downloads\20200827_this-is-an-extremely-very-very-very-very-very-long-big-lengthy-filename-here_1of1_1080p_en.mp4.cyberducksegment\

but then cannot create the first segment file, whose full path is well past the 260-character default limit:

C:\Users\Martin\Downloads\20200827_this-is-an-extremely-very-very-very-very-very-long-big-lengthy-filename-here_1of1_1080p_en.mp4.cyberducksegment\20200827_this-is-an-extremely-very-very-very-very-very-long-big-lengthy-filename-here_1of1_1080p_en.mp4-1.cyberducksegment
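The arithmetic above can be checked with a short sketch. This is pure string manipulation following the naming convention observed in the paths above (Cyberduck's actual internal code differs); no files are touched on disk:

```python
# Reconstruct the segment path Cyberduck attempts to create, following the
# observed naming convention, and compare it against the classic Windows
# MAX_PATH limit. No filesystem access; this is plain string arithmetic.
download_dir = r"C:\Users\Martin\Downloads"
filename = ("20200827_this-is-an-extremely-very-very-very-very-very-"
            "long-big-lengthy-filename-here_1of1_1080p_en.mp4")

# Temporary folder: <filename>.cyberducksegment
segment_dir = f"{download_dir}\\{filename}.cyberducksegment"
# First segment inside it: <filename>-1.cyberducksegment
first_segment = f"{segment_dir}\\{filename}-1.cyberducksegment"

MAX_PATH = 260  # default Win32 path limit, including the terminating NUL

print(len(filename))                  # 103
print(len(first_segment))             # comfortably past MAX_PATH
print(len(first_segment) > MAX_PATH)  # True
```

Because the filename appears once in the folder name and again in each segment name, any filename over roughly 110 characters pushes the segment path past the default limit even in a shallow download folder.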

To further test this, I tried "Download as..." and gave the downloaded file a very short name. Indeed, with a short name, the download completed without error even with "Segmented downloads with multiple connections per file" checked.

I've configured my Windows registry (Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem\LongPathsEnabled set to 1) to allow long paths. However, I was still not able to download the long filename after restarting Cyberduck. It appears that Cyberduck does not declare long-path support via the longPathAware element in its application manifest (per this article)?

Alternatively, since most Windows users will not make the registry change needed to support long paths, could Cyberduck use a different naming convention for its download segments, one that avoids more than doubling the filename length?
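One hypothetical naming convention (this is NOT Cyberduck's actual scheme, just an illustration of the idea) would derive the temporary folder and segment names from a short hash of the original filename, so the segment path length stops growing with the filename:

```python
# Hypothetical segment naming based on a short filename hash, so the
# temporary path stays short regardless of how long the filename is.
# Folder/suffix names are illustrative, not Cyberduck's real convention.
import hashlib

def segment_path(download_dir: str, filename: str, part: int) -> str:
    # 8 hex characters are plenty to avoid collisions within one folder
    digest = hashlib.sha1(filename.encode("utf-8")).hexdigest()[:8]
    return f"{download_dir}\\{digest}.cyberducksegment\\{digest}-{part}.segment"
```

A scheme like this would need a small manifest file inside the segment folder to map the hash back to the original filename when reassembling, but the per-segment path stays far below 260 characters even for very long filenames.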

@cyberduck
Copy link
Collaborator Author

@dkocher commented

Thanks for the thorough reply. I have reopened #10726.

@iterate-ch iterate-ch locked as resolved and limited conversation to collaborators Nov 27, 2021