Retrieve default access key from standard AWS SDK credentials for opened links #10582
Comments
This should work if you include the access key in the URI.
Replying to [comment:3 dkocher]:
Good to know, though that will not work for our use case. We are looking to send out or post s3:// URLs for our internal users so they can get access to files. We will not know their access keys so, nor can we send a single email to multiple people and have this work. Please let me know if there's anything else we can do to help out. |
We will have a fix to obtain the AWS access key from the default profile in an upcoming release.
Sounds good.
I tried this with CyberDuck 6.9.4 and it is not working for me; I'm still being prompted for the access key and secret when I click on a valid s3:// URL. I've tried downloading the same URL with the "aws s3" command and it works, and I verified that the "[default]" section of ~/.aws/credentials is valid and works with the command-line "aws s3" command.
Using that format does work for me, yay! Unfortunately it does not align with what the AWS Python client (https://aws.amazon.com/cli/) uses, and we also use that. Specifically, we send out plain s3:// URLs in the form the "aws s3" command understands. It'd be nice if CyberDuck and "aws s3" aligned, but that would mean S3 URI handling in CyberDuck would be different from all the other URIs that you support (and that doesn't seem like a good idea right off). I'll see what I can do here to make something that can tweak the URI and pass it along; that might work for us.
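The "tweak the URI and pass it along" idea could be sketched as a small script that reads the access key ID from the default profile in ~/.aws/credentials and inserts it into a plain s3:// URL. This is only a sketch: it assumes CyberDuck accepts a URL with the access key embedded as the user-info part (s3://ACCESS_KEY_ID@bucket/key), which is the format implied but not spelled out above, and both function names are hypothetical.

```python
import configparser
import os
from urllib.parse import urlsplit, urlunsplit


def default_access_key(credentials_path="~/.aws/credentials", profile="default"):
    """Read aws_access_key_id from the AWS shared credentials file."""
    config = configparser.ConfigParser()
    config.read(os.path.expanduser(credentials_path))
    return config[profile]["aws_access_key_id"]


def with_access_key(s3_url, access_key_id):
    """Insert the access key ID as the user-info part of an s3:// URL."""
    parts = urlsplit(s3_url)
    netloc = f"{access_key_id}@{parts.netloc}"
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))
```

The rewritten URL could then be handed to CyberDuck (for example via `open` on macOS); the secret key would still have to come from the keychain or the credentials file.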
Ticket retargeted after milestone deleted |
We are looking for an app that will let our users click on s3:// URLs and download the file the URL points to (like s3://bucket/path/file.txt). CyberDuck is almost there; the problem we're running into is that when the user clicks on an s3:// URL, CyberDuck opens a sheet requesting the user's access and secret keys even after those keys have been saved in the user's keychain and in their ~/.aws/credentials file. It would be much better if no additional user interaction were required and CyberDuck could pull the access and secret keys from the credentials file or from the keychain; best would be from the credentials file, with a profile set in the preferences.
Steps to reproduce:
Set up user's account so that they can use "aws s3 cp s3://PATH/TO/FILE /PATH/TO/LOCAL_DIR" to copy files from S3 to local. Nominally this means setting up ~/.aws/config and ~/.aws/credentials properly.
Run CyberDuck and have it save the user's AWS access and secret keys in the Keychain.
Set up a clickable s3:// URL (like s3://PATH/TO/FILE).
Click on the s3:// URL
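Step 1 above assumes the AWS shared credentials file is already valid for the "aws s3" command. A minimal ~/.aws/credentials with a "[default]" profile looks like this (the key values are AWS's documented placeholder examples, not real credentials):

```ini
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFimI/K7MDENG/bPxRfiCYEXAMPLEKEY
```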
Expected Results:
CyberDuck downloads the file the s3:// URL points to with no user interaction required.
Actual Results:
CyberDuck opens a sheet on the transfers window requesting the user's access and secret keys.