
#3347 closed defect (fixed)

Amazon S3 gives a "Request Error.. Not Found" message on some 'folders'

Reported by: info@… Owned by: dkocher
Priority: normal Milestone: 3.3
Component: s3 Version: 3.2.1
Severity: critical Keywords:
Cc: Architecture:
Platform:

Description (last modified by dkocher)

We use Cyberduck for our Amazon S3 buckets. Since upgrading to 3.2.1 we are getting a "Request Error.. Not Found." message when trying to open some folders within the bucket. The transcript says...

x-amz-id-2: i8P2jW5UuoHZ5iqm++iCS1TyVvyHI5pPZazXkRGDRsmbF+K7BtlaBuv+zXSKfO63[\r][\n]
Content-Type: application/xml[\r][\n]
Transfer-Encoding: chunked[\r][\n]
Date: Fri, 17 Jul 2009 16:36:56 GMT[\r][\n]
Server: AmazonS3[\r][\n]
[\r][\n]

I switched back to 3.2 and these folders open fine.

Some folders are still ok in 3.2.1. The only thing I can see is that the ones that give the error have lots of files inside them.

Attachments (2)

js_folder.png (124.7 KB) - added by anonymous on Jul 19, 2009 at 1:25:03 AM.
js_folder_desktop.png (158.7 KB) - added by info@… on Jul 19, 2009 at 1:37:48 AM.
Folder listing as downloaded in 3.2


Change History (10)

comment:1 Changed on Jul 18, 2009 at 11:00:32 AM by dkocher

  • Description modified (diff)
  • Summary changed from Amazon S£ gives a "Request Error.. Not Found" message on some 'folders' to Amazon S3 gives a "Request Error.. Not Found" message on some 'folders'

comment:2 Changed on Jul 18, 2009 at 11:07:10 AM by dkocher

Is there a specific naming pattern you can see for these folders, or other unique characteristics, that may help me to reproduce the problem?

Changed on Jul 19, 2009 at 1:25:03 AM by anonymous

comment:3 Changed on Jul 19, 2009 at 1:37:06 AM by info@…

Here is a table of the folder names, whether they open in 3.2.1 (Y/N), and the number of files they contain...

Folder    Opens in 3.2.1    Files
css       N                 38
dynamic   N                 98
files     Y                 1
flash     Y                 1
img       N                 87
js        N                 26

The problem seems to be somehow connected to zero size files.

I looked in each of the above folders...

The 'files' folder, which works in 3.2.1, contains one subfolder and nothing else. There are further subfolders within that folder.

Folder 'flash' which also works on 3.2.1 contains a single .swf file.

All the folders that do not work with 3.2.1 contain a mixture of files AND subfolders. They all also contain some files with names like ajax_$folder$ which are 0 B in size. I believe these are 'folder aliases' created by applications other than Cyberduck. We have used Interarchy and S3Fox previously. I've attached a screen shot of the js folder contents called js_folder.

To try and pin down the problem I thought I would download the js folder (being the smallest non-working example), upload it with a different name, and then delete the odd zero-byte files to see if that made any difference in 3.2.1, but I'm actually having some trouble with 3.2.

I received some errors; it doesn't seem to like the sub-folders. I got this message...

I/O Error: Cannot read file attributes /mybucket.name.edited/js/ajax S3 GET failed for '/js%2Fajax'

The forward slash between the js and the ajax is being escaped?

I got similar messages for 'ticker' and 'js'. All three are zero-byte files.
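The escaped slash in that error message is consistent with the whole key being percent-encoded as a single path segment. An S3 key such as js/ajax is one opaque string in which '/' is only a naming convention, so it must not be turned into %2F when building the request path. A minimal illustration of the difference (plain Python, not Cyberduck's actual code):

```python
from urllib.parse import quote

key = "js/ajax"

# Wrong: encoding the slash produces 'js%2Fajax', which S3 treats
# as a different (non-existent) object name -- hence "Not Found".
wrong = quote(key, safe="")

# Right: slashes inside S3 keys are literal characters and must be
# left as-is in the key portion of the request path.
right = quote(key, safe="/")

print(wrong)   # js%2Fajax
print(right)   # js/ajax
```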

However, despite the error messages, the folder did download (see the second screenshot, js_folder_desktop).

I uploaded a copy of the folder, calling it js_copy, with no errors.

I then opened this newly uploaded js_copy folder in 3.2.1 with no problem. I am still unable to open the original js folder in 3.2.1.

Changed on Jul 19, 2009 at 1:37:48 AM by info@…

Folder listing as downloaded in 3.2

comment:4 follow-up: Changed on Jul 24, 2009 at 5:07:09 PM by Paul Willis <info@…>

I've pinned this down to a zero-byte file inside folders created with Interarchy. I've tested with the latest version of Interarchy (9.0.1/5484) and this is 100% reproducible.

Create a folder in your S3 bucket using Interarchy. Put a normal file inside, in this case 'test.txt'.

Launch Cyberduck 3.2.1 open the S3 bucket, try to open the folder. "Request Error.. Not Found."

Using Cyberduck 3.2, open the S3 bucket, open the folder, see the 'test.txt' file. No problems. Notice also a zero-byte file with the same name as the parent folder.

Delete the zero-byte file. You will need to use S3Fox (or perhaps some other S3 app, but not Cyberduck, which doesn't seem to be able to delete it).

Now use Cyberduck 3.2.1 again, open the S3 bucket, open the folder, see the 'test.txt' file. No problems.

While I suspect Interarchy is to blame here, this did work in Cyberduck 3.2 but is now broken in Cyberduck 3.2.1.
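For context: S3 itself has no folders, only a flat key space; clients simulate folders with placeholder objects (Interarchy uses a zero-byte key ending in '/', other tools use names like foo_$folder$). A rough pure-Python sketch of a delimiter-based listing over the keys from the repro above shows why the placeholder is awkward: listed under its own prefix, it appears as an object whose name equals the prefix itself. This is only a simulation of S3's ListObjects semantics, not real API calls:

```python
# Flat key space mirroring the repro: Interarchy's zero-byte
# placeholder 'js/' plus one real object 'js/test.txt'.
keys = ["js/", "js/test.txt"]

def list_keys(keys, prefix="", delimiter="/"):
    """Return (contents, common_prefixes), loosely like S3 ListObjects."""
    contents, prefixes = [], set()
    for k in keys:
        if not k.startswith(prefix):
            continue
        rest = k[len(prefix):]
        if delimiter in rest:
            # Group deeper keys under a common prefix ("subfolder").
            prefixes.add(prefix + rest.split(delimiter, 1)[0] + delimiter)
        else:
            contents.append(k)
    return contents, sorted(prefixes)

# At the bucket root, 'js/' shows up only as a common prefix...
print(list_keys(keys, prefix=""))      # ([], ['js/'])
# ...but inside 'js/', the placeholder is returned as an object
# whose key equals the prefix -- the entry that trips up clients.
print(list_keys(keys, prefix="js/"))   # (['js/', 'js/test.txt'], [])
```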

comment:5 Changed on Jul 24, 2009 at 5:09:49 PM by dkocher

  • Milestone set to 3.3
  • Status changed from new to assigned

Thanks very much for the detailed report. I will look into this as soon as possible.

comment:6 in reply to: ↑ 4 Changed on Jul 26, 2009 at 4:49:05 PM by dkocher

Replying to Paul Willis <info@…>:

I've pinned this down to a zero byte file inside folders created with Interarchy.

What Interarchy does is create keys named /bucket/folder/ where the full key is folder/; if displayed hierarchically, the last component of the key is an empty string. Cyberduck always normalizes path references internally, so this placeholder actually points back to the parent key.
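This can be seen by splitting such a key on '/': the trailing slash yields an empty final component, and naive path normalization maps the key back onto its parent. A small illustrative sketch (standard-library Python, not Cyberduck's internals):

```python
import posixpath

key = "js/"                  # Interarchy's zero-byte placeholder key
parts = key.split("/")       # ['js', ''] -- empty last component

# Normalizing the placeholder collapses the trailing slash, making it
# indistinguishable from a reference to the parent 'folder' itself.
normalized = posixpath.normpath("/" + key)   # '/js'
parent = posixpath.normpath("/js")           # '/js'

print(parts)
print(normalized == parent)   # True
```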

comment:7 Changed on Jul 26, 2009 at 5:10:19 PM by dkocher

  • Resolution set to fixed
  • Status changed from assigned to closed

In r4981.

comment:8 Changed on Aug 6, 2009 at 5:38:11 PM by kiddailey

Just FYI for those that stumble on this as I did:

Be warned that you should not use 3.2 to delete the 0-byte files causing this issue, or you may lose files.

I reverted to 3.2 to try and work around this until the next update, and discovered that when I tried to delete the 0-byte file out of a folder, Cyberduck 3.2 immediately started deleting ALL files within that folder -- and it's pretty obvious why:

1.) /bucket/myfolder contains a 0-byte file called "myfolder" plus 3GB of other files ...

2.) Cyberduck issues the command "Delete /bucket/myfolder" in an attempt to delete the 0-byte file ...

3.) Poof!

4.) Spend the next few hours re-uploading 1GB worth of files that got wiped before you realized what was happening :)
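The data loss happens because the client treats the zero-byte object as a directory and expands the delete to everything under the matching prefix. The difference can be sketched in plain Python over a mock key set (hypothetical helper names, purely illustrative):

```python
keys = {"myfolder", "myfolder/a.jpg", "myfolder/b.jpg"}

def delete_exact(keys, key):
    """Remove only the object whose key matches exactly --
    the placeholder goes, the real files stay."""
    return {k for k in keys if k != key}

def delete_prefix(keys, prefix):
    """Remove the prefix key and everything 'inside' it --
    effectively what happened in the report above."""
    return {k for k in keys if k != prefix and not k.startswith(prefix + "/")}

print(delete_exact(keys, "myfolder"))   # {'myfolder/a.jpg', 'myfolder/b.jpg'}
print(delete_prefix(keys, "myfolder"))  # set() -- everything is gone
```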
