
#9585 closed enhancement (fixed)

Deleting multiple files concurrently

Reported by: drsassafras
Owned by: dkocher
Priority: normal
Milestone: 5.0.3
Component: sftp
Version: 5.0
Severity: normal
Keywords:
Cc:
Architecture:
Platform:

Description

I have been deleting some large subdirectories (10,000+ files) and it takes SO LONG! Instead of the current way Cyberduck deletes files, can it just send a unix delete command (rm -rf)?

I figure it might work the same way expanding archives does.

Thanks

Change History (6)

comment:1 Changed on Jun 6, 2016 at 8:11:54 AM by dkocher

  • Component changed from core to sftp
  • Owner set to dkocher
  • Summary changed from Deleting takes FOREVER with large subdirectories to Deleting takes long with large subdirectories

Unfortunately, there is no recursive delete operation available in the SFTP protocol.
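Because the protocol only offers per-entry remove and rmdir operations, a client has to walk the directory tree itself, deleting files first and each directory once it is empty. A minimal sketch of that traversal, using a hypothetical in-memory stand-in for an SFTP session (Cyberduck itself is Java; `FakeSftpClient` and its method names are illustrative assumptions, not Cyberduck's API):

```python
class FakeSftpClient:
    """In-memory stand-in for an SFTP session: directories are dicts
    mapping names to children, files are None."""

    def __init__(self, tree):
        self.tree = tree

    def _lookup(self, path):
        node = self.tree
        for part in [p for p in path.split("/") if p]:
            node = node[part]
        return node

    def listdir(self, path):
        # Snapshot of entry names, like an SFTP directory listing.
        return list(self._lookup(path).keys())

    def is_dir(self, path):
        return isinstance(self._lookup(path), dict)

    def remove(self, path):
        # Delete a single file: one request per entry, as in SFTP.
        parts = [p for p in path.split("/") if p]
        parent = self._lookup("/".join(parts[:-1]))
        del parent[parts[-1]]

    # Deleting an (empty) directory works the same way in this stand-in.
    rmdir = remove


def delete_recursive(sftp, path):
    """Depth-first deletion: children before their parent directory."""
    if sftp.is_dir(path):
        for name in sftp.listdir(path):
            delete_recursive(sftp, f"{path}/{name}")
        sftp.rmdir(path)
    else:
        sftp.remove(path)


# Usage: wipe "dir" but leave "keep.txt" untouched.
sftp = FakeSftpClient({"dir": {"a.txt": None, "sub": {"b.txt": None}},
                       "keep.txt": None})
delete_recursive(sftp, "dir")
```

This one-request-per-entry traversal is why deleting 10,000+ files is slow: each remove costs a full network round trip.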

comment:2 Changed on Jun 6, 2016 at 8:12:52 AM by dkocher

But we might want to execute delete requests for multiple files concurrently. We currently do this for Backblaze B2.

comment:3 Changed on Jun 7, 2016 at 1:18:34 AM by drsassafras

That would probably work really well. Deleting a file is mostly a handshake; not much actual data is being sent. 5-10 concurrent delete attempts would likely not stress an internet connection, but could turn a 10-hour deleting process into 1 hour or even 30 minutes!
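The idea above can be sketched with a small worker pool that issues per-file delete requests in parallel, so the round-trip latencies overlap instead of adding up. This is a hypothetical illustration only (Cyberduck is Java and its actual fix is in the revisions cited below); `delete_one` stands in for a single SFTP remove call, and the worker count of 8 follows the 5-10 range suggested here:

```python
import threading
from concurrent.futures import ThreadPoolExecutor


def delete_all(paths, delete_one, workers=8):
    """Run delete_one(path) for every path, at most `workers` at a time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # list() forces all futures to complete and re-raises any errors.
        list(pool.map(delete_one, paths))


# Simulated remote store guarded by a lock, standing in for the server.
store = {f"file{i}.dat" for i in range(100)}
lock = threading.Lock()


def fake_delete(path):
    with lock:
        store.discard(path)


delete_all([f"file{i}.dat" for i in range(100)], fake_delete)
```

With deletes dominated by latency rather than bandwidth, N workers give close to an N-fold speedup until the server's request handling becomes the bottleneck.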


comment:4 Changed on Jun 7, 2016 at 1:52:25 PM by dkocher

  • Milestone set to 5.0.2
  • Resolution set to fixed
  • Status changed from new to closed

In r20841.

comment:5 Changed on Jun 13, 2016 at 2:44:59 PM by dkocher

  • Milestone changed from 5.0.2 to 5.0.3
  • Summary changed from Deleting takes long with large subdirectories to Deleting multiple files concurrently

comment:6 Changed on Jun 13, 2016 at 3:25:45 PM by dkocher

Revised fix in r20891.
