pkolaczk / fclones

Efficient Duplicate File Finder

group output according to order on command line

adrium opened this issue · comments

The tool rdfind gives files a higher priority to be kept according to the order of the directories on the command line. This is very useful for removing duplicates from decades-old, unstructured manual backup folders. Consider the following Pictures folders, all containing roughly the same files:

  1. Data/Pictures
  2. Backup-2024/Pictures
  3. Backup-2022/Pictures
  4. Drive-D/Data/Pictures
  5. Data/OldBackups/Drive-D/Data/Pictures

Imagine the above list is ordered with the most recent files first.

  • Files in directory 1 should be kept, and their copies removed from the other folders.
  • If a file exists only in directories 2 and 3, it should be kept in 2 only.
  • For removing duplicates within directory 1 alone beforehand, an option such as --priority least-nested could be used (a sketch of the combined rule follows this list).
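For illustration, the keep rule described above boils down to a sort key derived from the order of the roots on the command line. Below is a minimal Python sketch; the root list, the example paths, and the depth tie-break are hypothetical and only mirror the behaviour requested here, they are not part of fclones:

```python
from pathlib import Path

# Roots in the priority order they would appear on the command line
# (hypothetical example from this issue, highest priority first).
ROOTS = [
    Path("Data/Pictures"),
    Path("Backup-2024/Pictures"),
    Path("Backup-2022/Pictures"),
    Path("Drive-D/Data/Pictures"),
    Path("Data/OldBackups/Drive-D/Data/Pictures"),
]

def keep_priority(path: Path) -> tuple[int, int]:
    """Lower tuples sort first: the earliest root wins, then the least nested
    path (roughly what a least-nested tie-break would do within one root)."""
    for i, root in enumerate(ROOTS):
        if path.is_relative_to(root):
            return (i, len(path.parts))
    return (len(ROOTS), len(path.parts))  # paths outside all roots go last

# One duplicate group: after sorting, the first path is the copy to keep.
group = [
    Path("Backup-2022/Pictures/2021/img_001.jpg"),
    Path("Data/Pictures/2021/img_001.jpg"),
    Path("Backup-2024/Pictures/2021/img_001.jpg"),
]
keep, *drop = sorted(group, key=keep_priority)
print("keep:", keep)   # Data/Pictures/2021/img_001.jpg
print("drop:", drop)   # the copies in Backup-2024 and Backup-2022
```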

From my understanding, based on experimenting with fclones (with a dry run) on a structure similar to the above, the output of the group command would have to be sorted for this to work. However, this does not seem to be the case currently.
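In the meantime, a possible workaround would be to reorder the group output externally before it is passed to the removal step, so that the copy to keep is always the first path in each group. The following sketch is only an assumption about how such a post-processing helper might look, not something fclones provides: it reads blank-line-separated groups with one path per line from stdin (the exact fclones output format and any conversion flags are deliberately not assumed here) and takes the priority roots as its own command-line arguments:

```python
import sys
from pathlib import Path

# Priority roots, highest priority first, passed to this helper script
# (hypothetical; not an fclones option).
ROOTS = [Path(p) for p in sys.argv[1:]]

def keep_priority(path: Path) -> tuple[int, int]:
    # Same rule as in the previous sketch: earliest root wins, then least nested.
    for i, root in enumerate(ROOTS):
        if path.is_relative_to(root):
            return (i, len(path.parts))
    return (len(ROOTS), len(path.parts))

def read_groups(lines):
    # Assumed input shape: one path per line, groups separated by blank lines.
    group = []
    for line in lines:
        line = line.rstrip("\n")
        if line:
            group.append(Path(line))
        elif group:
            yield group
            group = []
    if group:
        yield group

for g in read_groups(sys.stdin):
    for p in sorted(g, key=keep_priority):
        print(p)
    print()  # keep groups separated by a blank line
```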