Duplicate finder help or alternatives
I want to compare 2 drives. I just want to find the dupes that exist on both drives, not the dupes that exist on same drive.
There will be hundreds of files in a folder that will be duplicated, but the folder names and paths may be different. It's way too laborious to go through each one. I need to be able to see that the file on X: has the same name and size as the one on Y: and delete all of the duped files on Y.
All of the de-dupers I've looked at have the same paradigm for looking at the dupes: a filelist with sortable columns. What I need is more like free file sync or syncback, where I can specify a master drive where nothing is deleted.
Any suggestions (or ways I can wrangle everything to do what I need?)
Re: Duplicate finder help or alternatives
Everything doesn't have an option to compare folders yet.
A compare folder feature is in development.
Thank you for the suggestion.
Re: Duplicate finder help or alternatives
That's awesome
FYI, I found an app that has the sort of UI I was talking about, if you're interested. (But it lacks the MFT speed of Everything!) It's called Beyond Compare. It has the option to ignore folder structure but can still be sorted by path, so it's super easy to see if there's an entire folder of duplicates.
Re: Duplicate finder help or alternatives
For a dir compare or sync there is no real need for the Everything speed.
Most file managers have a dir compare or sync function.
I use Total Commander and sometimes Beyond Compare inside of TC for this.
Re: Duplicate finder help or alternatives
It's not a dir sync. Read my OP. It's looking for duplicate files (in my case comparing two 24 TB volumes).
Re: Duplicate finder help or alternatives
Still the same comment.
Doing that on a 1 TB volume with Total Commander, speed is not the problem.
Btw, TC uses Everything for its searches.
More important are the options for what to do with the duplicates found, and the handling of how to select which ones should be kept.
For me, the handling of such operations is much too complicated in Everything in the current version.
______________________________________________________
Windows 11 Home x64 Version 22H2 (OS Build 22621.1265)
Everything 1.5.0.1338a (x64)
Everything Toolbar 1.0.3.0
TC 10.52
Re: Duplicate finder help or alternatives
No idea if it would work, but...
<C: | E:> dupe:name;size
save that to a filelist, duplist
break that duplist into 2 lists, clist & elist
clist has only C: items, elist has only E: items
(grep "^\"C:" duplist > clist
grep "^\"E:" duplist > elist)
load clist
<C:> unique:name;size
save that as clist2
load elist
<E:> unique:name;size
save that as elist2
merge those 2 lists together (clist2 + elist2) into one filelist
open that in Everything
dupe:name;size
with that, any E:'s would be dup'd in name/size to something on C:
sort by Path, select all your E:'s...
?
Again, no idea if this would pan out, so test, test, test again, & then verify, a few times...
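The same name+size matching can also be sketched outside Everything entirely. Below is a minimal, standalone Python sketch of the idea (not the filelist workflow above): index the master drive by (name, size), then walk the second drive and report every file whose pair already exists on the master. The master root is never touched; the function only *reports* candidates, it deletes nothing.

```python
import os

def index_by_name_size(root):
    """Build a set of (lowercased filename, size) pairs for every file under root."""
    seen = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # unreadable entry; skip it
            seen.add((name.lower(), size))
    return seen

def cross_drive_dupes(master_root, candidate_root):
    """Return paths under candidate_root whose (name, size) also exists
    under master_root. Matches within a single drive are ignored, and
    nothing under master_root is ever listed for deletion."""
    master = index_by_name_size(master_root)
    dupes = []
    for dirpath, _dirnames, filenames in os.walk(candidate_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue
            if (name.lower(), size) in master:
                dupes.append(path)
    return dupes
```

Review the returned list (e.g. sorted by path) before deleting anything; on 24 TB volumes the walk itself will take a while, since this has none of Everything's MFT indexing.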
Re: Duplicate finder help or alternatives
The problem with FFS (probably SyncBack too?) is that they deal with directory pairs, not random files strewn about, wherever.
Re: Duplicate finder help or alternatives
So with a name+size search, it wasn't as bad as I would have thought.
2 minutes.
And straightforward to select the wanted "dups" (based on name+size - ONLY).
Name & Size search
Compare only between different source folders (those being C: & E:)
Set C: to "Protected" (so it won't allow me to delete anything from C:)
Once the files are found
Select all name+size dups on E:
Then it's simply a matter of telling it to delete them.
(Now, do realize that with my overly broad search, Name+Size is hardly sufficient for determining what should or should not be deleted.)
02/28/2023 02:55:54 PM - AllDup 4.5.31 PE
02/28/2023 02:55:54 PM - Search method: File name + File size
02/28/2023 02:55:54 PM - Comparison method: Compare all characters of a file name
02/28/2023 02:55:54 PM - 1.Source folder: C:
02/28/2023 02:55:54 PM - 2.Source folder: E:
02/28/2023 02:55:54 PM - Option: Compare only files between different source folders
02/28/2023 02:55:54 PM - Folder filter activated: 7
02/28/2023 02:55:54 PM - Filter type: Exclusive
02/28/2023 02:55:54 PM - 1.folder filter: e:\windows
02/28/2023 02:55:54 PM - 2.folder filter: e:\program files (x86)
02/28/2023 02:55:54 PM - 3.folder filter: e:\program files
02/28/2023 02:55:54 PM - 4.folder filter: ?:\system volume information
02/28/2023 02:55:54 PM - 5.folder filter: ?:\recycled
02/28/2023 02:55:54 PM - 6.folder filter: ?:\recycler
02/28/2023 02:55:54 PM - 7.folder filter: ?:\$recycle.bin
02/28/2023 02:55:54 PM - Determine file count of all source folders...
02/28/2023 02:56:39 PM - File count: 902,173
02/28/2023 02:56:39 PM - Scan: C:
02/28/2023 02:57:39 PM - Folders filtered: 2
02/28/2023 02:57:39 PM - Scan: E:
02/28/2023 02:57:43 PM - Folders filtered: 5
02/28/2023 02:57:43 PM - Found 11,238 duplicates with a total of 8.02 GB inside folder 'C:'
02/28/2023 02:57:43 PM - Found 2,517 duplicates with a total of 6.26 GB inside folder 'E:'
02/28/2023 02:57:44 PM - Scanned files: 902,173
02/28/2023 02:57:44 PM - Groups: 1,544
02/28/2023 02:57:44 PM - File comparison count: 6,141,452
02/28/2023 02:57:44 PM - Duplicates: 13,755 (1%) (14.28 GB)
02/28/2023 02:57:44 PM - Elapsed time: 00:01:49
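Since name+size alone is hardly sufficient for deciding what to delete, a content check before deletion is worth the extra time. A small sketch of that safeguard, assuming Python: hash both candidates and only treat them as true duplicates when the digests agree. (This is a generic check, not a feature of AllDup or Everything.)

```python
import hashlib

def same_content(path_a, path_b, chunk=1 << 20):
    """Confirm that two name+size matches are really identical by
    hashing both files in 1 MiB chunks; deleting on name+size alone
    risks removing a different file that merely shares those fields."""
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while True:
                block = f.read(chunk)
                if not block:
                    break
                h.update(block)
        return h.digest()
    return digest(path_a) == digest(path_b)
```

Running this only on the already-matched pairs keeps the expensive hashing confined to a few thousand files instead of the full 900k scanned above.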