
Questions tagged [duplicate-files]

2 votes
0 answers
90 views

I have a 4 TB hard drive containing pictures, sounds, and videos from the last 15 years. These files were copied onto this drive from various sources, including hard drives, cameras, phones, CD-ROMs, ...
asked by Bernd Kunze
0 votes
1 answer
596 views

As the title suggests, I'm looking to check a bunch of files on a Linux system, and keep only one of each hash. For the files, the filename is irrelevant, the only important part is the hash itself. I ...
asked by AeroMaxx (227)
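The usual answer to "keep one file per hash" is a short pipeline. A minimal sketch, assuming GNU coreutils (`md5sum`, `awk`, `xargs`) and a flat directory; the file names are invented for the demo, and paths with whitespace would need `-print0`-style handling:

```shell
# Demo sandbox with three files, two of which are identical.
dir=$(mktemp -d)
printf 'same\n' > "$dir/a.txt"
printf 'same\n' > "$dir/b.txt"
printf 'other\n' > "$dir/c.txt"

# Hash every file; keep the first file seen for each hash, delete the rest.
# awk tracks hashes already encountered in the associative array `seen`.
md5sum "$dir"/* | sort |
  awk 'seen[$1]++ { print $2 }' | xargs -r rm --

ls "$dir"   # one of a.txt/b.txt survives, c.txt is untouched
```

Sorting first makes "which copy is kept" deterministic (the lexicographically first path per hash).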
1 vote
1 answer
177 views

I am using fdupes to print the list of duplicate files in a certain folder, with the option --recurse. However, since I am using macOS, the recursing process regards Mac apps (which appear to be ...
asked by Jinwen (113)
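One workaround, sketched here with GNU findutils/coreutils rather than an fdupes option, is to `-prune` the `*.app` bundles before hashing, so their internals never enter the duplicate scan. The bundle and file names below are made up for the demo:

```shell
# Sandbox: a regular duplicate pair plus a fake Demo.app bundle.
dir=$(mktemp -d)
mkdir -p "$dir/Demo.app/Contents"
printf 'x\n' > "$dir/Demo.app/Contents/Info.plist"
printf 'x\n' > "$dir/one.txt"
printf 'x\n' > "$dir/two.txt"

# Recurse, but -prune skips everything inside *.app bundles before hashing.
# uniq -w32 compares only the 32-char MD5 column; -D prints all repeats.
dupes=$(find "$dir" -name '*.app' -prune -o -type f -print0 |
  xargs -0 md5sum | sort | uniq -w32 -D)
printf '%s\n' "$dupes"
```

On stock macOS this needs the GNU tools (e.g. from Homebrew coreutils), since the built-in `md5` formats its output differently.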
1 vote
3 answers
1k views

I often have folders with different names but the same content. For example, I copy a folder to another location, for ease of access, and then I forget to delete the copy. How can I detect the ...
asked by Ilario Gelmetti
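One way to detect whole-folder duplicates regardless of the folder's name is to fingerprint each directory as the hash of its sorted (content-hash, relative-path) list. A sketch assuming GNU coreutils, with invented directory names:

```shell
dir=$(mktemp -d)
mkdir "$dir/orig" "$dir/copy" "$dir/other"
printf 'a\n' > "$dir/orig/f1";  printf 'b\n' > "$dir/orig/f2"
printf 'a\n' > "$dir/copy/f1";  printf 'b\n' > "$dir/copy/f2"
printf 'c\n' > "$dir/other/f1"

# Fingerprint = hash of the sorted list of per-file hashes with relative
# paths, so two trees match iff their contents and layout match.
fingerprint() {
  (cd "$1" && find . -type f -exec md5sum {} + | sort | md5sum | cut -d' ' -f1)
}

f_orig=$(fingerprint "$dir/orig")
f_copy=$(fingerprint "$dir/copy")
f_other=$(fingerprint "$dir/other")
```

Directories with equal fingerprints are candidates for deletion; a final `diff -r` before removing anything is a cheap safety net.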
0 votes
1 answer
150 views

When I do a directory listing of a Python installation, the include directory appears twice, and each one has a different inode. ╰─○ ls -i1 2282047 bin 2641630 include 2642559 include 2282048 lib ...
asked by vfclists (7,919)
0 votes
1 answer
968 views

I am working on de-cluttering a company shared drive, and looking to remove duplicates. Is there any duplicate finding program that allows you to specify which directory's duplicates are to be removed?...
asked by George Coffey
2 votes
1 answer
478 views

I have a folder of images that contains quite a few duplicates; I'd like to remove all duplicates except for one. Upon Googling I found this clever script from this post that succinctly does almost ...
asked by cdouble.bhuck
4 votes
2 answers
3k views

I have two locations /path/to/a and /path/to/b. I need to find duplicate files in both paths and remove only the items in /path/to/b. rmlint generates quite a large removal script, but it contains ...
asked by ylluminate
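Without rmlint, the "keep everything in a, delete only b's duplicates" rule can be hand-rolled: collect the content hashes under a, then delete any file under b whose hash is in that set. A sketch assuming GNU coreutils; the directory layout stands in for /path/to/a and /path/to/b:

```shell
# Sandbox standing in for /path/to/a and /path/to/b (names are illustrative).
dir=$(mktemp -d)
mkdir "$dir/a" "$dir/b"
printf 'keep\n' > "$dir/a/file1"
printf 'keep\n' > "$dir/b/copy1"    # duplicate of a/file1 -> should go
printf 'only-b\n' > "$dir/b/unique" # no twin in a -> must stay

# Collect the content hashes present under a/ ...
hashes=$(find "$dir/a" -type f -exec md5sum {} + | cut -d' ' -f1 | sort -u)

# ...then delete any file under b/ whose hash appears in that list.
find "$dir/b" -type f | while IFS= read -r f; do
  h=$(md5sum "$f" | cut -d' ' -f1)
  echo "$hashes" | grep -qx "$h" && rm -- "$f"
done
```

Nothing under a/ is ever touched, which is the guarantee the question is after.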
1 vote
1 answer
900 views

In exploring options to merge two folders, I've come across a very powerful tool known as rmlint. It has some useful documentation (and Gentle Guide). I have a scenario that I previously mentioned and ...
asked by ylluminate
2 votes
2 answers
691 views

(N.B. There are many similar questions (e.g. here, here, here, and here) but they either assume that the directory structure is one-deep, or the answers are more complex multi-line scripts.) This is ...
asked by dumbledad (121)
1 vote
0 answers
556 views

I have a number of folders with my various media (e.g. photos, music) from different points in time. The different folders have some of the same content (e.g. a photo might be in 2 folders), but ...
asked by Vasu (111)
0 votes
1 answer
83 views

I am looking to take files from directory B and copy them into directory A. However, I know there are some identically named files in each. I would like only the "non-matching" file names copied ...
asked by stormctr2
1 vote
3 answers
6k views

I have a number of folders with a few million files (amounting to a few TB) in total. I wish to find duplicates across all files. The output ideally is a simple list of dupes - I will process them ...
asked by Ned64 (9,296)
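At millions-of-files scale, hashing everything is the bottleneck; the standard trick is to group by size first and hash only files whose sizes collide. A sketch assuming GNU find (`-printf`) and coreutils; it breaks on paths containing spaces, which a production version would handle with NUL delimiters:

```shell
dir=$(mktemp -d)
printf '12345\n' > "$dir/a"        # 6 bytes
printf '12345\n' > "$dir/b"        # 6 bytes, duplicate of a
printf 'hello world\n' > "$dir/c"  # unique size -> never hashed

# Pass 1: list size + path; keep only paths whose size occurs more than once.
# Pass 2: hash just those candidates; uniq -w32 -D prints confirmed dupes.
dupes=$(find "$dir" -type f -printf '%s %p\n' | sort -n |
  awk 'n[$1]++ { print p[$1]; print $2 } { p[$1]=$2 }' | sort -u |
  xargs -r md5sum | sort | uniq -w32 -D)
printf '%s\n' "$dupes"
```

For a few TB this typically eliminates the vast majority of candidates before any byte of content is read.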
0 votes
2 answers
2k views

I've got 2 very old backups of a friend's computer. They were simply copied into a folder each on an external hard drive. Both are about 300 GB in size and the contents are very much alike but not ...
asked by Turtlepurple
0 votes
0 answers
78 views

It is tough to find information on my idea, since people normally want to find and remove duplicates, not the reverse. I have an application running to control a heating system. It uses a file-based ...
asked by Jan S (57)
1 vote
2 answers
1k views

I'm trying to use fslint to find duplicates, but it takes forever hashing entire multi-gigabyte files. According to this website, I can compare by the following features: feature summary compare by ...
asked by SurpriseDog
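The feature-comparison idea amounts to a cheap prefilter: compare an inexpensive feature (size, then the first few KB) and run the expensive full hash only on survivors. A sketch of the principle, assuming GNU coreutils; the 4 KB block size is an arbitrary choice for the demo:

```shell
dir=$(mktemp -d)
# Two files that share the same first 4K but differ later,
# plus one true duplicate pair.
head -c 8192 /dev/zero > "$dir/x"
{ head -c 8192 /dev/zero; echo tail; } > "$dir/y"
cp "$dir/x" "$dir/z"

# Cheap feature: hash of the first 4 KB only. Expensive check: full hash.
first_block() { head -c 4096 "$1" | md5sum | cut -d' ' -f1; }
full_hash()   { md5sum "$1" | cut -d' ' -f1; }

# x, y, z all share a first-block hash, so all survive the cheap filter...
[ "$(first_block "$dir/x")" = "$(first_block "$dir/y")" ]

# ...but only the full hash separates real duplicates from near-misses.
same_xz=$([ "$(full_hash "$dir/x")" = "$(full_hash "$dir/z")" ] && echo yes || echo no)
same_xy=$([ "$(full_hash "$dir/x")" = "$(full_hash "$dir/y")" ] && echo yes || echo no)
```

On multi-gigabyte media files, the first-block check reads 4 KB instead of the whole file, which is where the speedup comes from.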
3 votes
0 answers
2k views

I'm looking for a method that reliably gives me a "mathematical distance" of two videos from one another. Similar to how the Levenshtein distance can be used to get the distance from a ...
asked by What (366)
0 votes
1 answer
198 views

How am I able to search for duplicate files that are zipped and unzipped, with the same name? I understand I can do the initial search with the command below; however, I'm not sure how to pipe in some duplicate ...
asked by kissland
0 votes
1 answer
704 views

I have duplicate files and I need a count of them. Example: example.html, example(1).html. Output: I want the count of files like example(1).html
asked by Amrut Nadgiri
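Counting browser-style "(1)" copies is a name-pattern question, not a content question, so a single `find` glob does it. A sketch with invented file names:

```shell
dir=$(mktemp -d)
touch "$dir/example.html" "$dir/example(1).html" \
      "$dir/example(2).html" "$dir/report.html"

# Count files whose name contains "(N)" -- the suffix browsers and file
# managers append when saving a second copy of the same name.
count=$(find "$dir" -type f -name '*([0-9])*' | wc -l)
echo "$count"
```

Note the pattern is quoted so the shell passes it to `find` rather than expanding it itself.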
-4 votes
1 answer
381 views

No fdupes please: I want to make a script. I have a lot of duplicate files, more than 200. I made a bash script (still under construction) which runs md5sum on every file; then with uniq I put ...
asked by elbarna (14.5k)
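The md5sum-plus-uniq script the question describes is usually a three-stage pipeline: hash, sort so identical hashes are adjacent, then let `uniq` compare only the hash column. A sketch assuming GNU coreutils, with invented file names:

```shell
dir=$(mktemp -d)
printf 'dup\n' > "$dir/f1"
printf 'dup\n' > "$dir/f2"
printf 'solo\n' > "$dir/f3"

# md5sum every file, sort so identical hashes are adjacent, then let
# uniq compare only the 32-char hash column (-w32) and print all
# members of each duplicate group (-D).
dupes=$(find "$dir" -type f -exec md5sum {} + | sort | uniq -w32 -D)
printf '%s\n' "$dupes"
```

`-w32` matters: without it `uniq` compares whole lines, and the differing file names would hide the matching hashes.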
102 votes
3 answers
162k views

I found this command used to find duplicated files but it was quite long and made me confused. For example, if I remove -printf "%s\n", nothing came out. Why was that? Besides, why have they ...
asked by The One (5,142)
30 votes
5 answers
15k views

I'm trying to find a way to check inside a given directory for duplicate files (even with different names) and replace them with symlinks pointing to the first occurrence. I've tried with fdupes but ...
asked by Sekhemty (934)
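A hand-rolled version of "symlink every later duplicate to the first occurrence" can be sketched as follows, assuming GNU coreutils and whitespace-free paths (the file names are invented for the demo):

```shell
dir=$(mktemp -d)
printf 'payload\n' > "$dir/first.txt"
printf 'payload\n' > "$dir/second.txt"

# Sort hashes so the first path per hash is the keeper; for every later
# duplicate, awk emits "keeper duplicate" pairs that the loop symlinks.
find "$dir" -type f -exec md5sum {} + | sort |
  awk 'first[$1] { print first[$1], $2; next } { first[$1]=$2 }' |
  while read -r orig dup; do
    ln -sf -- "$orig" "$dup"
  done
```

Because `find` was given an absolute starting directory, the symlinks are absolute; start from a relative path if relative links are wanted.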
122 votes
14 answers
86k views

Is it possible to find duplicate files on my disk which are bit to bit identical but have different file-names?
asked by student (18.9k)
22 votes
8 answers
16k views

Is there a way to find all files in a directory with duplicate filenames, regardless of the casing (upper-case and/or lower-case)?
asked by lamcro (923)
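Case-insensitive name clashes reduce to "lowercase every basename and look for repeats". A sketch assuming GNU find (`-printf`) and a case-sensitive filesystem (so the demo can actually create both spellings); file names are invented:

```shell
dir=$(mktemp -d)
touch "$dir/Readme.txt" "$dir/README.TXT" "$dir/notes.md"

# Lowercase each basename, sort, and report names occurring more than once.
clashes=$(find "$dir" -type f -printf '%f\n' |
  tr '[:upper:]' '[:lower:]' | sort | uniq -d)
printf '%s\n' "$clashes"
```

Printing only the basename (`%f`) means clashes are reported per name, not per full path; use `%p` instead to compare whole paths.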
210 votes
20 answers
79k views

I'm looking for an easy way (a command or series of commands, probably involving find) to find duplicate files in two directories, and replace the files in one directory with hardlinks of the files in ...
asked by Josh (8,778)
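For the two-directory hardlink case, one conservative sketch walks the second tree, byte-compares each file against its same-named twin in the first tree with `cmp`, and hardlinks only confirmed matches. It assumes GNU coreutils, mirrored layouts, and that both trees live on one filesystem (hardlinks cannot cross filesystems); the directory names are invented:

```shell
dir=$(mktemp -d)
mkdir "$dir/a" "$dir/b"
printf 'shared\n' > "$dir/a/img.jpg"
printf 'shared\n' > "$dir/b/img.jpg"

# For each file in b, byte-compare it (cmp -s) against the same relative
# path in a; on a match, relink it (ln -f) so both trees share one inode.
find "$dir/b" -type f | while IFS= read -r f; do
  twin="$dir/a/${f#"$dir/b/"}"
  [ -f "$twin" ] && cmp -s -- "$twin" "$f" && ln -f -- "$twin" "$f"
done
```

Using `cmp` rather than a hash sidesteps any chance of hash collisions at the cost of reading both files once.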