feat: exclude EFFECTS and COLLAGE from duplicated images #111
Thank you for this report. Maybe I should add a way to get the list of duplicates. The intent of the duplicate command was to eliminate duplicates of the same photo caused by Google Photos compression: you have the original photo at full resolution coming from the immich app, and the compressed version coming from the Google takeout. The immich server accepts both because their SHA1 hashes are different. The files are stored in immich as IMAGE.jpg and IMAGE+1.jpg, and both have the same name in the UI. The duplicate command detects two files with the same date and the same visible name but a different size as duplicates, and the bigger file is kept. So your files are effectively detected as duplicates. Your suggestion is good.
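To make that heuristic concrete, here is a minimal, self-contained Go sketch of the grouping logic described above: group by (date, visible name), and within a group keep only the largest file. The `asset` type, its fields, and the `findDuplicates` helper are assumptions made for illustration; they are not immich-go's actual types or API.

```go
package main

import (
	"fmt"
	"time"
)

// asset is a hypothetical, simplified view of a photo known to the server.
type asset struct {
	Name string    // visible name in the UI, e.g. "IMAGE.jpg"
	Date time.Time // capture date
	Size int64     // file size in bytes
}

// findDuplicates groups assets by (visible name, date); within each group,
// every asset except the largest one is reported as a duplicate to delete.
func findDuplicates(assets []asset) []asset {
	type key struct {
		name string
		date time.Time
	}
	groups := map[key][]asset{}
	for _, a := range assets {
		k := key{a.Name, a.Date}
		groups[k] = append(groups[k], a)
	}

	var dups []asset
	for _, g := range groups {
		if len(g) < 2 {
			continue
		}
		// The biggest file in the group is the one we keep.
		best := 0
		for i, a := range g {
			if a.Size > g[best].Size {
				best = i
			}
		}
		for i, a := range g {
			if i != best {
				dups = append(dups, a)
			}
		}
	}
	return dups
}

func main() {
	// Full-resolution upload from the app vs. compressed takeout copy.
	full := asset{"IMAGE.jpg", time.Date(2023, 5, 1, 10, 0, 0, 0, time.UTC), 4_000_000}
	compressed := asset{"IMAGE.jpg", time.Date(2023, 5, 1, 10, 0, 0, 0, time.UTC), 1_200_000}
	for _, d := range findDuplicates([]asset{full, compressed}) {
		fmt.Printf("would delete %s (%d bytes)\n", d.Name, d.Size)
	}
}
```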
@gvillo I have tested with collages made in the GP web page, but they have slightly different names, so the duplicate command doesn't detect them as duplicates. How did you end up in this situation?
@simulot sorry for the delay! The issue is when files are named
A lot of changes have been made since then.
After running the duplicate command I found there is no dry-run parameter; it would be nice to see a list of the files that are going to be deleted before going one by one.
On the other hand, I found that Google Photos generates COLLAGE or EFFECTS files with the same filename plus a +# suffix, e.g. … These files are not duplicates: they are all generated at the same time, with the same filename but different images, and I would like to keep those files. I might be wrong and this might not be generated by Google Photos, but if we could also provide some sort of exclude file list, it would be awesome.
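If such an exclusion were added, one possible approach is a filename filter that recognizes the +# suffix and keeps those files out of the duplicate candidates. The Go sketch below only illustrates that idea: `suffixedName` and `keepGeneratedVariant` are hypothetical helpers, and the regular expression is an assumption about how these generated files are named, not immich-go's actual behavior.

```go
package main

import (
	"fmt"
	"regexp"
)

// suffixedName matches a base name followed by "+<number>" before the
// extension, e.g. "COLLAGE+1.jpg" or "EFFECTS+2.jpg". The pattern is an
// assumption about how Google Photos names these generated variants.
var suffixedName = regexp.MustCompile(`^(.+)\+\d+(\.[^.]+)$`)

// keepGeneratedVariant reports whether a file name looks like a generated
// variant (COLLAGE/EFFECTS style "+#" suffix) that should be excluded from
// duplicate elimination. Hypothetical helper, for illustration only.
func keepGeneratedVariant(name string) bool {
	return suffixedName.MatchString(name)
}

func main() {
	for _, n := range []string{"COLLAGE.jpg", "COLLAGE+1.jpg", "EFFECTS+2.jpg", "IMAGE.jpg"} {
		fmt.Printf("%-15s exclude from duplicates: %v\n", n, keepGeneratedVariant(n))
	}
}
```

A dry-run flag could reuse the same listing path: print the files the command would delete (or exclude) without performing any deletion.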