Would it be possible to implement the following features?
Scan more than one folder, encode the pictures found, and save the encodings together with their path/filename, so this time-consuming task only has to be done once.
Save that list in a format that can be read back at startup, or put it into a database, so the work already done is preserved.
Compare the database against the encoding of a single picture, using the given thresholds. This would allow much faster processing of single pictures or of a folder of new pictures: new pictures appear in the "to process" folder, while older pictures may no longer exist on disk even though they are present in the DB.
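The persistence part of the request can be sketched with the standard library alone. The hashing methods in imagededup (e.g. `PHash().encode_images(image_dir=...)`) return a plain dict mapping filename to a hex hash string, which survives a JSON round trip unchanged; the folder names, file names, and hash values below are made up for illustration.

```python
# Sketch: persist hash encodings to JSON so the expensive encoding step
# runs only once. The dict shape mirrors what imagededup's hashing
# encode_images() returns: {filename: hex_hash_string}.
import json
from pathlib import Path

def save_encodings(encodings: dict, db_path: str) -> None:
    """Write a {filename: hex_hash} map to disk as JSON."""
    Path(db_path).write_text(json.dumps(encodings, indent=2))

def load_encodings(db_path: str) -> dict:
    """Read the map back; hex strings need no conversion after loading."""
    return json.loads(Path(db_path).read_text())

# To cover several folders, one could prefix each key with its folder so
# the path information is kept (phasher stands in for PHash()):
#   for folder in ["photos/2022", "photos/2023"]:
#       enc = phasher.encode_images(image_dir=folder)
#       all_enc.update({f"{folder}/{name}": h for name, h in enc.items()})

# Example with a hand-made map standing in for encode_images() output:
encodings = {"a.jpg": "9f8d7c6b5a4e3d2c", "b.jpg": "9f8d7c6b5a4e3d2d"}
save_encodings(encodings, "hash_db.json")
assert load_encodings("hash_db.json") == encodings
```

Note that CNN encodings are ndarrays rather than hex strings, so they would need `tolist()` before JSON serialization (or a binary format such as `.npy`).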
As far as I understand (and I hope I have this right), the encoding results from encodings = method_object.encode_images('foldername') cannot be used in find_duplicates_to_remove or find_duplicates. Of course, once the original images are no longer present, a plot is not possible, but since the hashes are known, new files could still be checked against those hashes and marked as duplicates.
I converted the picture/hash arrays to JSON and saved them, then reloaded them and converted them back to an ndarray, but how do I use that to calculate duplicates and to check new files against it?
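It is worth checking whether your version's find_duplicates accepts an encoding_map argument, which would take a reloaded dict directly. Failing that, the comparison can be done by hand: imagededup's hashing methods produce 64-bit hashes as 16-character hex strings, and two images count as duplicates when the Hamming distance between their hashes is within a threshold. A minimal sketch (file names and the threshold value are illustrative, not from the library):

```python
# Sketch: check one new image's hash against a stored hash map without
# re-encoding the whole library. Duplicates = pairs of hashes within a
# Hamming-distance threshold (the same role find_duplicates' distance
# threshold plays).

def hamming(hex_a: str, hex_b: str) -> int:
    """Number of differing bits between two hex-encoded hashes."""
    return bin(int(hex_a, 16) ^ int(hex_b, 16)).count("1")

def find_matches(new_hash: str, db: dict, threshold: int = 10) -> list:
    """Return stored files whose hash is within `threshold` bits of new_hash."""
    return [name for name, h in db.items() if hamming(new_hash, h) <= threshold]

# Stored hashes loaded from the JSON "database"; one near-duplicate query:
db = {"old1.jpg": "ffff0000ffff0000", "old2.jpg": "0123456789abcdef"}
print(find_matches("ffff0000ffff0001", db, threshold=4))  # ['old1.jpg']
```

This also answers the "images no longer present" case: the comparison needs only the stored hashes, not the original files, so deleted originals can still flag new files as duplicates.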