On lines 31 and 32 of the `FreeDiscovery/examples/engine/duplicate_detection.py` file it says "To use a custom dataset, simply specify the following variables":

```python
dataset_name = "treclegal09_2k_subset"  # see list of available datasets
data_dir = input_ds['metadata']['data_dir']
```

I don't want to use this bundled dataset; I want to use my own data source, because I want to find duplicates among my Elasticsearch documents. How can I inject my dataset before running duplicate detection?
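One possible approach (an assumption, not something the FreeDiscovery example documents for Elasticsearch specifically) is to export your Elasticsearch documents to a local directory of plain-text files and point `data_dir` at that directory instead of the demo dataset. In this sketch the Elasticsearch fetch (which you might do with `elasticsearch.helpers.scan`) is stubbed with an in-memory list of hit-shaped dicts so the example is self-contained:

```python
import os
import tempfile

# Stand-in for documents fetched from Elasticsearch (e.g. via
# elasticsearch.helpers.scan); each dict mimics a search hit.
hits = [
    {"_id": "doc-1", "_source": {"content": "First document text."}},
    {"_id": "doc-2", "_source": {"content": "Second document text."}},
]

# Write each document to its own plain-text file so the directory
# can be passed to the example as data_dir instead of the demo dataset.
data_dir = tempfile.mkdtemp(prefix="es_export_")
for hit in hits:
    path = os.path.join(data_dir, f"{hit['_id']}.txt")
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(hit["_source"]["content"])

print(sorted(os.listdir(data_dir)))  # → ['doc-1.txt', 'doc-2.txt']
```

With a real cluster you would replace `hits` with the scan results for your index; the file names here (`<_id>.txt`) are just a hypothetical convention for keeping a mapping back to the original Elasticsearch documents.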