Dear author:
This is great work, and I am trying to apply your algorithm to my own dataset. It consists of 3605 objects of dimension 100. Due to the nature of the task, the dataset and the query set are identical for the top-10 nearest-neighbor searches.

However, while QALSH runs successfully, QALSH_PLUS raises an error:

According to your article, each leaf of the KD-tree is treated as a block. I checked the parameter `num_blocks_` and found that its value is 1, yet the loop variable `nb` becomes greater than 1.

My question is therefore: is there a way to adjust the program so that the data is split into more blocks (a larger `num_blocks_`) and the run completes successfully? I have tried adjusting input parameters such as `leaf`, `B`, `c`, and `p`, but none of them seems to help. For context, my understanding of how the block count arises is sketched below.
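Here is a minimal, self-contained sketch of that understanding; it is not the repository's actual code, and the even median split is my assumption. Under a rule that recursively halves the data until each KD-tree leaf holds at most `leaf` points (one leaf = one block), a `leaf` value of 3605 or more would yield exactly one block for my dataset:

```cpp
#include <cstdio>

// Hypothetical illustration (not the QALSH_PLUS source): count the
// blocks produced by recursively splitting n points at the median
// until every KD-tree leaf holds at most `leaf` points, under the
// rule "one leaf = one block".
static int count_blocks(int n, int leaf) {
    if (n <= leaf) return 1;             // fits in one leaf: one block
    int left = n / 2;                    // median split (assumed even)
    return count_blocks(left, leaf) + count_blocks(n - left, leaf);
}

int main() {
    const int n = 3605;                  // dataset size in this issue
    const int leaves[] = {4096, 3605, 2000, 1000, 500};
    for (int leaf : leaves) {
        std::printf("leaf = %4d -> num_blocks = %d\n",
                    leaf, count_blocks(n, leaf));
    }
    return 0;
}
```

If the actual splitting rule is similar, only a `leaf` value strictly below 3605 can produce `num_blocks_` greater than 1, which might explain why my parameter changes had no effect.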
I would greatly appreciate a reply. Thanks a lot!