Regarding some of the default arguments set in base_options.py:

https://github.com/SsnL/dataset-distillation/blob/f749262ca2dbd929a07b912cf271c76c0e6e378e/base_options.py#L247-L248

Shouldn't the default value be one of the options? An error is thrown when using the default value `charge`.

https://github.com/SsnL/dataset-distillation/blob/f749262ca2dbd929a07b912cf271c76c0e6e378e/base_options.py#L249-L250

As indicated by both the paper (section S-1) and the `help` argument, the default value should be 0.02, which is not the case. Were the experiments performed with 0.02 or 0.001?
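For reference, here is a minimal standalone sketch of the mismatch described above, with a hypothetical option name rather than the repository's actual code. argparse only validates `choices` for values passed on the command line, so a default outside the choices list slips through parsing silently and only fails once downstream code consumes it:

```python
import argparse

# Hypothetical option name; illustrates the choices/default mismatch only.
parser = argparse.ArgumentParser()
parser.add_argument('--mode', choices=['distill', 'train'], default='charge')

opts = parser.parse_args([])  # no error here: defaults bypass the choices check
print(opts.mode)              # 'charge'

# The failure only surfaces when the value is actually used:
if opts.mode == 'distill':
    pass
elif opts.mode == 'train':
    pass
else:
    raise ValueError(f'unknown mode: {opts.mode}')  # raised with the bad default
```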
Hi, thanks for spotting these typos! I have fixed them in a new commit. For the `distill_lr` option, we use a value between 0.02 and 0.001 depending on the task. For the basic setting we use 0.001, so I am setting 0.001 as the new default value. I'll update the arXiv PDF soon.
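A hedged sketch of how the corrected default and the per-task override fit together (hypothetical wiring, not the actual base_options.py):

```python
import argparse

parser = argparse.ArgumentParser()
# 0.001 matches the basic setting; some tasks in the paper use 0.02 instead.
parser.add_argument('--distill_lr', type=float, default=0.001,
                    help='learning rate for applying the distilled images '
                         '(0.001 in the basic setting; some tasks use 0.02)')

# Tasks that need the larger value can pass it explicitly:
opts = parser.parse_args(['--distill_lr', '0.02'])
print(opts.distill_lr)  # 0.02
```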