
The ~INT_BAYESIAN~ flag does not seem to work #89

Closed
cerebrai opened this issue Jul 25, 2021 · 1 comment
Labels: bug, metric-learning

Comments


cerebrai commented Jul 25, 2021

Hi,

I have a custom loss function with a hyperparameter that only takes integer values. I used the following command:

--loss_funcs~OVERRIDE~ {metric_loss: {MyCustomLoss: {t_k~INT_BAYESIAN~: [0, 10]}}},

However, the first Bayesian iteration sets a float value of around 8.3 for t_k. What am I missing here?

Note: As a workaround, I have cast t_k to int in my code, but I am afraid it might affect the optimization process.
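For reference, the workaround can be sketched like this. `MyCustomLoss` and `t_k` are the names from the issue above; the constructor signature is an assumption for illustration, not the actual project API.

```python
# Hypothetical sketch of the int-cast workaround described above.
# Assumption: the loss class receives t_k through its constructor.
class MyCustomLoss:
    def __init__(self, t_k):
        # The Bayesian optimizer may still pass a float (e.g. 8.3)
        # despite the ~INT_BAYESIAN~ flag; round and cast so all
        # downstream code sees an integer.
        self.t_k = int(round(t_k))

loss = MyCustomLoss(t_k=8.3)
print(loss.t_k)  # -> 8
```

If rounding is equivalent to native integer parameters on the optimizer side (as facebook/Ax#133 suggests), this cast should not change the search behavior, only the values the loss actually uses.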

@KevinMusgrave
Owner

I'm not sure why that is happening. However, I think your workaround might be equivalent, according to this issue: facebook/Ax#133

@KevinMusgrave KevinMusgrave added the bug Something isn't working label Jul 25, 2021
@KevinMusgrave KevinMusgrave closed this as not planned Jan 21, 2023