Conversation
```diff
@@ -59,6 +62,7 @@ run = finetuner.fit(
     model_options={}, # additional options to pass to the model constructor
     loss='TripletMarginLoss', # Use CLIPLoss for CLIP fine-tuning.
     miner='TripletMarginMiner',
+    miner_options={'margin': 0.2}, # additional options for the miner constructor
```
Can you add optimizer options here as well?
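Following up on that request, optimizer options could plausibly be passed the same way as miner options. A minimal sketch, assuming `finetuner.fit` accepts `optimizer` / `optimizer_options` keyword arguments (these parameter names, and the model/data values, are assumptions for illustration, not confirmed API):

```python
# Hypothetical sketch: passing optimizer options alongside miner options.
# The `optimizer` / `optimizer_options` parameter names are assumptions,
# mirroring the `miner` / `miner_options` pattern from the diff above.
run = finetuner.fit(
    model='resnet50',                          # example model name
    train_data='my-train-data',                # example dataset reference
    loss='TripletMarginLoss',
    miner='TripletMarginMiner',
    miner_options={'margin': 0.2},             # extra kwargs for the miner constructor
    optimizer='Adam',
    optimizer_options={'weight_decay': 0.01},  # extra kwargs for the optimizer constructor
)
```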
To filter the instances in a batch that are used to calculate the loss, you can use miners.
Finetuner allows you to use miners provided by the [PyTorch Metric Learning](https://kevinmusgrave.github.io/pytorch-metric-learning) framework.
To select a specific miner, pass its name to the fit function, e.g., `AngularMiner`, `TripletMarginMiner`, ...
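To make the filtering idea concrete, here is a plain-Python sketch of what a triplet-margin miner does conceptually: it keeps only the (anchor, positive, negative) triplets whose loss would be non-zero. This is an illustration of the idea, not the PyTorch Metric Learning implementation; the function name and signature are made up for this example.

```python
import math

def triplet_margin_miner(embeddings, labels, margin=0.2):
    """Select (anchor, positive, negative) index triplets that violate
    the margin, i.e. d(a, p) - d(a, n) + margin > 0, so only triplets
    that contribute a non-zero loss are kept. Conceptual sketch only."""
    triplets = []
    n = len(embeddings)
    for a in range(n):
        for p in range(n):
            # positive: same label as the anchor, different instance
            if p == a or labels[p] != labels[a]:
                continue
            for neg in range(n):
                # negative: different label from the anchor
                if labels[neg] == labels[a]:
                    continue
                d_ap = math.dist(embeddings[a], embeddings[p])
                d_an = math.dist(embeddings[a], embeddings[neg])
                if d_ap - d_an + margin > 0:
                    triplets.append((a, p, neg))
    return triplets
```

With a hard negative sitting close to the positive, the triplet is kept; with an easy, far-away negative, it is filtered out, so the loss is computed only on informative triplets.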
You could list all miners in the developer reference section; they are all documented in the docstrings.
I had that at first, but it looked ugly: the names are so long that the list became very long.
LGTM!
📝 Docs are deployed on https://ft-feat-options-for-miner-and-optimizer--jina-docs.netlify.app 🎉
Looks good!
Adds support for miner and optimizer options, plus documentation about optimizers and miners.
Addresses #299 and #300