Add hyper-parameter tuning with optuna #65

Merged: juanmc2005 merged 9 commits into develop from feat/optim on Jun 20, 2022
Conversation

juanmc2005 (Owner) commented:

This PR addresses issues #53 and #62.

Changelog

  • Add diart.optim.Optimizer to tune hyper-parameters given a basic configuration object (see the usage sketch after this list)
    • By default, Optimizer creates a basic optuna study, but it can also work with a study created by the user
    • Optimizer is also compatible with optuna's distributed optimization
  • Add HyperParameter data class to represent a tuneable hyper-parameter
  • Add TauActive, RhoUpdate and DeltaNew instances of HyperParameter so they don't have to be created from scratch
  • Add diart.tune script to quickly tune the default pipeline to a dataset (also compatible with distributed optimization)
  • diart.stream, diart.benchmark and diart.tune can be run without python -m
  • Make it easier to create a model block with SpeakerEmbedding.from_pyannote and SpeakerSegmentation.from_pyannote (see the second sketch below)
  • Separate verbose parameter in Benchmark into show_progress and show_report
  • Make batch_size a constructor parameter in Benchmark
  • Add menu to README
  • Add custom embedding model example in README
  • Add hyper-parameter tuning section to README
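
As a rough illustration of how the pieces above fit together, here is a minimal tuning sketch. It assumes the Optimizer constructor takes a Benchmark, a base PipelineConfig, and a list of HyperParameter instances, and that optimization is launched with an optimize method; the paths, iteration count, and exact argument names are placeholders rather than the definitive API.

```python
from diart.inference import Benchmark
from diart.optim import Optimizer, TauActive, RhoUpdate, DeltaNew
from diart.pipelines import PipelineConfig

# Benchmark runs the pipeline over a dataset and reports diarization error.
# show_progress, show_report and batch_size are the new constructor parameters
# mentioned in the changelog above.
benchmark = Benchmark(
    "/path/to/wav/dir",
    "/path/to/rttm/dir",
    show_progress=True,
    show_report=False,
    batch_size=32,
)

# Basic configuration whose hyper-parameters will be tuned
base_config = PipelineConfig()

# Pre-defined HyperParameter instances added by this PR
hparams = [TauActive, RhoUpdate, DeltaNew]

# By default, Optimizer creates a basic optuna study internally
optimizer = Optimizer(benchmark, base_config, hparams)
# Method name assumed; check the README section added by this PR for the exact call
optimizer.optimize(num_iter=100, show_progress=True)
```

For distributed optimization, the idea is that the Optimizer can instead be given a study created by the user with optuna.create_study (for example, backed by a shared storage URL), so that several processes contribute trials to the same study.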

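For the model-block shortcut, the intended usage looks roughly like the sketch below, assuming both blocks live in diart.blocks and expose a from_pyannote constructor that accepts a pyannote.audio model name or checkpoint; the model identifiers are examples, not requirements.

```python
from diart.blocks import SpeakerSegmentation, SpeakerEmbedding

# Build diart model blocks directly from pre-trained pyannote.audio checkpoints
segmentation = SpeakerSegmentation.from_pyannote("pyannote/segmentation")
embedding = SpeakerEmbedding.from_pyannote("pyannote/embedding")
```
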
@juanmc2005 added the feature, API, and refactoring labels on Jun 17, 2022
@juanmc2005 added this to the Version 0.4 milestone on Jun 17, 2022
@juanmc2005 self-assigned this on Jun 17, 2022
@juanmc2005 merged commit 676b2b5 into develop on Jun 20, 2022
@juanmc2005 deleted the feat/optim branch on Jun 20, 2022

Labels
API (Improvements to the API), feature (New feature or request), refactoring (Internal design improvements that don't change the API)
Projects: none yet
Linked issues: none yet
1 participant