
TrainOps + Two Stage Optim for depparse #1337

Merged: 3 commits, Jan 31, 2024

Commits on Jan 31, 2024

  1. Implements 2-stage optimization for dependency parsing

     1) save the optimizer in the checkpoint; 2) track the two stages using args.

     Save the checkpoints after switching the optimizer, if applicable, so that reloading uses the new optimizer once it has been created.

     Jemoka authored and AngledLuffa committed Jan 31, 2024
     SHA: 0d53822
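The commit message above can be illustrated with a minimal sketch. This is not Stanza's actual code; the class, attribute names, and optimizer names are hypothetical, and the real implementation stores full PyTorch optimizer state rather than a name. The point it shows is the one the commit describes: the optimizer (and which stage training is in) lives inside the checkpoint, and a checkpoint is written immediately after the switch so a reload picks up the second-stage optimizer.

```python
class Trainer:
    """Hypothetical trainer sketch: two-stage optimization with the
    optimizer recorded in the checkpoint (illustrative only)."""

    def __init__(self, second_stage_start=1000):
        self.global_step = 0
        self.second_stage_start = second_stage_start
        self.optimizer_name = "adam"  # first-stage optimizer (assumed name)

    def maybe_switch_optimizer(self):
        # Switch to the second-stage optimizer exactly once.  The caller
        # should save a checkpoint right away when this returns True, so
        # that reloading uses the new optimizer once it has been created.
        if self.optimizer_name == "adam" and self.global_step >= self.second_stage_start:
            self.optimizer_name = "amsgrad"  # second-stage optimizer (assumed name)
            return True
        return False

    def checkpoint(self):
        # The optimizer choice is part of the checkpoint, so a reload
        # knows which stage it is resuming into.
        return {
            "global_step": self.global_step,
            "optimizer_name": self.optimizer_name,
        }

    def load(self, ckpt):
        self.global_step = ckpt["global_step"]
        self.optimizer_name = ckpt["optimizer_name"]


trainer = Trainer(second_stage_start=1000)
trainer.global_step = 1000
switched = trainer.maybe_switch_optimizer()
ckpt = trainer.checkpoint()

restored = Trainer()
restored.load(ckpt)
```

After the reload, `restored` is already in the second stage rather than restarting with the first-stage optimizer, which is the failure mode the commit guards against.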
  2. Add a short test method checking that the single-optimizer case saves checkpoints and that the checkpoints are loadable

     Add a flag which forces the optimizer to switch after a certain number of steps - useful for writing tests which check the behavior of the second optimizer.

     AngledLuffa committed Jan 31, 2024
     SHA: 207a0d4
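The forced-switch flag described above can be sketched as follows. The function and parameter names are illustrative, not the actual flag added in this commit; the idea is that a test can override the normal (long) first stage with a tiny step count, so the second optimizer's behavior is exercised without a full training run.

```python
def train_loop(steps, force_switch_after=None, default_switch=1000):
    """Hypothetical training loop sketch.  `force_switch_after` stands in
    for the test-only flag that forces the optimizer switch early."""
    switch_at = force_switch_after if force_switch_after is not None else default_switch
    optimizer = "adam"          # assumed first-stage optimizer name
    checkpoints = []
    for step in range(1, steps + 1):
        if optimizer == "adam" and step >= switch_at:
            optimizer = "amsgrad"  # assumed second-stage optimizer name
            # Checkpoint immediately at the switch, as in the previous commit.
            checkpoints.append({"step": step, "optimizer": optimizer})
    return optimizer, checkpoints
```

With `force_switch_after=5`, a ten-step test run already reaches the second stage and writes the switch-time checkpoint, which a test can then reload and inspect.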
  3. Put the global_step and dev score history into the model files so that when a checkpoint gets loaded, training continues from the position it was formerly at rather than restarting from 0

     Report some details of the model being loaded after loading it.

     AngledLuffa committed Jan 31, 2024
     SHA: 7bc8cc9
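The resume behavior in the last commit can be sketched like this. Again the function and field names are assumptions, not Stanza's actual checkpoint schema: restoring `global_step` and the dev score history means a reloaded run continues from where it stopped instead of restarting from step 0, and some details of the loaded model are reported, as the commit message says.

```python
def resume_or_init(ckpt=None):
    """Hypothetical sketch: initialize fresh training state, or restore
    global_step and the dev score history from a checkpoint dict."""
    if ckpt is None:
        # Fresh start: step 0, no dev scores yet.
        return {"global_step": 0, "dev_score_history": []}
    state = {
        "global_step": ckpt["global_step"],
        "dev_score_history": list(ckpt["dev_score_history"]),
    }
    # Report some details of the model being loaded after loading it.
    best = max(state["dev_score_history"]) if state["dev_score_history"] else 0.0
    print("Loaded checkpoint at step %d, best dev score %.4f"
          % (state["global_step"], best))
    return state
```

Without the saved `global_step`, a resumed run would also mis-time the optimizer switch from the first commit, so the three commits work together: the step counter, the score history, and the optimizer state all travel inside the checkpoint.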