
Lightning implementation #230

Closed
34j opened this issue Apr 4, 2023 · 1 comment · Fixed by #246
Labels: enhancement (New feature or request)

Comments

34j (Collaborator) commented Apr 4, 2023

Is your feature request related to a problem? Please describe.
Training code should be refactored.

Describe the solution you'd like

Additional context
I read the code carefully, but PyTorch Lightning has no easy way to specify a non-zero initial epoch and global_step. (Admittedly, such keys exist in the checkpoint at save time, but they are not used when restoring.) This makes it very difficult to resume training from an existing model. Since global_step and epoch are used everywhere, the only option may be to set them with dirty code.
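
For illustration, here is a minimal sketch of the kind of dirty workaround this would require. The loop-progress attributes below are Lightning internals whose names vary between versions, so treat this as an assumption-laden illustration rather than a supported API:

```python
import pytorch_lightning as pl


class ResumeOffsetCallback(pl.Callback):
    """Force non-zero starting counters when resuming from plain weights.

    Note: Lightning checkpoints do contain top-level "epoch" and
    "global_step" keys, but restoration reads the internal "loops" state
    instead, so editing only those keys has no effect. The attributes
    touched below are internal and version-dependent (assumed names).
    """

    def __init__(self, initial_epoch: int, initial_global_step: int):
        self.initial_epoch = initial_epoch
        self.initial_global_step = initial_global_step

    def on_train_start(self, trainer, pl_module):
        # trainer.current_epoch is derived from this progress counter.
        trainer.fit_loop.epoch_progress.current.completed = self.initial_epoch
        # Loggers derive their logging step from this internal counter.
        trainer.fit_loop.epoch_loop._batches_that_stepped = self.initial_global_step
```

Something like `Trainer(callbacks=[ResumeOffsetCallback(epoch, step)])` after loading the legacy weights into the LightningModule, with no guarantee it survives a Lightning upgrade.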

34j added the enhancement label Apr 4, 2023
34j linked a pull request Apr 7, 2023 that will close this issue
34j changed the title from PytorchLightning implementation to Lightning implementation Apr 7, 2023
34j (Collaborator, Author) commented Apr 8, 2023

pytorch/xla#2241

34j closed this as completed in #246 Apr 8, 2023