
[Feature] Freeze upper layers of the network for retraining. #355

Open
adamltyson opened this issue May 1, 2020 · 0 comments
Labels
enhancement New feature or request

Comments

@adamltyson
Member

Linked to #354. Allowing the upper layers of the network to be frozen when retraining should make slight tweaks to a pretrained model easier.
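As a minimal sketch of the idea (not the project's actual implementation), freezing upper layers in PyTorch amounts to setting `requires_grad = False` on their parameters so the optimiser only updates the unfrozen tail during retraining. The model and helper names here are hypothetical illustrations:

```python
import torch.nn as nn


def freeze_upper_layers(model: nn.Sequential, n_trainable: int) -> None:
    """Freeze all but the last ``n_trainable`` child layers.

    Frozen parameters have ``requires_grad = False``, so gradient
    updates during retraining only touch the unfrozen tail.
    """
    layers = list(model.children())
    for layer in layers[: len(layers) - n_trainable]:
        for param in layer.parameters():
            param.requires_grad = False


# Hypothetical stand-in for a pretrained classification network.
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 8),
    nn.Linear(8, 2),
)

# Keep only the final layer trainable for fine-tuning.
freeze_upper_layers(model, n_trainable=1)

trainable = [p for p in model.parameters() if p.requires_grad]
print(len(trainable))  # the final Linear's weight and bias
```

When building the optimiser for retraining, only the still-trainable parameters need to be passed in, e.g. `torch.optim.Adam(p for p in model.parameters() if p.requires_grad)`.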

@adamltyson adamltyson added the enhancement New feature or request label May 1, 2020
@adamltyson adamltyson self-assigned this May 1, 2020
@adamltyson adamltyson removed their assignment Feb 14, 2022
@willGraham01 willGraham01 transferred this issue from brainglobe/cellfinder Dec 13, 2023
@adamltyson adamltyson transferred this issue from brainglobe/brainglobe-workflows Jan 4, 2024
Projects
Status: Backlog
Development

No branches or pull requests

1 participant