Add set_regularization #54
Conversation
Hi, @Tyler-D

OK. There is no regularizer for the BatchNorm layer. I can add it as an optional argument (though I haven't used this kind of regularization in my own work). I checked the total loss produced by Keras, which includes the softmax loss and every layer's regularization loss, and I tracked the softmax loss alone through the metrics option. Before setting regularizers, total loss = softmax loss; after setting regularizers, total loss > softmax loss. AFAIK, Keras cannot track the regularization loss separately (the model adds the regularization losses layer by layer during compile). I'm not sure how to build a test for this function; it feels like testing Keras' own functionality. Any suggestions?
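The decomposition described above can be mimicked without Keras. Below is a minimal NumPy sketch of how the total loss splits into the data (softmax) term plus per-layer L2 penalties; the `l2_penalty` helper, the weight shapes, and the loss value are hypothetical, for illustration only:

```python
import numpy as np

def l2_penalty(weights, l2=0.01):
    # Keras-style L2 term: l2 * sum(w ** 2), summed over each regularized tensor
    return sum(l2 * float(np.sum(w ** 2)) for w in weights)

# hypothetical weight tensors of one conv layer (kernel and bias)
layer_weights = [np.ones((3, 3, 3, 16)), np.ones((16,))]

softmax_loss = 0.7  # the data term, tracked separately via metrics

total_before = softmax_loss                             # no regularizers set
total_after = softmax_loss + l2_penalty(layer_weights)  # regularizers set

assert total_before == softmax_loss
assert total_after > softmax_loss
```

This is why the comparison above (total loss equal to the softmax loss before, strictly greater after) is a reasonable sanity check even though Keras does not report the regularization term on its own.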
I came up with the following test:

```python
import numpy as np
from keras.regularizers import l2

x = np.ones((1, 32, 32, 3))
y = np.ones((1, 32, 32, 1))

model = ...  # load any kind of model
new_model = set_regularization(model, l2(0.01))

# loss function that always returns 0, so any remaining loss is regularization
def my_loss(gt, pr):
    return pr * 0

model.compile('Adam', loss=my_loss, metrics=['binary_accuracy'])
new_model.compile('Adam', loss=my_loss, metrics=['binary_accuracy'])

loss_1, _ = model.test_on_batch(x, y)
loss_2, _ = new_model.test_on_batch(x, y)

assert loss_1 == 0.
assert loss_2 > 0.
```

I would be happy if you create it.
tests/test_utils.py (Outdated)

```python
CASE = (
    (X1, Y1, Unet('resnet18')),
    (X1, Y1, MODEL),
)

def _test_regularizer(model, reg_model, x, y):

    def zero_loss(pr, gt):
```

Suggested change:

```python
    def zero_loss(gt, pr):
```
segmentation_models/utils.py (Outdated)

```python
"""Set regularizers to all layers

Note:
    Returned model's config is upated correctly
```

Suggested change:

```python
    Returned model's config is updated correctly
```
segmentation_models/utils.py (Outdated)

```python
    bias_regularizer(``regularizer``): regularizer of bias
    activity_regularizer(``regularizer``): regularizer of activity
    gamma_regularizer(``regularizer``): regularizer of gamma of BatchNormalization
    beta_regularizer(``regularizer``): regularizer of bata of BatchNormalization
```

bata -> beta
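One way to verify the docstring's claim that the returned model's config is updated correctly is to round-trip a model through its serialized config after attaching a regularizer. This is a sketch assuming `tensorflow.keras`; the tiny `Dense` model is illustrative, not the PR's actual test:

```python
from tensorflow.keras import layers, models, regularizers

# tiny functional model; attach a regularizer after construction,
# the same in-place assignment set_regularization relies on
inp = layers.Input(shape=(8,))
out = layers.Dense(4)(inp)
model = models.Model(inp, out)
model.layers[-1].kernel_regularizer = regularizers.l2(0.01)

# rebuilding from the serialized config is what makes the regularizer
# actually take effect; the weights are copied over afterwards
rebuilt = models.model_from_json(model.to_json())
rebuilt.set_weights(model.get_weights())

# the rebuilt layer's config now records the regularizer
print(rebuilt.layers[-1].get_config()['kernel_regularizer'])
```

The key point is that mutating `layer.kernel_regularizer` alone does not add the loss to an already-built model; serializing and rebuilding is what applies it.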
Hi @qubvel, aside from the typos you pointed out, do you have any idea about the error produced by CI:

I ran it correctly in my own environment:
Hi @Tyler-D

That does not help :( I will try to test it on my local PC.
Add set_regularization function. The config and the model itself are updated correctly.
The train loss includes the softmax loss and the regularization loss in my training scripts.