
Updated keras training documentation to include custom losses #892

Merged

Conversation

@esslushy (Contributor) commented Jul 31, 2019

Just added how to write custom losses in Keras as I understand them from my exploration of the code. I welcome any feedback, and corrections for any mistakes I might have made.

Have a great day.

@esslushy esslushy requested review from brilee, lamberta and MarkDaoust as code owners Jul 31, 2019

@googlebot googlebot added the cla: yes label Jul 31, 2019

@tfdocsbot (Collaborator) commented Jul 31, 2019

Preview and run these notebook edits with Google Colab:

Notebook diffs available on ReviewNB.com.

@lamberta lamberta requested a review from yashk2810 Jul 31, 2019

"colab": {}
},
"source": [
"from sklearn.metrics import zero_one_loss # Any function that takes in y_true and y_pred as their only parameters work\n",

@lamberta (Member) commented Jul 31, 2019

Notebook is breaking here:

TypeError                                 Traceback (most recent call last)
<ipython-input-9-58d0afc9e0c3> in <module>()
      2 
      3 model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),
----> 4               loss=zero_one_loss)
      5 model.fit(x_train, y_train,
      6           batch_size=64,

11 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
    456     self._self_setattr_tracking = False  # pylint: disable=protected-access
    457     try:
--> 458       result = method(self, *args, **kwargs)
    459     finally:
    460       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in compile(self, optimizer, loss, metrics, loss_weights, sample_weight_mode, weighted_metrics, target_tensors, distribute, **kwargs)
    335 
    336       # Creates the model loss and weighted metrics sub-graphs.
--> 337       self._compile_weights_loss_and_weighted_metrics()
    338 
    339       # Functions for train, test and predict will

/usr/local/lib/python3.6/dist-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
    456     self._self_setattr_tracking = False  # pylint: disable=protected-access
    457     try:
--> 458       result = method(self, *args, **kwargs)
    459     finally:
    460       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in _compile_weights_loss_and_weighted_metrics(self, sample_weights)
   1492       #                   loss_weight_2 * output_2_loss_fn(...) +
   1493       #                   layer losses.
-> 1494       self.total_loss = self._prepare_total_loss(masks)
   1495 
   1496   def _prepare_skip_target_masks(self):

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in _prepare_total_loss(self, masks)
   1552 
   1553           if hasattr(loss_fn, 'reduction'):
-> 1554             per_sample_losses = loss_fn.call(y_true, y_pred)
   1555             weighted_losses = losses_utils.compute_weighted_loss(
   1556                 per_sample_losses,

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/losses.py in call(self, y_true, y_pred)
    213       Loss values per sample.
    214     """
--> 215     return self.fn(y_true, y_pred, **self._fn_kwargs)
    216 
    217   def get_config(self):

/usr/local/lib/python3.6/dist-packages/sklearn/metrics/classification.py in zero_one_loss(y_true, y_pred, normalize, sample_weight)
    936     score = accuracy_score(y_true, y_pred,
    937                            normalize=normalize,
--> 938                            sample_weight=sample_weight)
    939 
    940     if normalize:

/usr/local/lib/python3.6/dist-packages/sklearn/metrics/classification.py in accuracy_score(y_true, y_pred, normalize, sample_weight)
    174 
    175     # Compute accuracy for each possible representation
--> 176     y_type, y_true, y_pred = _check_targets(y_true, y_pred)
    177     check_consistent_length(y_true, y_pred, sample_weight)
    178     if y_type.startswith('multilabel'):

/usr/local/lib/python3.6/dist-packages/sklearn/metrics/classification.py in _check_targets(y_true, y_pred)
     69     y_pred : array or indicator matrix
     70     """
---> 71     check_consistent_length(y_true, y_pred)
     72     type_true = type_of_target(y_true)
     73     type_pred = type_of_target(y_pred)

/usr/local/lib/python3.6/dist-packages/sklearn/utils/validation.py in check_consistent_length(*arrays)
    199     """
    200 
--> 201     lengths = [_num_samples(X) for X in arrays if X is not None]
    202     uniques = np.unique(lengths)
    203     if len(uniques) > 1:

/usr/local/lib/python3.6/dist-packages/sklearn/utils/validation.py in <listcomp>(.0)
    199     """
    200 
--> 201     lengths = [_num_samples(X) for X in arrays if X is not None]
    202     uniques = np.unique(lengths)
    203     if len(uniques) > 1:

/usr/local/lib/python3.6/dist-packages/sklearn/utils/validation.py in _num_samples(x)
    150             return x.shape[0]
    151         else:
--> 152             return len(x)
    153     else:
    154         return len(x)

TypeError: object of type 'Tensor' has no len()
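The root cause of this traceback: Keras hands the loss callable symbolic tensors, but sklearn validates its inputs by calling `len()` on them, and a graph-mode `Tensor` defines no `__len__`. A minimal pure-Python sketch of the failure mode, where `SymbolicTensor` is a hypothetical stand-in for a TensorFlow tensor (not a real class), might look like:

```python
class SymbolicTensor:
    """Stands in for a graph-mode tf.Tensor: shape metadata, no __len__."""
    def __init__(self, shape):
        self.shape = shape

def check_consistent_length(*arrays):
    # Simplified version of sklearn.utils.validation.check_consistent_length:
    # it measures each argument with len(), which fails on tensor-like objects.
    return [len(x) for x in arrays]

try:
    check_consistent_length(SymbolicTensor((None, 10)), SymbolicTensor((None, 10)))
except TypeError as err:
    print(err)  # object of type 'SymbolicTensor' has no len()
```

This is why custom losses for `model.compile` need to be written with TensorFlow ops rather than borrowed from NumPy- or sklearn-based libraries.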
@esslushy (Contributor, author) commented Jul 31, 2019

Thanks for pointing that out; I will see what the issue is. I think I might have missed something, since I rewrote the example rather than copying it directly. Let me see what I did to get around that. Thank you.

@googlebot commented Jul 31, 2019

We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit author(s) or Co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google.
In order to pass this check, please resolve this problem and have the pull request author add another comment and the bot will run again. If the bot doesn't comment, it means it doesn't think anything has changed.

ℹ️ Googlers: Go here for more info.

@googlebot googlebot added cla: no and removed cla: yes labels Jul 31, 2019

@esslushy (Contributor, author) commented Jul 31, 2019

My memory failed me on the sklearn loss, and I just realized I forgot to pass the variables to my loss function. Sorry for the mistakes. It also appears I used the wrong email for the CLA; I will correct this as soon as possible.

@lamberta (Member) commented Jul 31, 2019

Thanks. I can manually verify the CLA so no problem.

@yashk2810 can you take a look to make sure this notebook is on the right track?

"colab": {}
},
"source": [
"def dumb_loss_function(y_true, y_pred):\n",

@yashk2810 (Member) commented Jul 31, 2019

basic_loss_function instead of dumb_loss_function?

},
"source": [
"class WeightedBinaryCrossEntropy(keras.losses.Loss):\n",
" \n",

@yashk2810 (Member) commented Jul 31, 2019

Add comments about the arguments used.

Args:
pos_weight: ...
weight: ...

"\n",
" def call(self, y_true, y_pred):\n",
" if not self.from_logits:\n",
" with tf.name_scope('Weighted_Cross_Entropy'):\n",

@yashk2810 (Member) commented Jul 31, 2019

The name_scope is not required

" def call(self, y_true, y_pred):\n",
" if not self.from_logits:\n",
" with tf.name_scope('Weighted_Cross_Entropy'):\n",
" # Manually calculated the weighted cross entropy. Formula is qz * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x)) where z are labels, x is logits, and q is the weight.\n",

@yashk2810 (Member) commented Jul 31, 2019

Manually calculate

"def dumb_loss_function(y_true, y_pred):\n",
" return tf.math.reduce_mean(y_true-y_pred)\n",
"\n",
"model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),\n",

@yashk2810 (Member) commented Jul 31, 2019

Why not use Adam?
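Folding the naming and optimizer feedback together, the compile step might look like the sketch below. `basic_loss_function` follows the name suggested above; the single-`Dense`-layer model is an arbitrary placeholder, not the notebook's actual model.

```python
import tensorflow as tf
from tensorflow import keras

def basic_loss_function(y_true, y_pred):
    # Any callable taking (y_true, y_pred) and returning a loss tensor can be
    # passed to compile(), as long as it is built from TensorFlow ops.
    return tf.math.reduce_mean(y_true - y_pred)

model = keras.Sequential([keras.layers.Dense(1)])
model.compile(optimizer=keras.optimizers.Adam(),  # learning_rate defaults to 1e-3
              loss=basic_loss_function)
```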

@esslushy (Contributor, author) commented Aug 1, 2019

I have made the suggested changes, @yashk2810. Thank you all for the help and suggestions. I am still new to pull requests, so thank you for your patience.

" return tf.math.reduce_mean(y_true-y_pred)\n",
"\n",
"model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),\n",
" loss=dumb_loss_function)\n",
"model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),\n",

@yashk2810 (Member) commented Aug 1, 2019

No need for the learning rate argument; 1e-3 is the default.

" # Use built in function\n",
" return tf.nn.weighted_cross_entropy_with_logits(y_true, y_pred, self.pos_weight) * self.weight\n",
" \n",
"model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),\n",
" loss=WeightedBinaryCrossEntropy)\n",
"model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),\n",

@yashk2810
@@ -443,7 +443,11 @@
},
"source": [
"class WeightedBinaryCrossEntropy(keras.losses.Loss):\n",
" \n",
" \"\"\"\n",
" args:\n",

@yashk2810 (Member) commented Aug 1, 2019

s/args/Args

Please add all the arguments. Also, indent them by 2 spaces.

Args:
  x:
  y: 

@yashk2810 (Member) commented Aug 1, 2019

Please capitalize the 'a' in args

Slushy and others added some commits Aug 1, 2019

@lamberta (Member) commented Aug 12, 2019

@yashk2810 you good?

@yashk2810 (Member) commented Aug 12, 2019

Yup.

@lamberta (Member) left a comment

Thanks! Verified CLA

@googlebot commented Aug 12, 2019

We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit author(s) or Co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google.
In order to pass this check, please resolve this problem and then comment "@googlebot I fixed it." If the bot doesn't comment, it means it doesn't think anything has changed.

ℹ️ Googlers: Go here for more info.

@googlebot googlebot added cla: no and removed cla: yes labels Aug 12, 2019

@lamberta lamberta added cla: yes and removed cla: no labels Aug 13, 2019

@googlebot commented Aug 13, 2019

A Googler has manually verified that the CLAs look good.

(Googler, please make sure the reason for overriding the CLA status is clearly documented in these comments.)

ℹ️ Googlers: Go here for more info.

@lamberta (Member) commented Aug 13, 2019

Fixed the merge conflict and created a separate TOC section for custom losses.


@TensorFlow-Docs-Copybara TensorFlow-Docs-Copybara merged commit 3c1a656 into tensorflow:master Aug 15, 2019

2 checks passed

cla/google CLAs have been manually verified by Googler who has set the 'cla: yes' label
import/copybara Change imported to the internal review system
Details

TensorFlow-Docs-Copybara pushed a commit that referenced this pull request Aug 15, 2019

Copybara-Service
Merge pull request #892 from esslushy:custom_losses_docs
PiperOrigin-RevId: 263608976