From c0d4f1912955d1d1de06afab9226f97de82386cd Mon Sep 17 00:00:00 2001
From: Jirka Borovec
Date: Mon, 21 Dec 2020 00:03:48 +0100
Subject: [PATCH] format

---
 docs/source/info_callbacks.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/info_callbacks.rst b/docs/source/info_callbacks.rst
index 224c668a14..5dcc998f7b 100644
--- a/docs/source/info_callbacks.rst
+++ b/docs/source/info_callbacks.rst
@@ -104,7 +104,7 @@ does all of that for you before training begins.
 This Callback will warn the user with the following message in case data
 mixing inside the batch is detected:
 
-.. code-block:: bash
+.. code-block::
 
     Your model is mixing data across the batch dimension. This can lead to wrong
     gradient updates in the optimizer.
@@ -123,4 +123,4 @@ that works with any PyTorch :class:`~torch.nn.Module` is also available:
 
     verification = BatchGradientVerification(model)
     valid = verification.check(input_array=torch.rand(2, 3, 4), sample_idx=1)
-In this example we run the test on a batch size 2 by inspecting gradients on the second sample.
\ No newline at end of file
+In this example we run the test on a batch size 2 by inspecting gradients on the second sample.
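The docs page touched by this patch describes a verification that perturbs one sample in a batch and checks that no other sample's output is affected. The library's `BatchGradientVerification` does this via autograd; the following is only a dependency-free sketch of the same idea using direct perturbation, with illustrative names (`check_batch_independence`, the two toy models) that are not part of the library's API:

```python
# Sketch of the batch-mixing check: perturb one sample in the batch and
# observe which outputs change. A well-behaved per-sample model changes
# only the perturbed sample's output; a model that mixes data across the
# batch dimension changes other outputs too.

def per_sample_model(batch):
    # Well-behaved: each output depends only on its own sample.
    return [sum(x) for x in batch]

def mixing_model(batch):
    # Faulty: every output depends on the whole batch
    # (e.g. a reduction wrongly applied over the batch dimension).
    total = sum(sum(x) for x in batch)
    return [total for _ in batch]

def check_batch_independence(model, batch, sample_idx, eps=1.0):
    """True iff perturbing batch[sample_idx] changes only output[sample_idx]."""
    baseline = model(batch)
    perturbed = [list(x) for x in batch]   # copy so the original batch is untouched
    perturbed[sample_idx][0] += eps        # poke one feature of one sample
    changed = [a != b for a, b in zip(model(perturbed), baseline)]
    return all(c == (i == sample_idx) for i, c in enumerate(changed))

batch = [[1.0, 2.0], [3.0, 4.0]]  # batch size 2, checking the second sample as in the docs
print(check_batch_independence(per_sample_model, batch, sample_idx=1))  # True
print(check_batch_independence(mixing_model, batch, sample_idx=1))      # False
```

The real callback computes gradients of one output sample with respect to the whole input batch instead of finite perturbations, which catches the same class of bugs (accidental `reshape`/`permute` across the batch dimension) more robustly.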