diff --git a/site/en/tutorials/reinforcement_learning/actor_critic.ipynb b/site/en/tutorials/reinforcement_learning/actor_critic.ipynb
index da74c7e1bf6..bb81769a2e6 100644
--- a/site/en/tutorials/reinforcement_learning/actor_critic.ipynb
+++ b/site/en/tutorials/reinforcement_learning/actor_critic.ipynb
@@ -505,9 +505,7 @@
     "\n",
     "$$L_{critic} = L_{\delta}(G, V^{\pi}_{\theta})$$\n",
     "\n",
-    "where $L_{\delta}$ is the [Huber loss](https://en.wikipedia.org/wiki/Huber_loss), which is less sensitive to outliers in data than squared-error loss.\n",
-    "\n",
-    "\n"
+    "where $L_{\delta}$ is the [Huber loss](https://en.wikipedia.org/wiki/Huber_loss), which is less sensitive to outliers in data than squared-error loss.\n"
    ]
   },
   {
diff --git a/site/en/tutorials/structured_data/time_series.ipynb b/site/en/tutorials/structured_data/time_series.ipynb
index 73983522d27..a5161791d9f 100644
--- a/site/en/tutorials/structured_data/time_series.ipynb
+++ b/site/en/tutorials/structured_data/time_series.ipynb
@@ -890,7 +890,7 @@
    "source": [
     "Typically data in TensorFlow is packed into arrays where the outermost index is across examples (the \"batch\" dimension). The middle indices are the \"time\" or \"space\" (width, height) dimension(s). The innermost indices are the features.\n",
     "\n",
-    "The code above took a batch of 2, 7-timestep windows, with 19 features at each time step. It split them into a batch of 6-timestep, 19 feature inputs, and a 1-timestep 1-feature label. The label only has one feature because the `WindowGenerator` was initialized with `label_columns=['T (degC)']`. Initially this tutorial will build models that predict single output labels."
+    "The code above took a batch of 3, 7-timestep windows, with 19 features at each time step. It split them into a batch of 6-timestep, 19 feature inputs, and a 1-timestep 1-feature label. The label only has one feature because the `WindowGenerator` was initialized with `label_columns=['T (degC)']`. Initially this tutorial will build models that predict single output labels."
    ]
   },
   {
@@ -1197,8 +1197,7 @@
     "id": "RKTm8ajVGw4N"
    },
    "source": [
-    "The `window` object creates `tf.data.Datasets` from the training, validation, and test sets, allowing you to easily iterate over batches of data.\n",
-    "\n"
+    "The `window` object creates `tf.data.Datasets` from the training, validation, and test sets, allowing you to easily iterate over batches of data.\n"
    ]
   },
   {
@@ -1522,7 +1521,7 @@
     "id": "X-CGj85oKaOG"
    },
    "source": [
-    "Here is the plot of its example predictions on the `wide_widow`, note how in many cases the prediction is clearly better than just returning the input temperature, but in a few cases it's worse:"
+    "Here is the plot of its example predictions on the `wide_window`, note how in many cases the prediction is clearly better than just returning the input temperature, but in a few cases it's worse:"
    ]
   },
   {
@@ -1645,8 +1644,7 @@
     "id": "gtN4BwZ37niR"
    },
    "source": [
-    "Note that the `Window`'s `shift` parameter is relative to the end of the two windows.\n",
-    "\n"
+    "Note that the `Window`'s `shift` parameter is relative to the end of the two windows.\n"
    ]
   },
   {
@@ -2173,8 +2171,7 @@
     "\n",
     "The models so far all predicted a single output feature, `T (degC)`, for a single time step.\n",
     "\n",
-    "All of these models can be converted to predict multiple features just by changing the number of units in the output layer and adjusting the training windows to include all features in the `labels`.\n",
-    "\n"
+    "All of these models can be converted to predict multiple features just by changing the number of units in the output layer and adjusting the training windows to include all features in the `labels`.\n"
    ]
   },
   {
@@ -2304,8 +2301,7 @@
     "id": "dsc9pur_mHsx"
    },
    "source": [
-    "#### RNN\n",
-    "\n"
+    "#### RNN\n"
    ]
   },
   {
@@ -2518,8 +2514,7 @@
     "1. Single shot predictions where the entire time series is predicted at once.\n",
     "2. Autoregressive predictions where the model only makes single step predictions and its output is fed back as its input.\n",
     "\n",
-    "In this section all the models will predict **all the features across all output time steps**.\n",
-    "\n"
+    "In this section all the models will predict **all the features across all output time steps**.\n"
    ]
   },
   {
@@ -2861,9 +2856,7 @@
     "\n",
     "You could take any of single single-step multi-output models trained in the first half of this tutorial and run in an autoregressive feedback loop, but here we'll focus on building a model that's been explicitly trained to do that.\n",
     "\n",
-    "![Feedback a model's output to its input](images/multistep_autoregressive.png)\n",
-    "\n",
-    "\n"
+    "![Feedback a model's output to its input](images/multistep_autoregressive.png)\n"
    ]
   },
   {
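Reviewer note, not part of the patch: the `actor_critic.ipynb` hunk keeps the sentence that the Huber loss is less sensitive to outliers than squared-error loss. A minimal NumPy sketch of the standard piecewise definition (quadratic for $|e| \le \delta$, linear beyond) illustrates why; this is a generic reimplementation for intuition, not the notebook's actual code, which uses the Keras loss.

```python
import numpy as np

def huber(error, delta=1.0):
    # Quadratic near zero, linear for |error| > delta, so a large
    # residual (an outlier) contributes far less than under squared error.
    abs_e = np.abs(error)
    return np.where(abs_e <= delta,
                    0.5 * abs_e ** 2,
                    delta * (abs_e - 0.5 * delta))

errors = np.array([0.5, 10.0])
print(huber(errors))     # small error matches 0.5*e^2 (0.125); large error grows linearly (9.5)
print(0.5 * errors ** 2) # squared error gives 50.0 for the outlier, vs Huber's 9.5
```

The factor-of-five gap on the outlier is what makes the critic's value-function regression more robust to occasional large return errors.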
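Reviewer note, not part of the patch: the corrected `time_series.ipynb` sentence describes splitting a batch of 3 windows, each 7 time steps by 19 features, into 6-timestep inputs and a 1-timestep, 1-feature label. That slicing can be sketched in plain NumPy; the `WindowGenerator` in the tutorial does this with `tf.data`, and the column index used below for `T (degC)` is a placeholder, not the dataset's real index.

```python
import numpy as np

# Shapes from the tutorial text: batch of 3 windows, 7 time steps, 19 features.
batch = np.random.rand(3, 7, 19)

input_width, label_width = 6, 1
label_column = 1  # hypothetical column index of 'T (degC)'

# First 6 steps with all features become the inputs; the final step,
# restricted to the label column, becomes the 1-feature label.
inputs = batch[:, :input_width, :]
labels = batch[:, input_width:, label_column:label_column + 1]

print(inputs.shape, labels.shape)  # (3, 6, 19) (3, 1, 1)
```

The label keeps only one feature precisely because of the `label_columns=['T (degC)']` restriction the sentence mentions.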