Fix minor typos in char rnn notebooks
cvanderw committed Feb 17, 2019
1 parent 4af6b16 commit 7635595
Showing 2 changed files with 3 additions and 3 deletions.
@@ -324,7 +324,7 @@
"In `__init__` the suggested structure is as follows:\n",
"* Create and store the necessary dictionaries (this has been done for you)\n",
"* Define an LSTM layer that takes as params: an input size (the number of characters), a hidden layer size `n_hidden`, a number of layers `n_layers`, a dropout probability `drop_prob`, and a batch_first boolean (True, since we are batching)\n",
-    "* Define a dropout layer with `dropout_prob`\n",
+    "* Define a dropout layer with `drop_prob`\n",
"* Define a fully-connected layer with params: input size `n_hidden` and output size (the number of characters)\n",
"* Finally, initialize the weights (again, this has been given)\n",
"\n",
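The `__init__` structure described in the hunk above can be sketched as follows. This is a minimal sketch assuming PyTorch and typical names from character-RNN exercises (`CharRNN`, `tokens`, `int2char`, `char2int` are assumptions, and the given weight-initialization step is omitted):

```python
import torch.nn as nn

class CharRNN(nn.Module):
    """Sketch of the suggested __init__; names and defaults are assumptions."""

    def __init__(self, tokens, n_hidden=256, n_layers=2, drop_prob=0.5):
        super().__init__()
        self.drop_prob = drop_prob
        self.n_layers = n_layers
        self.n_hidden = n_hidden

        # create and store the necessary dictionaries (given in the notebook)
        self.chars = tokens
        self.int2char = dict(enumerate(self.chars))
        self.char2int = {ch: ii for ii, ch in self.int2char.items()}

        # LSTM layer: input size is the number of characters,
        # batch_first=True because the inputs are batched
        self.lstm = nn.LSTM(len(self.chars), n_hidden, n_layers,
                            dropout=drop_prob, batch_first=True)

        # dropout layer with drop_prob
        self.dropout = nn.Dropout(drop_prob)

        # fully-connected layer: n_hidden in, number of characters out
        self.fc = nn.Linear(n_hidden, len(self.chars))
```

A forward pass and the weight initialization would follow the same notebook scaffolding; only the constructor steps listed above are shown here.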
@@ -557,7 +557,7 @@
},
"outputs": [],
"source": [
-    "## TODO: set you model hyperparameters\n",
+    "## TODO: set your model hyperparameters\n",
"# define and print the net\n",
"n_hidden=\n",
"n_layers=\n",
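The hunk above leaves `n_hidden` and `n_layers` blank for the learner to fill in. One plausible fill-in, with values that are assumptions typical for character-level models rather than anything specified in the notebook:

```python
# hypothetical hyperparameter choices (not from the notebook)
n_hidden = 512  # size of the LSTM hidden state
n_layers = 2    # number of stacked LSTM layers
```

Larger `n_hidden` or `n_layers` increases capacity at the cost of training time; two layers with a few hundred hidden units is a common starting point for this kind of exercise.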
@@ -383,7 +383,7 @@
"In `__init__` the suggested structure is as follows:\n",
"* Create and store the necessary dictionaries (this has been done for you)\n",
"* Define an LSTM layer that takes as params: an input size (the number of characters), a hidden layer size `n_hidden`, a number of layers `n_layers`, a dropout probability `drop_prob`, and a batch_first boolean (True, since we are batching)\n",
-    "* Define a dropout layer with `dropout_prob`\n",
+    "* Define a dropout layer with `drop_prob`\n",
"* Define a fully-connected layer with params: input size `n_hidden` and output size (the number of characters)\n",
"* Finally, initialize the weights (again, this has been given)\n",
"\n",
